Hi all,
I've found a bunch of people with queries similar to mine but not quite the same, and I'm stuck. I currently have a query that imports data from multiple CSV files stored in Azure Blob Storage. The CSV files share a common format, such as the following:
| Date | Values |
|---|---|
| 09/12/2020 05:30 | 0 |
| 09/12/2020 05:25 | 0 |
| 09/12/2020 05:20 | 0 |
My query looks like the following:
----------------------------
let
Source = AzureStorage.Blobs("mydata"),
#"folder1" = Source{[Name="folder"]}[Data],
AddFileContentsAsColumn = Table.AddColumn(#"folder1", "Custom", each Table.PromoteHeaders(Csv.Document([Content],[Delimiter=",", Encoding=1252]))),
#"Expanded Custom" = Table.ExpandTableColumn(AddFileContentsAsColumn, "Custom", {"Date", "Values"}, {"Date", "Values"}),
#"Removed Columns" = Table.RemoveColumns(#"Expanded Custom",{"Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Content", "Folder Path"}),
#"Split Column by Position" = Table.SplitColumn(#"Removed Columns", "Name", Splitter.SplitTextByPositions({0, 4}, false), {"Name.1", "Name.2"}),
<...lots of other transformations...>
in
#"Final Name"
----------------------------
This had been working nicely until the program that creates my source files was recently upgraded; it now adds an unwanted row at the top of each CSV data file, similar to the following:
| 01 - Signalling1 on Device 1 [192.168.0.254] | |
| Date | Values |
| 09/12/2020 05:30 | 0 |
| 09/12/2020 05:25 | 0 |
| 09/12/2020 05:20 | 0 |
| 09/12/2020 05:15 | 0 |
| 09/12/2020 05:10 | 0 |
| 09/12/2020 05:05 | 0 |
| 09/12/2020 05:00 | 0 |
How do I modify my query to ignore that top row? I've tried a few ways of adding Table.Skip to my code, but without success.
Many thanks in advance!
Solved! Go to Solution.
Hi @GW999
Try this on the AddFileContentsAsColumn step
AddFileContentsAsColumn = Table.AddColumn(#"folder1", "Custom", each Table.PromoteHeaders(Table.Skip(Csv.Document([Content],[Delimiter=",", Encoding=1252]),1)))
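If it helps to see what that one-liner does, here it is broken out into a let expression purely for readability (a sketch only; the inner names Parsed, Skipped and Promoted are just illustrative):
----------------------------
AddFileContentsAsColumn = Table.AddColumn(#"folder1", "Custom", each
    let
        // parse each blob's binary [Content] as comma-delimited CSV text
        Parsed   = Csv.Document([Content], [Delimiter = ",", Encoding = 1252]),
        // drop the unwanted title row the upgraded program now adds
        Skipped  = Table.Skip(Parsed, 1),
        // the remaining first row (Date, Values) becomes the column headers
        Promoted = Table.PromoteHeaders(Skipped)
    in
        Promoted
)
----------------------------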
Cheers
Try specifying the number of columns in the second argument of Csv.Document(). Instead of the current
[Delimiter=",", Encoding=1252]
try
[Delimiter=",", Columns = 2, Encoding=1252]
Cheers
Works a treat! Thank you very much.
@GW999
The code also works with my test files. Just recheck the CSV files to make sure there are no errors or format problems.
Best regards
Paul
Thanks for the help everyone. I played around with the data files and I think I have worked out what is going on. I was careful to only use Notepad to open and edit the files, rather than Excel, which will often change the data. But basically, the first row in each of my data files does not have a comma at the end:
----------------------------------------------------------
01 - Teams on XXXXXXX Gateway [XXXXXXXXX]
Date,Values
14/12/2020 05:30,0
14/12/2020 05:25,0
14/12/2020 05:20,0
14/12/2020 05:15,0
14/12/2020 05:10,0
14/12/2020 05:05,0
----------------------------------------------------------
If I open the files in Notepad and manually add a comma to the end of that first row as a test, then our modified Power Query works fine using the Table.Skip function. I tried putting the comma back in and then removing it again just to be sure. It's interesting, because I would have thought that a row is terminated by a CR/LF character rather than a comma, which should only act as the field delimiter in a CSV:
let
Source = AzureStorage.Blobs("mydata"),
#"folder1" = Source{[Name="folder"]}[Data],
AddFileContentsAsColumn = Table.AddColumn(#"folder1", "Custom", each Table.PromoteHeaders(Csv.Document([Content],[Delimiter=",", Encoding=1252]))),
#"Expanded Custom" = Table.ExpandTableColumn(AddFileContentsAsColumn, "Custom", {"Date", "Values"}, {"Date", "Values"}),
Is there any way we can remove the first row without requiring a comma delimiter at the end of the first row?
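One alternative, not suggested above but sketched here in case it's useful: drop the first line of the file as plain text before it ever reaches Csv.Document, so column detection is driven by the Date,Values row and no trailing comma is needed (the inner names AllLines, CsvText and Parsed are illustrative):
----------------------------
AddFileContentsAsColumn = Table.AddColumn(#"folder1", "Custom", each
    let
        // read the blob as a list of text lines (Windows-1252 encoding)
        AllLines = Lines.FromBinary([Content], null, null, 1252),
        // drop the first line and rejoin the rest, so Csv.Document never sees the comma-less title row
        CsvText  = Text.Combine(List.Skip(AllLines, 1), "#(lf)"),
        Parsed   = Csv.Document(CsvText, [Delimiter = ","])
    in
        Table.PromoteHeaders(Parsed)
)
----------------------------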
Hi @GW999
@AlB's code works for me. Have you checked all of the CSVs to make sure they don't contain nulls?
You've shown several images with the date 14/12/2020 and the value 0, but the null appears alongside the date 9/12/2020 - are these separate files?
Phil
With that file I get the right results with the code above. Values are 0, not null. Without access to the actual file, or one with dummy data that reproduces the issue, I'm afraid I can't do more.
Cheers
Understood - thanks for the help. I will play around with it and see where it's going wrong on my set.
That doesn't look like the raw file. It's already in table format. 🤔
In any case, if it loads like that with Csv.Document() then what we did earlier should work fine.
Cheers
Here you go, opened in Notepad instead of Excel:
01 - Teams on XXXXXXX Gateway [XXXXXXXXX],
Date,Values
14/12/2020 05:30,0
14/12/2020 05:25,0
14/12/2020 05:20,0
14/12/2020 05:15,0
14/12/2020 05:10,0
14/12/2020 05:05,0
Are you sure the values are not null in the original file? You were showing all zeros in the beginning.
Can you show the contents of the raw CSV file (at least the first rows)?
Cheers
Hi,
Yes I did think to check this just in case I was chasing my tail 🙂.
With a column filter applied to show the column's data range, the full range of data for this file is 0-9.
(Screenshots omitted: the raw file, including that pesky first row; the "#"folder1" = Source{[Name="folder"]}[Data]" step, which appears as 'Navigation' in the Applied Steps list; the resulting table; and the subsequent steps.)
This is with the "Table.Skip" modification in place – Date is good but Values are all null.
I'd need to see the table resulting from the #"folder1" step in order to work out what is going on:
let
Source = AzureStorage.Blobs("mydata"),
#"folder1" = Source{[Name="folder"]}[Data],
Cheers
Hi @GW999
Try this on the AddFileContentsAsColumn step
AddFileContentsAsColumn = Table.AddColumn(#"folder1", "Custom", each Table.PromoteHeaders(Table.Skip(Csv.Document([Content],[Delimiter=",", Encoding=1252]),1)))
Cheers
Hi - thanks for the reply. Power Query accepts the syntax, but it results in null data for all of the rows in the 'Values' column:
| Date | Values |
|---|---|
| 09/12/2020 05:30 | null |
| 09/12/2020 05:25 | null |
| 09/12/2020 05:20 | null |
| 09/12/2020 05:15 | null |
| 09/12/2020 05:10 | null |
| 09/12/2020 05:05 | null |
| 09/12/2020 05:00 | null |