Hello,
I’m attempting to load CSV files from a directory and would like to add an index to each of them, which would allow me to track error coordinates in the source data.
After reading the directory, I tried the following:
= Table.TransformColumns(Source, {"Content", each Table.AddIndexColumn(Csv.Document(_), "Index", 1, 1)})
This does add an index, but Csv.Document(_) seems to convert the binary into a list of comma-separated values and place each line into a single "cell", which introduces numerous errors into the data.
Is there a method to index these CSV files while maintaining their binary form?
Thanks a lot,
Jarek
Is there a method to index these CSV files while maintaining their binary form?
No. At a minimum you would need to split each binary on the row delimiter (for example LF), inject the index into every line, and then reassemble the binary; at that point you might as well fully decode the files and load them into Power BI.
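A minimal sketch of that "split, inject, reassemble" route, assuming UTF-8 files; the folder path and the NumberLines helper name are placeholders, and note that the header line gets numbered too:

let
    // Hypothetical folder path; replace with the actual directory.
    Source = Folder.Files("C:\csv"),

    // Decode each binary to text, prefix every line with its 1-based
    // line number, then re-encode so the column stays binary.
    // Caveat: the header line is numbered as well, so downstream steps
    // must account for the extra leading value.
    NumberLines = (csv as binary) as binary =>
        let
            AsText   = Text.FromBinary(csv, TextEncoding.Utf8),
            Rows     = Lines.FromText(AsText),
            Numbered = List.Transform(
                           List.Zip({List.Positions(Rows), Rows}),
                           each Text.From(_{0} + 1) & "," & _{1}),
            AsBinary = Text.ToBinary(Lines.ToText(Numbered, "#(lf)"), TextEncoding.Utf8)
        in
            AsBinary,

    Result = Table.TransformColumns(Source, {"Content", NumberLines})
in
    Result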
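And a sketch of the "fully decode" alternative, which adds the index after Csv.Document so nothing has to be re-encoded. It assumes comma-delimited UTF-8 files with a header row that all share the same columns; the folder path and the SourceRow column name are placeholders:

let
    // Hypothetical folder path; replace with the actual directory.
    Source   = Folder.Files("C:\csv"),

    // Decode every file and add a per-file row index while it is still a table.
    Parsed   = Table.AddColumn(Source, "Data", each
                   Table.AddIndexColumn(
                       Table.PromoteHeaders(
                           Csv.Document([Content], [Delimiter = ",", Encoding = 65001])),
                       "SourceRow", 1, 1)),

    // Keep the file name so errors can be traced back to file + row.
    Kept     = Table.SelectColumns(Parsed, {"Name", "Data"}),

    // Assumes all files share the same schema; the first file defines the columns.
    Columns  = Table.ColumnNames(Kept{0}[Data]),
    Expanded = Table.ExpandTableColumn(Kept, "Data", Columns)
in
    Expanded

The Name plus SourceRow pair then gives the error coordinates in the source data that the original question asked for.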