Hi All,
So loading a .txt source in PBI, with only the delimiter step applied and no further transformations, only gets me 244,265 rows instead of the full 330K in the file. I tried loading it differently and in a different PBI file and got the same result.
Running PBI 64-bit... Any clues on what the issue could be?
Thanks!
There is no row limit, so that isn't the issue. Can you open the file in a text editor like Notepad++ and go to row 244,266 and see if there is something there that is throwing PQ off? Non-ASCII characters or some garbage?
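If Notepad++ struggles with a file that size, Power Query itself can pull the suspect line out. This is just a rough sketch, with a hypothetical file path: it reads the file as raw lines and returns the character codes of row 244,266, so anything non-ASCII (codes above 127) stands out.

let
    // Read the raw file as lines, without any CSV parsing (path is hypothetical)
    Source = Lines.FromBinary(File.Contents("C:\data\source.txt"), QuoteStyle.None),
    // Row 244,266 is index 244265 (zero-based)
    SuspectRow = Source{244265},
    // Character codes for each character in that row; anything above 127 is non-ASCII
    CharCodes = List.Transform(Text.ToList(SuspectRow), Character.ToNumber)
in
    CharCodes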
Other stuff to try:
If you want to confirm there is nothing wrong with your system, which I highly doubt, there is a 1.5M record CSV file here to play with. It loads fine for me. I really think it is some corruption or invalid data type in your original CSV file though.
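For a quick row-count sanity check on any delimited file, a bare Csv.Document load with the delimiter and encoding spelled out (rather than auto-detected) is enough. The path, delimiter, and encoding below are assumptions to adjust for the actual file.

let
    // Minimal delimited load with nothing auto-detected (path, delimiter, encoding are assumptions)
    Source = Csv.Document(
        File.Contents("C:\data\sample.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None]
    ),
    RowCount = Table.RowCount(Source)
in
    RowCount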
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting
Thanks for your help.
I tried to load it in Notepad++ but it's too big to open; I got an error. I also looked at the data in Excel, where I was able to import and see the rows past 244K, and didn't spot anything funky in them. I also checked the errors in PBI and nothing came up. I made sure I didn't have any change-type steps or anything else touching the data.
Here's what I did just now that resolved the issue (a rough sketch of the equivalent M is below):
- Loaded it as a Text source, not as a CSV.
- This loaded a single column with 333K rows, as expected.
- I then used split by position to break out the details.
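A rough M sketch of that workaround, assuming a hypothetical path, and made-up split positions and column names:

let
    // Read the file as plain text lines so the CSV parser never gets involved (path is hypothetical)
    Source = Lines.FromBinary(File.Contents("C:\data\source.txt"), QuoteStyle.None),
    // Turn the list of lines into a single-column table
    AsTable = Table.FromList(Source, Splitter.SplitByNothing(), {"Line"}),
    // Split that column at fixed character positions (positions and column names are placeholders)
    SplitByPosition = Table.SplitColumn(
        AsTable,
        "Line",
        Splitter.SplitTextByPositions({0, 10, 25, 40}),
        {"Col1", "Col2", "Col3", "Col4"}
    )
in
    SplitByPosition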
Thanks for your help!
Interesting, and clever thinking on the workaround. I've never seen that happen. My guess would be a missing delimiter. Since you split by position, I wonder whether a space (char 32) in the file is really a non-breaking space (char 160). Visually you cannot tell the difference. When splitting by position it doesn't matter, but if the parser relies on a space as the delimiter, it won't match, and that could cause the issue you saw.
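One way to test that theory from inside Power Query is to count the lines that contain char 160. A minimal sketch, again with a hypothetical path:

let
    // Raw lines, no parsing (path is hypothetical)
    Source = Lines.FromBinary(File.Contents("C:\data\source.txt"), QuoteStyle.None),
    // A non-breaking space looks exactly like a normal space but has character code 160
    Nbsp = Character.FromNumber(160),
    // How many lines contain at least one non-breaking space
    NbspLineCount = List.Count(List.Select(Source, each Text.Contains(_, Nbsp)))
in
    NbspLineCount

If the count comes back non-zero, a Text.Replace of char 160 with a normal space before splitting would normalize the file.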
Also weird that Notepad++ wouldn't open it. I opened that 1.5M record fake credit card file on my machine in less than 3 seconds, but that file has very few fields and is only 139 MB. Your file might be larger even though it has fewer records.
DAX is for Analysis. Power Query is for Data Modeling
Proud to be a Super User!
MCSA: BI Reporting
Yeah, really weird, the .txt file is 468 MB.