Joorge_C
Resolver II

Loading a .txt delimited source stops at 244,265 rows when the file has around 330K

Hi All,

Loading a .txt source in PBI with only the delimiting step, and no further transformations, only gets me 244,265 rows instead of the full ~330K in the file. I've tried loading it in different ways and in different PBI files and get the same result.

 

Running PBI 64-bit... Any clues on what the issue could be?

Thanks!

4 REPLIES
edhans
Super User

There is no row limit, so that isn't the issue. Can you open the file in a text editor like Notepad++ and go to row 244,266 and see if there is something there that is throwing PQ off? Non-ASCII characters or some garbage?
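If a text editor struggles with a file that size, a small script can do the same inspection. This is a minimal sketch (the demo file, path, and row range are placeholders, not from the thread) that streams the file and reports any character outside printable ASCII in the suspect rows:

```python
# Hedged sketch: scan a delimited text file for non-ASCII or control
# characters around the row where Power Query stops loading.

def find_suspect_chars(path, start_row, end_row, encoding="utf-8"):
    """Return (row, col, char, codepoint) for every character outside
    printable ASCII in rows start_row..end_row (1-based)."""
    hits = []
    with open(path, encoding=encoding, errors="replace") as f:
        for row, line in enumerate(f, start=1):
            if row < start_row:
                continue
            if row > end_row:
                break
            for col, ch in enumerate(line.rstrip("\r\n"), start=1):
                if not 32 <= ord(ch) <= 126:
                    hits.append((row, col, ch, ord(ch)))
    return hits

# Tiny demo file: row 3 hides a non-breaking space (U+00A0).
with open("demo.txt", "w", encoding="utf-8") as f:
    f.write("a|b|c\n1|2|3\n4|\u00a05|6\n")

print(find_suspect_chars("demo.txt", 1, 3))  # → [(3, 3, '\xa0', 160)]
```

Run it against the rows just before and after 244,266; a hit like `(row, col, '\xa0', 160)` points straight at the offending character.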

 

Other stuff to try:

  • In Power Query load what you can, then on the Home tab click the Keep Rows dropdown and choose "Keep Errors." Does anything show up?
  • In Power Query load what you can, then filter on what you see. Is there a field (like year, perhaps) that would let you deselect everything and check whether any errors show up?
  • If you have a "Changed Type" step in your query, remove it and just load the data. This can prevent errors: for example, if a field is set to date but your text file has text in that position on row 244,266, removing the type change will let the row load. Then go to the Home tab, use the Remove Rows button to remove the top 244,200 rows, and start scanning through the rest to see what jumps out at you.

If you want to confirm there is nothing wrong with your system, which I highly doubt, there is a 1.5M record CSV file here to play with. It loads fine for me. I really think it is some corruption or invalid data type in your original CSV file though.



Did I answer your question? Mark my post as a solution!
Did my answers help arrive at a solution? Give it a kudos by clicking the Thumbs Up!

DAX is for Analysis. Power Query is for Data Modeling


Proud to be a Super User!

MCSA: BI Reporting

Thanks for your help.

 

I tried to load it in Notepad++ but it's too big to open; I got an error. I also looked at the data in Excel, where I was able to import and see the rows past 244K, and didn't spot anything funky in them. I also checked for errors in PBI and nothing came up. I made sure I didn't have any type changes or anything else touching the data.

 

Here's what I did just now that resolved the issue:

- Loaded it as a Text source, not as a CSV.

- This loaded a single column with 333K rows, as expected.

- I then used split by position to break out the fields.
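For anyone reproducing the same workaround outside Power BI, here is a hedged Python sketch of the split-by-position idea; the cut offsets and sample row below are invented for illustration, not taken from the poster's file:

```python
# Hypothetical fixed-width layout: field start offsets (made up for the demo).
CUTS = [0, 10, 25]

def split_by_position(line, cuts):
    """Slice one line at fixed offsets, mirroring Power Query's
    'Split Column > By Positions' transform, then trim padding."""
    ends = cuts[1:] + [len(line)]
    return [line[start:end].strip() for start, end in zip(cuts, ends)]

row = "2023-01-01Widget A       42.50"
print(split_by_position(row, CUTS))  # → ['2023-01-01', 'Widget A', '42.50']
```

Because each field is taken by offset rather than by delimiter, stray or missing delimiter characters in the data cannot shift or swallow rows.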

 

Thanks for your help!

Interesting, and clever thinking on the workaround. I've never seen that happen; my guess is a delimiter is missing. Since you split by position, I wonder if a space (char 32) is really a non-breaking space (char 160). Visually you cannot tell the difference. Splitting by position won't care, but parsing with a space as the delimiter won't work and could cause the issue you're seeing.
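The space vs. non-breaking-space point is easy to demonstrate; a quick sketch (the sample strings are mine, not from the poster's file):

```python
plain = "first second"        # ordinary space, chr(32)
sneaky = "first\u00a0second"  # non-breaking space, chr(160)

print(plain == sneaky)    # False: they render identically but differ
print(plain.split(" "))   # ['first', 'second']
print(sneaky.split(" "))  # ['first\xa0second'] - the split never fires

# Cheap normalization before any delimiter-based parsing:
print(sneaky.replace("\u00a0", " ").split(" "))  # ['first', 'second']
```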

 

Also weird that Notepad++ wouldn't open it. I opened that 1.5M-record fake credit card file on my machine in less than 3 seconds, but that file has very few fields and is only 139 MB. Your file might be larger even though it has fewer records.




Yeah, really weird. The .txt file is 468 MB.
