I saw some references to a maximum-of-one-million-records issue related to this topic, but two weeks ago I was able to import around 30 million records from our database, and the Excel files referred to have not changed since.
Manually refreshing a selection of tables works: refreshing the failing tables one by one succeeds, but refreshing them all at once results in a failure. I went back to the November version: still a problem. Tried a different computer: still a problem. Removed the million-plus-row tables from the refresh and put a "remove blank rows" step on all Excel files: still a problem.
What am I missing?
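For reference, the "remove blank rows" step mentioned above is the standard Power Query UI step; it generates M along these lines (the step and table names are placeholders for whatever your query actually uses):

#"Removed Blank Rows" = Table.SelectRows(Source, each not List.IsEmpty(List.RemoveMatchingItems(Record.FieldValues(_), {"", null})))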
It turned out that one of the Excel files I receive from another source contained the maximum possible number of columns (16,384 in .xlsx), even though they are not in use. "Remove other columns" turned out to be the solution.
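For anyone who hits the same issue, here is a minimal sketch of the fix in Power Query M, assuming a workbook path, sheet name, and column names that you would replace with your own; Table.SelectColumns is the function the "Remove other columns" UI step generates:

let
    // Load the workbook (path and sheet name are placeholders)
    Source = Excel.Workbook(File.Contents("C:\Data\Report.xlsx"), null, true),
    Sheet = Source{[Item = "Sheet1", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true]),
    // Keep only the columns actually needed; the thousands of
    // unused columns are dropped instead of being loaded
    Kept = Table.SelectColumns(Promoted, {"OrderID", "Amount", "Date"})
in
    Kept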
Hi @DouweMeer,
There is no volume limit on loading data in either DirectQuery or Import mode. DirectQuery has a limit of 1 million rows behind each visual, but Import has no such limit. Import mode is instead limited by your computer's memory (RAM): if you have 8 GB of RAM and your report consumes 9 GB, the refresh will fail. You can check memory usage via Windows Task Manager -> Resource Monitor (https://www.digitalcitizen.life/how-use-resource-monitor-windows-7).
Best Regards
Lucien