Good morning,
I'm running into trouble while trying to load a huge file into Power BI.
The file is an array of JSON objects weighing 80+ GB in total. I load it as a JSON source, but after a few minutes of analyzing the content, the Microsoft Mashup service eats all of my RAM. I have followed the advice for optimizing data load from other posts (disabled parallel loads, etc.), but it's still too much for Power BI.
Is there any way to make Power BI use pagination instead of consuming all my RAM? Am I doing something wrong, or is the very idea of loading such a huge file into Power BI infeasible?
Thank you very much.
Solved! Go to Solution.
Based on my test, there is no option to set a fixed memory limit in Power BI Desktop. In your scenario, a workaround is to import the JSON file into a SQL Server database, and then connect to that database using DirectQuery mode.
Here are some useful links for your reference.
https://blogs.msdn.microsoft.com/sqlserverstorageengine/2015/10/07/bulk-importing-json-files-into-sq...
https://community.powerbi.com/t5/Service/Direct-query-or-import/td-p/98683
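To get an 80+ GB JSON array into SQL Server (or any staging store) without exhausting RAM, the file has to be parsed incrementally rather than loaded whole. Below is a minimal sketch of that idea in Python, assuming the file is a single JSON array of objects; `iter_json_array` is a hypothetical helper, not part of Power BI or SQL Server, and uses only the standard library's `json.JSONDecoder.raw_decode` to pull one object at a time from a bounded buffer.

```python
import json


def iter_json_array(path, chunk_size=1 << 20):
    """Yield objects one at a time from a file containing a single huge
    JSON array, keeping only a small buffer in memory.

    Assumes the top-level value is an array of objects, e.g.
    [ {...}, {...}, ... ]  -- a truncated object simply waits for more data.
    """
    decoder = json.JSONDecoder()
    buf = ""
    with open(path, "r", encoding="utf-8") as f:
        # Skip any leading whitespace and consume the opening '['.
        while True:
            ch = f.read(1)
            if not ch or ch == "[":
                break
        while True:
            chunk = f.read(chunk_size)
            buf += chunk
            while True:
                # Drop separators (commas, whitespace) before the next value.
                stripped = buf.lstrip(" \t\r\n,")
                if stripped.startswith("]"):
                    return  # end of the array
                try:
                    obj, end = decoder.raw_decode(stripped)
                except ValueError:
                    # Incomplete object in the buffer: read more data.
                    buf = stripped
                    break
                yield obj
                buf = stripped[end:]
            if not chunk:
                return  # end of file
```

Each yielded object (or batch of objects) can then be bulk-inserted into the SQL Server staging table described in the first link, after which Power BI connects via DirectQuery and never has to hold the raw file in memory.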
Regards,
Charlie Liao