Hi! I'm working on a connector that loads a big JSON file from a REST API (about 3 GB with 6kk rows) using Web.Contents. I'm using Power BI Desktop, and I've tried multiple approaches, e.g.:
- loading the file all at once (it never finishes; hours pass and it still shows as being processed)
- loading the file in chunks (it loads a couple of chunks, then just hangs and does not proceed to the next chunk)
I've also tried loading a locally saved JSON file using the native JSON data source in Power BI, but same story: hours pass and in the end I get the error "Something went wrong - Object reference not set to an instance of an object".
If I save the same data not as JSON but as an Excel file, Power BI loads it quickly and without any issue.
Is there any limit on the JSON size that Power BI handles, or something else that could explain this behaviour?
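One way to narrow down whether the API or Power BI is the bottleneck is to reproduce the chunked load outside Power BI. Below is a minimal Python sketch of the paging loop; the page size and the `fetch_page(offset, limit)` callable are assumptions standing in for whatever the real REST API's paging mechanism is, not the poster's actual connector.

```python
# Hypothetical sketch of the chunked-load approach, run outside Power BI
# to verify the API itself pages correctly. The offset/limit paging style
# and page size are assumptions for illustration.
def fetch_all_rows(fetch_page, page_size=50000):
    """Pull rows page by page until the source returns an empty page.

    fetch_page(offset, limit) -> list of row dicts (in practice, a thin
    wrapper around an HTTP call to the real REST API).
    """
    rows, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            break
        rows.extend(page)
        offset += len(page)
    return rows

# Stub standing in for the real HTTP call, for illustration only.
def fake_page(offset, limit, total=120):
    return [{"id": i} for i in range(offset, min(offset + limit, total))]

if __name__ == "__main__":
    print(len(fetch_all_rows(fake_page, page_size=50)))  # 120
```

If a plain script like this also stalls partway through the pages, the problem is on the API side (timeouts, rate limits) rather than in Power BI's refresh engine.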
Hi @syrius ,
A Power BI Desktop report file does not have a limit on the amount of data; the practical limit relates more to your hard disk storage and system memory (system memory is used to process data and calculate the DAX formulas in your report).
For large data from a REST API, we recommend saving it to a local file and creating a batch script or scheduled job to sync it to a local database (e.g. SQL Server, MySQL, Excel, Access). After these steps, you can simply get data from the synced database without a huge refresh/processing time.
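The sync-to-a-local-database suggestion above can be sketched as follows. This is a minimal illustration, assuming the API response has already been saved locally as a JSON array of flat objects; the table name and SQLite as the target database are placeholder choices, not part of the original advice.

```python
# Minimal sketch: load a locally saved JSON array into a SQLite table,
# which Power BI can then read without re-parsing the raw JSON each
# refresh. Table/column handling is illustrative.
import json
import sqlite3

def sync_json_to_sqlite(json_path, db_path, table="api_rows"):
    with open(json_path, encoding="utf-8") as f:
        rows = json.load(f)  # for multi-GB files, a streaming parser would be needed
    cols = sorted(rows[0].keys())
    con = sqlite3.connect(db_path)
    con.execute(f"DROP TABLE IF EXISTS {table}")
    con.execute(f"CREATE TABLE {table} ({', '.join(cols)})")
    placeholders = ", ".join("?" for _ in cols)
    con.executemany(
        f"INSERT INTO {table} VALUES ({placeholders})",
        ([row.get(c) for c in cols] for row in rows),
    )
    con.commit()
    n = con.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    con.close()
    return n
```

A script like this can be scheduled (Task Scheduler, cron) so the database stays in sync and Power BI only ever queries the database.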
For the error message:
"Something went wrong - Object reference not set to an instance of an object".
This is a known issue; you can refer to Solved: power bi - Microsoft Fabric Community and follow the steps there to troubleshoot it.
Best regards,
Lucy Chen
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Like @hnguy71, I don't know what 6kk means, but 3 GB is a fairly hefty JSON/XML/CSV file. I hope that you are following a star schema when preparing your data for loading into PBI, because if you are trying to transform a giant data set in Power Query or DAX, things can get hairy fast. As a rule, your fact table(s) should be normalized and your dimension tables should be de-normalized. Of course, there are exceptions to the rule, and knowing when and why to deviate from them requires data modeling experience and expertise that go way beyond what anyone can communicate in a forum. But properly armed, you should be able to load a fact table with billions of rows. If PBI is choking on millions, sadly, the only solution is to go back to the data modeling drawing board.
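The star-schema split described above can be illustrated with a small sketch: one narrow fact table holding the measures and keys, and a de-normalized dimension holding the repeating descriptive columns. The column names (`customer_id`, `name`, `city`, `amount`) are made up for the example; this is not the poster's data.

```python
# Illustrative sketch of splitting a flat extract into a star schema
# (a normalized fact table plus one de-normalized dimension) before
# loading into Power BI. Column names are hypothetical.
def to_star_schema(flat_rows, dim_cols, key_col="customer_id"):
    """Split flat rows into (fact_rows, dim_rows).

    dim_cols: descriptive columns that repeat for every occurrence of
    key_col and therefore belong in the dimension, not the fact table.
    """
    dim = {}
    fact = []
    for row in flat_rows:
        dim.setdefault(row[key_col], {c: row[c] for c in dim_cols})
        fact.append({k: v for k, v in row.items() if k not in dim_cols})
    dim_rows = [{key_col: k, **attrs} for k, attrs in dim.items()]
    return fact, dim_rows
```

Done outside Power BI (or early in Power Query, before any heavy steps), this keeps the wide descriptive columns out of the multi-million-row fact table, which is usually what makes the difference between a refresh that finishes and one that hangs.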
Hi @syrius
Is 6kk = 600,000 rows? If so, Power BI is definitely powerful enough to handle that amount of data. A JSON, XML, or CSV source does not have a row/data limit (or at least, not that I'm aware of). Are you doing any transformations on top of the data load?