syrius
New Member

Handling big JSON files in PowerBI desktop

Hi! I'm working on a connector that loads a large JSON response from a REST API (about 3 GB with 6kk rows) using Web.Contents. I'm using Power BI Desktop, and I've tried multiple approaches, e.g.

- loading the file at once (it never finishes; hours pass and it still shows as being processed)

- loading the file in chunks (it loads a couple of chunks, then hangs and does not proceed to the next chunk)

 

I've also tried loading a locally saved JSON file using the native JSON data source in Power BI, but it's the same story: hours pass and in the end I get the error "Something went wrong - Object reference not set to an instance of an object".
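For context on why loading a giant JSON "at once" hurts: a JSON array must normally be materialized as one huge in-memory object. Outside Power BI you can at least decode the records one at a time instead of building the whole object tree. A minimal Python sketch using the standard library's `json.JSONDecoder.raw_decode` (the raw text is still held in memory here; for truly multi-GB files a streaming parser such as the third-party ijson package would be needed):

```python
import json

def iter_records(text):
    """Yield objects from a top-level JSON array one at a time.

    raw_decode avoids building the full Python object tree in one go,
    but the source string itself is still in memory -- this is only a
    sketch of the incremental-decoding idea.
    """
    decoder = json.JSONDecoder()
    idx = text.index("[") + 1  # skip past the opening bracket
    while True:
        # Skip whitespace and the commas between array elements.
        while idx < len(text) and text[idx] in " \t\r\n,":
            idx += 1
        if idx >= len(text) or text[idx] == "]":
            return
        obj, end = decoder.raw_decode(text, idx)
        yield obj
        idx = end

sample = '[{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]'
records = list(iter_records(sample))
print(len(records))  # 2
```

This kind of pre-processing (splitting one 3 GB array into many smaller chunk files, or into NDJSON) is one way to make the data digestible before it ever reaches Power BI.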

 

If I save the same data as an Excel file instead of JSON, Power BI loads it quickly and without any issues.

 

Is there any limitation on the JSON size Power BI can handle, or something else that could explain this behaviour?

3 REPLIES
v-xinc-msft
Community Support

Hi @syrius ,

A Power BI Desktop report file does not have a hard limit on data volume; the practical limit relates to your hard disk storage and system memory (system memory is used to process the data and evaluate the DAX formulas in your report).

For large data from a REST API, we recommend storing it in a local file and creating a batch script or scheduled job to sync it into a local database (e.g. SQL Server, MySQL, Excel, Access). After these steps, you can simply get data from the synced database without the huge refresh/processing time.
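The staging step above can be sketched in a few lines. A hedged Python example, assuming the API response has already been saved locally as a JSON array; the table and column names (`staged`, `id`, `value`) are illustrative, not anything Power BI requires:

```python
import json
import os
import sqlite3
import tempfile

def stage_json_to_sqlite(json_path, db_path, batch_size=50_000):
    """Load a JSON array of records into a local SQLite table in batches,
    so Power BI can connect to the database instead of re-parsing JSON
    on every refresh. Illustrative sketch; schema is an assumption."""
    with open(json_path, encoding="utf-8") as f:
        records = json.load(f)  # for multi-GB files, stream/chunk instead
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS staged (id INTEGER, value TEXT)")
    rows = [(r["id"], r["value"]) for r in records]
    for i in range(0, len(rows), batch_size):
        # Batched inserts with periodic commits keep memory and the
        # transaction log bounded for large inputs.
        conn.executemany("INSERT INTO staged VALUES (?, ?)", rows[i:i + batch_size])
        conn.commit()
    conn.close()

# Tiny usage example with throwaway files.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "data.json")
db = os.path.join(tmp, "staged.db")
with open(src, "w", encoding="utf-8") as f:
    json.dump([{"id": 1, "value": "a"}, {"id": 2, "value": "b"}], f)
stage_json_to_sqlite(src, db)
count = sqlite3.connect(db).execute("SELECT COUNT(*) FROM staged").fetchone()[0]
print(count)  # 2
```

A scheduled task (Windows Task Scheduler, cron, etc.) can rerun a script like this, and the Power BI report then refreshes against the database rather than the raw API.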

For the error message:

 "Something went wrong - Object reference not set to an instance of an object".

This is a known issue; you can refer to Solved: power bi - Microsoft Fabric Community and follow the steps there to troubleshoot it.

Best regards,

Lucy Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

WishAskedSooner
Resolver II

Like @hnguy71, I don't know what 6kk means, but 3 GB is a fairly hefty JSON/XML/CSV file. I hope that you are following a star schema when preparing your data for loading into PBI, because if you are trying to transform a giant data set in Power Query or DAX, things can get hairy fast. As a rule, your fact table(s) should be normalized and your dimension tables should be de-normalized. Of course, there are exceptions to the rule, and knowing when and why to deviate from it requires Data Modeling experience and expertise that go way beyond what anyone can communicate in a forum. But properly armed, you should be able to load a fact table with billions of rows. If PBI is choking on millions, sadly, the only solution is to go back to the Data Modeling drawing board.
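To make the star-schema point concrete: the idea is to split a flat extract into a narrow fact table (keys and measures) and denormalized dimension tables (one row per entity with its attributes folded in). A minimal Python sketch; the field names (`order_id`, `customer`, `region`, `amount`) are made up for illustration:

```python
# Flat extract, as it might come out of a JSON/CSV source.
flat_rows = [
    {"order_id": 1, "customer": "Acme", "region": "EU", "amount": 100.0},
    {"order_id": 2, "customer": "Acme", "region": "EU", "amount": 250.0},
    {"order_id": 3, "customer": "Bolt", "region": "US", "amount": 75.0},
]

# Dimension: one row per customer, attributes denormalized onto it.
dim_customer = {}
for row in flat_rows:
    key = row["customer"]
    if key not in dim_customer:
        dim_customer[key] = {
            "customer_key": len(dim_customer) + 1,  # surrogate key
            "customer": key,
            "region": row["region"],
        }

# Fact: surrogate key plus the measure only -- narrow and normalized,
# so millions of rows compress and aggregate well.
fact_sales = [
    {
        "order_id": r["order_id"],
        "customer_key": dim_customer[r["customer"]]["customer_key"],
        "amount": r["amount"],
    }
    for r in flat_rows
]
```

In Power BI the same split would be done in Power Query (reference queries, remove duplicates for dimensions), with a relationship from the fact's surrogate key to the dimension.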

hnguy71
Memorable Member

Hi @syrius 

Is 6kk = 600,000 rows? If so, Power BI is definitely powerful enough to handle that amount of data. A JSON, XML, or CSV file does not have a row/data limit (or not that I'm aware of). Are you doing any transformations on top of the data load?



Did I answer your question?
Please help by clicking the thumbs up button and mark my post as a solution!
