
qmest
Frequent Visitor

Optimizing data loading process from APIs

Hi,

 

I have a report that loads about a hundred tables from APIs. This process is very slow; I have applied most of the optimizations I can think of, but I'm still struggling to load everything in a usable amount of time. The tables are loaded and then combined into a fact table with about 16 million records.

 

Is it possible to split these 16 million records across different tables and schedule them to refresh at different times? If not, is there anything else I can try?

 

Thank you.

 

 

3 REPLIES
Anonymous
Not applicable

Hi @qmest ,

I think the first thing you can try is loading the data in batches using paging. Fetching a portion of the records at a time reduces memory usage and improves performance.
Secondly, you can cache frequently accessed data so that you avoid redundant API calls. Optimizing your database settings can also help it handle large amounts of data more efficiently.

Finally, you could consider using cloud-based services that provide tools for efficiently processing large data sets.

 

 

 

Best Regards

Yilong Zhou

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

AnalyticPulse
Solution Sage

Hi @qmest, you can use incremental load, data partitioning, data preprocessing, and transforming the data outside the Power BI environment to optimise your report.

You can also use DirectQuery mode.
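The incremental-load idea is the one most directly aimed at the 16-million-row fact table: only rows changed since the last successful refresh are fetched. A minimal sketch, assuming the source records carry a `modified_at` field (an assumption for illustration; in Power BI this corresponds to incremental refresh with RangeStart/RangeEnd filters):

```python
"""Minimal incremental-load sketch: fetch only rows changed since the
last load, tracked by a 'watermark' (the newest timestamp seen so far)."""

def incremental_load(source, watermark):
    """Return rows modified after `watermark`, plus the new watermark."""
    new_rows = [r for r in source if r["modified_at"] > watermark]
    new_watermark = max((r["modified_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# Illustrative in-memory stand-in for the API source.
source = [
    {"id": 1, "modified_at": "2024-05-01"},
    {"id": 2, "modified_at": "2024-05-10"},
    {"id": 3, "modified_at": "2024-05-15"},
]

# First run: everything is newer than the initial watermark.
rows, wm = incremental_load(source, "2024-01-01")

# Second run: nothing changed, so nothing is fetched.
rows2, wm2 = incremental_load(source, wm)
```

After the first full load, each scheduled refresh only moves the watermark forward and pulls the delta, which is usually a tiny fraction of the full table.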

 

Learn Power BI free:

https://analyticpulse.blogspot.com

Power BI: getting started

DAX functions

Power BI visualisation


 

SaiTejaTalasila
Super User

Hi,

 

I'm not sure whether it is the right approach or not, but you can try it.

You can try pulling your data from your REST API source into a data lake, storing it in Parquet format, and building your report on top of that. To do this, try building a PySpark or Scala program.

 

Please refer to this for more details: https://medium.com/@senior.eduardo92/rest-api-data-ingestion-with-pyspark-5c9c9ce89c9f
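A rough sketch of the ingestion step this reply describes: flatten the nested JSON that a REST API returns into flat rows suitable for a columnar format like Parquet. The record shape and field names below are invented for illustration; the final PySpark write would look roughly like `spark.createDataFrame(rows).write.mode("overwrite").parquet("/lake/fact")` (shown only as a comment, since it needs a Spark session).

```python
"""Flatten nested REST API JSON into flat rows for a data lake.

The last step in PySpark would be something like:
    spark.createDataFrame(rows).write.mode("overwrite").parquet("/lake/fact")
"""

def flatten(record):
    """Turn one nested API record into a flat row for columnar storage."""
    return {
        "order_id": record["id"],
        "amount": record["totals"]["amount"],
        "currency": record["totals"]["currency"],
    }

# Illustrative API response with the nesting a typical REST source returns.
api_response = [
    {"id": 1, "totals": {"amount": 10.5, "currency": "EUR"}},
    {"id": 2, "totals": {"amount": 99.0, "currency": "USD"}},
]

rows = [flatten(r) for r in api_response]
```

Once the data sits in the lake as Parquet, Power BI can read it directly (or via a Lakehouse), so the hundred API calls happen upstream on their own schedule instead of inside the report refresh.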
