qmest
Frequent Visitor

Optimizing data loading process from APIs

Hi,

 

I have a report that loads about a hundred tables from APIs. This process is very slow, and I have applied most of the optimizations I can think of, but I'm still struggling to load everything in a usable amount of time. The tables get loaded and then combined into a fact table with about 16 million records.

 

Is it possible to split these 16 million records across different tables and schedule them to refresh at different times? If not, is there anything else I can try?

 

Thank you.

 

 

3 REPLIES
Anonymous
Not applicable

Hi @qmest ,

I think the first thing you can try is loading the data in batches using paging. This lets you fetch a portion of the records at a time, which reduces memory usage and improves performance.
Secondly, you can cache API responses that have already been fetched so that you avoid redundant API calls. Optimizing the source database settings can also help it handle large amounts of data more efficiently.
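As a rough illustration of the paging idea in Python: loop over pages until the API returns an empty one, accumulating records as you go. The names here (`fetch_all_pages`, `fake_api`) are hypothetical; a real implementation would make an HTTP request per page instead of the stub.

```python
def fetch_all_pages(fetch_page, page_size=1000):
    """Pull records one page at a time until the API returns an empty page."""
    records = []
    page = 1
    while True:
        batch = fetch_page(page=page, page_size=page_size)
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

# Stub standing in for a real API call (e.g. an HTTP GET with page params).
def fake_api(page, page_size):
    data = list(range(25))  # pretend the API holds 25 records
    start = (page - 1) * page_size
    return data[start:start + page_size]

rows = fetch_all_pages(fake_api, page_size=10)
print(len(rows))  # 25
```

Smaller pages keep memory flat; the trade-off is more round-trips, so tune `page_size` to whatever the API allows per request.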
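A minimal sketch of the caching idea, assuming a reference table only needs to be fetched once per refresh. `get_reference_table` is a hypothetical helper; `functools.lru_cache` does the memoization:

```python
from functools import lru_cache

call_count = 0  # tracks how many real "API hits" occur

@lru_cache(maxsize=None)
def get_reference_table(name):
    """Fetch a lookup table once; repeated calls return the cached copy."""
    global call_count
    call_count += 1
    # In a real pipeline this would be an HTTP request to the API.
    return (name, "payload")

get_reference_table("currencies")
get_reference_table("currencies")  # served from cache, no second API hit
print(call_count)  # 1
```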

Finally, you could consider using cloud-based services that provide tools for efficiently processing large data sets.

 

 

 

Best Regards

Yilong Zhou

If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

AnalyticPulse
Solution Sage

Hi @qmest, you can use incremental load, data partitioning, data preprocessing, and transforming the data outside the Power BI environment to optimize your report.

You can also use DirectQuery mode.
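The incremental-load idea can be sketched as a watermark pattern: keep what you already loaded and fetch only rows newer than the latest loaded date. All names below are hypothetical stand-ins; in Power BI itself you would configure incremental refresh with the `RangeStart`/`RangeEnd` parameters rather than write this by hand.

```python
from datetime import date

def incremental_load(existing, fetch_since):
    """Keep previously loaded rows; fetch only rows newer than the watermark."""
    watermark = max((row["date"] for row in existing), default=date.min)
    new_rows = fetch_since(watermark)
    return existing + new_rows

history = [{"id": 1, "date": date(2024, 1, 1)},
           {"id": 2, "date": date(2024, 1, 2)}]

# Stub for the API: return only rows strictly after the watermark.
SOURCE = history + [{"id": 3, "date": date(2024, 1, 3)}]
def fake_fetch_since(watermark):
    return [r for r in SOURCE if r["date"] > watermark]

combined = incremental_load(history, fake_fetch_since)
print([r["id"] for r in combined])  # [1, 2, 3]
```

Each refresh then moves only the new slice of the 16 million rows instead of reloading everything.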

 

Learn Power BI free:

https://analyticpulse.blogspot.com

Power BI: getting started

DAX functions

Power BI Visualisation


 

SaiTejaTalasila
Super User

Hi,

 

I'm not sure whether it is the right approach or not, but you can try it.

You can try pulling your data from your REST API source into a data lake, storing it in Parquet format, and building on top of it. To do this, you could write a PySpark or Scala program.

 

Please refer to this for more details: https://medium.com/@senior.eduardo92/rest-api-data-ingestion-with-pyspark-5c9c9ce89c9f
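A rough sketch of that batch-ingestion idea, with stubs in place of the live API and the Parquet writer so it runs standalone. In a real PySpark job, `write_batch` would do something like `spark.createDataFrame(batch).write.mode("append").parquet(path)`; all names here are illustrative.

```python
def ingest_to_lake(fetch_page, write_batch, page_size=1000):
    """Stream API pages into the lake batch by batch, instead of holding
    all ~16M rows in memory at once."""
    page, total = 1, 0
    while True:
        batch = fetch_page(page=page, page_size=page_size)
        if not batch:
            break
        write_batch(batch, part=page)  # one Parquet part file per page
        total += len(batch)
        page += 1
    return total

# Stubs so the sketch runs without Spark or a live API.
def fake_fetch(page, page_size):
    data = [{"id": i} for i in range(2500)]
    start = (page - 1) * page_size
    return data[start:start + page_size]

written = []
def fake_write(batch, part):
    written.append(len(batch))

total = ingest_to_lake(fake_fetch, fake_write, page_size=1000)
print(total, written)  # 2500 [1000, 1000, 500]
```

Once the data sits in Parquet, Power BI can read the pre-combined fact table from the lake instead of hitting a hundred APIs on every refresh.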
