Hi,
We import and parse JSON data from Azure Blob Storage.
I understand the limit on dataset size is 10 GB and the refresh timeout is 2 hours.
We are hitting this timeout with a data source of 5.5 GB.
In fact, I tried with a storage account that holds 250 MB of data, and it took around 15 minutes to fetch and display the report.
As per the above link: "Scheduled refresh for imported datasets timeout after two hours. If you are encountering this limit, you can consider reducing the size or complexity of your dataset, or consider breaking the dataset into smaller pieces." (One way to do that split is sketched after the questions below.)
1. I would like to know what optimization techniques I can apply to my data model to process 10 GB of blob data within 2 hours. Is this possible?
2. Is there any limitation on Azure Blob data that would prevent that level of performance? If so, how large a blob data source is supported given the timeout limit?
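For illustration, here is a minimal sketch of the "smaller pieces" idea quoted above: pre-partition the JSON in Blob Storage so each query reads one slice instead of the whole file. It uses the azure-storage-blob Python SDK; the container name "reports", the file "source.json", and the "date" field on each record are assumptions, not details from this thread.

```python
# Sketch: split one large JSON array into smaller, date-partitioned blobs
# so a refresh can load one slice at a time instead of the full 5.5 GB file.
# Assumes a connection string in the environment, a container named
# "reports" (hypothetical), and records that carry a "date" field.
import json
import os
from collections import defaultdict

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("reports")

# Group records by month. At multi-GB sizes you would stream-parse
# (e.g. with a library like ijson) rather than json.load the whole file.
buckets = defaultdict(list)
with open("source.json", encoding="utf-8") as f:
    for record in json.load(f):
        buckets[record["date"][:7]].append(record)  # "2019-06-15" -> "2019-06"

# Upload one blob per month, e.g. partitioned/2019-06.json.
for month, records in buckets.items():
    container.upload_blob(
        name=f"partitioned/{month}.json",
        data=json.dumps(records),
        overwrite=True,
    )
```

A date-partitioned layout like this is also exactly the shape that incremental refresh (suggested below) can take advantage of.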
Hi V55,
Perhaps incremental refresh can solve your problem. For more details, please refer to: https://docs.microsoft.com/en-us/power-bi/service-premium-incremental-refresh.
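To make the mechanism concrete: incremental refresh filters the source on two datetime parameters, RangeStart and RangeEnd, so each scheduled refresh reads only recent data. In Power BI that filter is written in Power Query, but the windowing idea can be sketched in Python against the date-partitioned blob layout from the earlier example (all names are illustrative):

```python
# Illustration of the incremental-refresh windowing idea, not Power BI's
# actual implementation (which expresses this filter in Power Query).
# Assumes the partitioned/YYYY-MM.json layout from the earlier sketch.
import json
import os
from datetime import datetime

from azure.storage.blob import BlobServiceClient


def fetch_partitions(range_start: datetime, range_end: datetime) -> list:
    """Download only the monthly partitions that fall inside the window."""
    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    container = service.get_container_client("reports")
    rows = []
    for blob in container.list_blobs(name_starts_with="partitioned/"):
        # Blob names look like partitioned/2019-06.json.
        month = datetime.strptime(blob.name.split("/")[1][:7], "%Y-%m")
        if range_start <= month < range_end:
            data = container.download_blob(blob.name).readall()
            rows.extend(json.loads(data))
    return rows


# A scheduled refresh would pull only the newest window, for example:
# fetch_partitions(datetime(2019, 5, 1), datetime(2019, 7, 1))
```

Historical partitions that are already loaded are left alone, which is why refreshes stay fast after the initial load.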
Regards,
Jimmy Tao
Hi @v-yuta-msft,
Thanks for the response.
How does incremental refresh solve the issue for the first load, which will be a full load of the data?
If it times out on that first full load, then it doesn't solve the problem, right?
Let me know if I'm missing anything here.
Thanks.