I have built a report that uses incremental refresh. Many tables are loaded in this report; some of them hold a large amount of data, so I configured those for incremental refresh.
I set the policy to retain 10 years of data, although the data actually only goes back to 2022/1/1. I chose 10 years to keep the data from being archived out of the model.
When I refresh all the tables locally in the PBIX file, it only takes a few minutes. But after publishing to the Power BI Service, an incremental refresh following the initial full refresh takes several hours, and it sometimes fails with "The operation was throttled by Power BI because of insufficient memory."
Has anyone had the same experience, and how did you resolve it? Or is there any direction I can check?
Solved! Go to Solution.
Hi @JosieGuo ,
Thanks to @Tutu_in_YYC and @lbendlin for their replies; they are really helpful.
@JosieGuo As a supplement: a refresh that runs beyond 2 hours will fail on a non-Premium capacity. I recommend buying Premium capacity to increase the refresh time limit to 5 hours.
If your report is already on Premium capacity, you might consider using the XMLA endpoint to create more advanced partition and refresh scenarios and improve refresh performance. With Premium capacities, refresh operations performed through the XMLA endpoint have no time limit.
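As a hedged sketch of what such an XMLA-based refresh can look like (the database, table, and partition names below are placeholders, not from this thread): connect SSMS to the workspace's XMLA endpoint and refresh one partition at a time with a TMSL script, which keeps each operation small enough to stay within memory limits.

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "YourSemanticModelName",
        "table": "YourLargeFactTable",
        "partition": "YourPartitionName"
      }
    ]
  }
}
```

Run it from an XMLA query window in SSMS connected to the workspace endpoint (powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace, where the workspace name is a placeholder); because the operation goes through the XMLA endpoint, the scheduled-refresh time limit does not apply.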
Please refer to the Microsoft documentation on incremental refresh and on semantic model connectivity with the XMLA endpoint.
Best regards,
Mengmeng Li
Buckle up, this will be a wild ride. First things first:
- What is your data source, and does it support query folding?
- I assume you are aware that each time you publish from Desktop to the workspace, the incremental refresh partitions are gone and will only be recreated on the next refresh.
- How many rows are in each partition?
- Have you used SSMS or DAX Studio to validate that the partitions have been created and filled properly? (See the DMV sketch after this list.)
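Not from the thread, but as a minimal sketch of that last check: connect SSMS (XMLA/MDX query window) or DAX Studio to the published semantic model and run a dynamic management view (DMV) query. It returns one row per partition, including the ones generated by the incremental refresh policy, with their source queries and refresh timestamps.

```sql
-- Run against the workspace XMLA endpoint, e.g.
-- powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace (placeholder name).
-- One row per model partition: name, query definition, state, refresh times.
SELECT * FROM $SYSTEM.TMSCHEMA_PARTITIONS
```

If the incremental refresh policy is working, each large table should show one partition per period in the retention window rather than a single monolithic partition.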
1. The data source is Azure Synapse Analytics SQL.
2. I think it supports query folding; there is no warning about non-folding steps when I configure the incremental refresh settings, and the query can show the native SQL query in the Power Query Editor. (A sketch of the folding-friendly filter pattern follows this list.)
3. Yes. When I publish the report again, the historical data I already refreshed is cleared. Is there any way to avoid clearing the historical data?
4. I have about 10 tables in my report, and the largest one is 500,000 rows per day.
5. Yes, I have checked in SSMS.
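For readers following along, this is roughly what a folding-friendly incremental refresh query looks like in Power Query M; the server, database, table, and column names are placeholders, not taken from the thread. The essential point is that the RangeStart/RangeEnd filter folds into the native SQL the poster observed:

```powerquery
let
    // Placeholder Synapse endpoint and database.
    Source = Sql.Database("yourworkspace.sql.azuresynapse.net", "SalesDW"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // RangeStart/RangeEnd are the datetime parameters Power BI requires
    // for incremental refresh. Because this filter folds to the source,
    // each partition refresh reads only its own slice of rows.
    Filtered = Table.SelectRows(
        FactSales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

If a filter like this did not fold, every partition refresh would pull the full table and filter it locally, which is one common cause of service-side refreshes taking hours while local refreshes seem fast.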
Two things we can look into to start:
1. Using SSMS, connect to the semantic model and check whether the incremental refresh partitions are indeed being created.
2. "Insufficient memory" usually means you do not have the right license (i.e., capacity) to perform the operation. Are you on a Fabric/Premium capacity, or just a Pro/Free license?
The license is Premium Per User.
Can you provide a screenshot of the incremental refresh settings?
3. Use ALM Toolkit to selectively sync the changed metadata without a partition reset.
Why do you need incremental refresh for that data source?