Hi
I am getting a Cache Entry Size Limit error in our dataflow, and now the dataflow is no longer refreshing.
Error: PipelineException: The evaluation reached the allowed cache entry size limit. Try increasing the allowed cache size. . RootActivityId = 03b19eb9-3093-4c92-8e40-1aa8b7c2c4ba.Param1 = PipelineException: The evaluation reached the allowed cache entry size limit. Try increasing the allowed cache size. Request ID: 9c148cff-f02c-daf3-cd60-4f7b1e42b684.
The table itself is an Excel file on OneDrive, about 406 MB with over 6M rows.
How can we resolve this?
Thanks
Viral
Hi @viralpatel21 ,
A similar error was tracked in ICM: 308667970.
Feedback from the PG team:
"In this case the customer seems to be using the ODBC connector and connecting via an ODBC DSN; this connector doesn't seem to support any options related to concurrency.
They can try the following step: increase the Windows page file size."
Best Regards
Lucien
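For anyone hitting the same limit: one general mitigation (not from this thread, and separate from the page file suggestion above) is to shrink what a single evaluation has to cache by trimming the Excel source as early as possible in the dataflow query. Below is a minimal Power Query M sketch; the file URL, sheet name, and column names are all hypothetical placeholders, not the poster's actual data.

let
    // Hypothetical OneDrive/SharePoint path to the workbook (not the poster's actual file)
    Source = Excel.Workbook(
        Web.Contents("https://contoso-my.sharepoint.com/personal/someone/Documents/BigTable.xlsx"),
        null,
        true
    ),
    // Hypothetical sheet name
    Sheet = Source{[Item = "Data", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true]),
    // Keep only the columns actually needed downstream (hypothetical names)
    KeptColumns = Table.SelectColumns(Promoted, {"Date", "Region", "Amount"}),
    // Set the type before filtering so the comparison below is reliable
    Typed = Table.TransformColumnTypes(KeptColumns, {{"Date", type date}}),
    // Drop rows that will never be used, before any expensive steps
    Filtered = Table.SelectRows(Typed, each [Date] >= #date(2024, 1, 1))
in
    Filtered

Removing unused columns and filtering rows before any merges, sorts, or buffering steps can reduce the size of the intermediate results the mashup engine has to cache, which is the limit this error refers to.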