Hi,
We are currently using a Premium Gen2 capacity (P3), and our dataflow that sources from an on-premises SQL Server fails once it hits the 20-minute mark. Refreshes that finish in under 20 minutes succeed, but at the 20-minute mark it fails consistently.
With Gen2 we no longer have the ability to configure dataflow memory and container size, so I'm not sure where else to look for a setting that would stop it from timing out.
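For anyone debugging the same symptom, a minimal sketch (Python with pyodbc; the driver string, server, database, and query below are placeholders, not the actual setup described above) of timing the source query directly against the on-premises SQL Server. If the raw query alone already runs close to 20 minutes, the time is going into the source read rather than into dataflow processing:

```python
# Hedged diagnostic sketch: time the dataflow's source query directly against
# the on-premises SQL Server. All names below are placeholders.
import time
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-sql01;"      # placeholder server name
    "DATABASE=SourceDb;"        # placeholder database name
    "Trusted_Connection=yes;"
)

query = "SELECT * FROM dbo.SourceTable"  # placeholder for the entity's source query

start = time.monotonic()
cursor = conn.cursor()
cursor.execute(query)
rows = cursor.fetchall()        # reading every row roughly approximates a full refresh
elapsed = time.monotonic() - start

print(f"Rows read: {len(rows)}; elapsed: {elapsed / 60:.1f} minutes")
conn.close()
```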
thanks
Hi @scabral,
Could you please provide a screenshot of the refresh error? What is your data source? Was there any error reported before the refresh?
“Your (…) dataflow couldn’t be refreshed because there was a problem with one or more entities, or because dataflow capabilities were unavailable.” Is this the error that was reported?
A dataflow refresh affected by this issue will show one of the symptoms described in the known issue below (long running, failed, or stuck).
More details: Known issue - Long running, failed or stuck dataflow in Premium Gen2
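If a screenshot is hard to capture, the dataflow's refresh history (start time, end time, and status) can also be pulled with the Power BI REST API's dataflow transactions endpoint. A minimal sketch in Python with requests, assuming placeholder workspace and dataflow IDs and an Azure AD access token obtained separately (for example via MSAL):

```python
# Hedged sketch: list refresh transactions for a dataflow via the Power BI REST API.
# The IDs and token below are placeholders you would supply yourself.
import requests

GROUP_ID = "<workspace-guid>"      # placeholder workspace (group) ID
DATAFLOW_ID = "<dataflow-guid>"    # placeholder dataflow ID
TOKEN = "<aad-access-token>"       # acquire via MSAL or another Azure AD flow

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/dataflows/{DATAFLOW_ID}/transactions"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Print when each refresh started and ended and how it finished, to confirm
# the ~20-minute failure pattern. Field names follow the documented payload;
# access them defensively in case the shape differs.
for tx in resp.json().get("value", []):
    print(tx.get("startTime"), tx.get("endTime"), tx.get("status"), tx.get("refreshType"))
```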
If I have misunderstood your meaning, please provide more details.
Best Regards
Community Support Team _ Polly
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.