Hi,
We are currently using Premium Gen2 capacity (P3), and our dataflow that sources from an on-premises SQL Server fails consistently once it hits the 20-minute mark. Refreshes that finish before 20 minutes succeed.
With Gen2 we no longer have the ability to configure the dataflow memory and container size, so we're not sure where else to look for settings that could stop the refresh from timing out.
Thanks
Hi @scabral ,
Could you please provide a screenshot of the refresh error? What is your data source? Was any error reported before the refresh started?
Is this the error that was reported: "Your (…) dataflow couldn't be refreshed because there was a problem with one or more entities, or because dataflow capabilities were unavailable."?
A failing dataflow refresh on Premium Gen2 can show one of the symptoms described in the known issue below:
More details: Known issue - Long running, failed or stuck dataflow in Premium Gen2
If I have misunderstood your meaning, please provide more details.
Best Regards
Community Support Team _ Polly
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.