Hello everyone!
I am currently facing a problem with Gen2 Dataflows in Fabric. After being published, the Dataflow refreshes itself. When this happens, there is a chance that the refresh takes ages (it has been loading for 5 hours now), although it only took a couple of minutes each time before this odd behaviour occurred.
This does seem like an error to me, as there is no obvious explanation (no increase in complexity, no renaming, nothing of the sort) for the long loading times. Is there a way to fix it, or anything I need to pay attention to that might cause this kind of behaviour? Or could it just be a bug?
The single table I want to load into a Lakehouse is 5 columns by approx. 25,000 rows, so not too large either. Data types have been declared for each column.
Thanks in advance for your tips.
Best,
Alex
What else is happening on that capacity?
There's a bunch of Dataflows (Gen1 and Gen2), semantic models, reports, and 2 Lakehouses in the same workspace. A colleague of mine also recently set up Git as a version-management tool. Could Git have any influence on this? I assumed that, since Dataflows aren't supported by Git integration yet anyway, the problem with the Dataflow couldn't be related to that.
I would analyze the living daylights out of the Fabric Capacity Metrics app for that capacity. See if you can spot the biggest CU consumers.
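If the refresh really is stuck, it can also help to check (and cancel) the run outside the UI. Below is a minimal sketch, assuming the Fabric REST API Job Scheduler endpoints and a pre-acquired Entra ID bearer token with access to the workspace; the workspace, dataflow and job-instance IDs are placeholders, and the response field names are what the API documented at the time of writing, so treat them as assumptions:

```python
import requests

# Placeholders / assumptions: real GUIDs come from the item URLs, and the
# token would be acquired via azure-identity or MSAL with Fabric API scope.
WORKSPACE_ID = "<workspace-guid>"
DATAFLOW_ID = "<dataflow-gen2-item-guid>"
TOKEN = "<bearer-token>"

BASE = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/items/{DATAFLOW_ID}"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# List recent job instances (refreshes) for the dataflow to see their
# status and how long each run has actually been going.
resp = requests.get(f"{BASE}/jobs/instances", headers=HEADERS)
resp.raise_for_status()

for job in resp.json().get("value", []):
    print(job.get("id"), job.get("status"),
          job.get("startTimeUtc"), job.get("endTimeUtc"))

# If one instance has been "InProgress" for hours, cancel it so the next
# scheduled or manual refresh can start cleanly.
stuck_job_id = "<job-instance-guid>"
cancel = requests.post(f"{BASE}/jobs/instances/{stuck_job_id}/cancel", headers=HEADERS)
print(cancel.status_code)  # 202 is expected if the cancel request was accepted
```

Cancelling the stuck instance and kicking off the refresh again manually is usually worth trying first; the Capacity Metrics app can then tell you whether something else on the capacity was starving that refresh of CUs.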