dbeavon3
Memorable Member

Fastest possible access time for Power Platform dataflows? (Gen2 DF in 2024)

It takes a minimum of roughly 10 seconds to get a tiny table out of Power Platform dataflows.

 

The status message says "waiting for powerplatformdataflows", then "waiting for lakehouse".

 

For a table with a single record, being forced to wait 10 seconds seems extreme.  It really hurts productivity during development (the inner loop) when there is 10 seconds of overhead on every single table, regardless of how small it is.  I can compile a very large C# application in that amount of time.  Why can't I get a single record out of Azure any faster than this?

Here is a sample of the Power Query.  We have a Premium P1 capacity.

 

[Screenshot: Power Query sample (dbeavon3_0-1727113809244.png)]
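For reference, a query against the Power Platform dataflows connector for a single small table typically looks something like the sketch below (the navigation keys and table name are placeholders, not the exact ones from the screenshot):

    let
        // Power Platform dataflows connector; the IDs and table name below are placeholders
        Source = PowerPlatform.Dataflows(null),
        Workspaces = Source{[Id = "Workspaces"]}[Data],
        Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
        Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
        // The "tiny table": a single record, roughly 1 KB
        TinyTable = Dataflow{[entity = "TinyTable", version = ""]}[Data]
    in
        TinyTable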

 

 

Any tips would be appreciated.

2 REPLIES
GilbertQ
Super User

Hi @dbeavon3 

 

While I can understand that 10 seconds feels like a long time, even on a Premium capacity the service still needs time to acquire resources before it can process what you have asked it to do.






dbeavon3

@GilbertQ 

If I had told you it took 60 seconds to retrieve a single 1 KB row of data, would you have believed me?  Or 600 seconds?  I think 10 seconds of overhead per table is crazy, and it is hard to find a way to short-circuit it.  Ideally there would be a "mock Fabric" or "sync Fabric" or something else that we could run locally on the desktop (something incredibly fast, instead of incredibly slow).


Do you know if there is any way to dig into these delays to isolate where they are coming from?  Another thing I noticed is that the delays are far longer (e.g. an additional ~10 seconds) when refreshing the final dataset than when interacting with the Gen2 dataflow in the "transform/editing" window.
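One thing I might try (just a sketch, and I haven't verified how reliable it is): wrap the final step in Diagnostics.Trace and run Query Diagnostics in Power BI Desktop (Tools > Start Diagnostics in the Power Query editor) against the same query, to see how much of the wall-clock time is connector/startup overhead versus the actual data transfer:

    let
        // Same placeholder navigation as in the query above
        Source = PowerPlatform.Dataflows(null),
        Workspaces = Source{[Id = "Workspaces"]}[Data],
        Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
        Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
        TinyTable = Dataflow{[entity = "TinyTable", version = ""]}[Data],
        // Emit a trace marker when the table is actually pulled; Table.Buffer forces the
        // full fetch so the surrounding diagnostics entries reflect the real retrieval time
        Traced = Diagnostics.Trace(
            TraceLevel.Information,
            "TinyTable evaluated",
            () => Table.Buffer(TinyTable),
            true)
    in
        Traced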

 

 

I'm guessing there are some workarounds, assuming I can isolate the source of the delays.  Perhaps workarounds are available without even needing to leave Fabric.

 

... But as a last resort, we might consider sending the results of our dataflows to an independent ADLS storage account, for the sake of fast and reliable retrieval:

 

https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-azure-data-lake-stora...

 

By setting some portions of Fabric aside, I'm certain that I can get predictable performance by ingesting from a normal ADLS storage account.
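For what it's worth, here is a rough sketch of that last-resort path, assuming the bring-your-own-storage setup from the link above writes the dataflow output as CSV snapshots in a CDM folder (the storage account, filesystem, and folder names are made up):

    let
        // ADLS Gen2 connector; account, filesystem, and folder names are placeholders
        Files = AzureStorage.DataLake("https://mystorageaccount.dfs.core.windows.net/powerbi"),
        // Keep only the snapshot files for the table we want
        TinyTableFiles = Table.SelectRows(Files, each Text.Contains([Folder Path], "MyDataflow/TinyTable")),
        // Take the most recent snapshot
        LatestSnapshot = Table.FirstN(Table.Sort(TinyTableFiles, {{"Date modified", Order.Descending}}), 1),
        // The CSV snapshots typically have no header row; column names live in model.json
        TinyTable = Csv.Document(LatestSnapshot{0}[Content])
    in
        TinyTable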

 

 

 

 

 
