Hi,
My colleague and I have noticed an issue with a few pipelines recently.
For context, the pipeline architecture is made up of 4 dataflows in a linear flow: the first two extract data from a PostgreSQL database and load it into a Bronze Lakehouse.
Dataflow 3 and dataflow 4 move the data from Bronze to Silver and Silver to Gold respectively.
What we are seeing is that the initial extracts (dataflows 1 and 2) run fine, but when the 3rd one runs it's as if it is reading a cached version of the Lakehouse that does not contain the newly populated data.
We have tried setting Wait activities between the dataflows, which has not worked.
We have also tried removing dataflows 3 and 4 from the pipeline and scheduling them to run directly from their settings, which also did not work.
Is there a current workaround for this?
Hi @Jester_3
Insert a lightweight Notebook between Dataflow 2 and Dataflow 3 that checks the latest commit on the Bronze Lakehouse table and exits with that value, then loop in the pipeline until the check confirms the new data is visible before Dataflow 3 runs.
What you’re hitting is the lag between physical writes and metadata visibility in Fabric’s SQL analytics endpoint. Dataflow 3 is likely reading stale metadata even though the upstream write to the Bronze Lakehouse has completed.
Your commit‑check workaround is solid. A few refinements worth noting for others:
- Poll Delta version rather than just the date, so multiple commits in a day don’t collide.
- Tune timeouts — most syncs finish in minutes, so shorter waits with retries can reduce pipeline latency.
In short, explicit synchronization is the safest way to ensure downstream dataflows operate on the latest state until Fabric improves metadata propagation.
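To illustrate the first refinement: if the check Notebook exits with the latest Delta table version instead of a date, the Until condition of the workaround could compare version numbers instead. A hypothetical sketch (the activity name DELTA_CHECK and the pipeline variable lastKnownVersion are illustrative, not from the original pipeline):
@greater(
int(activity('DELTA_CHECK').output.result.exitValue),
int(variables('lastKnownVersion'))
)
This way two commits on the same day are still distinguished, since the Delta version increments on every commit.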
Thanks for this suggested fix. I think I'll have to apply it to all my pipelines, as this issue has been happening more frequently this past week.
A longer description of the fix may be beneficial for others, so I've gone into detail below.
I ended up adding an Until loop with a timeout of an hour in my pipeline.
The conditional check on the pipeline was:
@equals(
activity('DELTA_CHECK').output.result.exitValue,
formatDateTime(utcNow(), 'yyyy-MM-dd')
)
The notebook content:
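The actual notebook body didn't survive the page extraction, so here is a minimal, hypothetical sketch of the commit-check logic it describes, assuming the Bronze table's `_delta_log` folder is reachable at a file path (the path and table name are illustrative). In a Fabric notebook, the final value would be returned with `mssparkutils.notebook.exit(...)` so the Until condition can read it from `activity('DELTA_CHECK').output.result.exitValue`.

```python
import os
import re
from datetime import datetime, timezone

# Delta commit files are 20-digit, zero-padded version numbers, e.g.
# 00000000000000000042.json, so lexical order equals version order.
COMMIT_FILE = re.compile(r"^\d{20}\.json$")

def latest_commit_date(delta_log_dir: str) -> str:
    """Return the UTC date (yyyy-MM-dd) of the newest Delta commit file."""
    commits = [f for f in os.listdir(delta_log_dir) if COMMIT_FILE.match(f)]
    if not commits:
        raise FileNotFoundError(f"no Delta commits found in {delta_log_dir}")
    newest = max(commits)  # highest zero-padded version
    mtime = os.path.getmtime(os.path.join(delta_log_dir, newest))
    return datetime.fromtimestamp(mtime, tz=timezone.utc).strftime("%Y-%m-%d")

# In the Fabric notebook the last line would be something like:
# mssparkutils.notebook.exit(
#     latest_commit_date("/lakehouse/default/Tables/my_table/_delta_log"))
```

Once the exit value matches today's date (per the `@equals(...)` condition above), the Until loop ends and Dataflow 3 runs against data that has actually been committed.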