Hi community!
I have two different dataflows. The first one connects to the source (Business Central), transforms some columns to text, and then has a data destination that dumps everything into a lakehouse table named bronze_glEntries (code down below).
The other Dataflow Gen2 gets the data from bronze_glEntries, changes some columns to integer, and then dumps it into another lakehouse table named silver_glEntries.
The destination has the "Replace" option set, which means the previous data in silver_glEntries should be deleted and replaced with the new data. But that isn't what happens: the data just doesn't update, and I don't know why.
The way I know is that I made a notebook to count all the rows:
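A minimal sketch of the count cell (Prod_LH is my lakehouse):
%%sql
-- count the rows in bronze and silver so the two tables can be compared after a refresh
SELECT 'bronze_glEntries' AS table_name, COUNT(*) AS row_count FROM Prod_LH.bronze_glEntries
UNION ALL
SELECT 'silver_glEntries' AS table_name, COUNT(*) AS row_count FROM Prod_LH.silver_glEntries;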
Does anybody know how to fix it or what the problem is?
Thanks in advance
Are the two Dataflows run on a scheduled refresh?
Have you checked that the first Dataflow has finished refreshing before the second Dataflow starts refreshing?
Or are they run inside a Data Pipeline? If so, how are the activities connected?
I'm trying both to see what happens.
The refreshes are scheduled one after another with a 30-minute margin, and the bronze DF takes about 10 minutes, so it should be fine, but sometimes it still doesn't work, as I said earlier.
The pipeline runs a couple of hours later, just to see if that also works; the activities are connected so that when bronze is done, the silver one runs. It worked yesterday, so we'll see later today if I get the same result, i.e. it keeps working.
If it still doesn't work, you can also consider creating a support ticket: https://support.fabric.microsoft.com/en/support/
One theory:
Your second dataflow is using the Lakehouse.Contents() M function (you can verify this in the Advanced Editor in your dataflow) to get the content from the Lakehouse.
The Lakehouse.Contents() function uses the SQL Analytics Endpoint of the Lakehouse. At least this is true when using Lakehouse.Contents() in Power BI Desktop. I'm guessing it's the same also when using Lakehouse.Contents() in Dataflows Gen2.
So if there is a long delay to sync data between your Lakehouse and SQL Analytics Endpoint, this might explain the behaviour you're seeing, because your second dataflow would be querying the SQL Analytics Endpoint.
https://www.reddit.com/r/MicrosoftFabric/s/leYLOX5uRK
https://www.reddit.com/r/MicrosoftFabric/s/nexpMsKFfx
https://learn.microsoft.com/en-us/fabric/data-warehouse/sql-analytics-endpoint-performance
To verify whether the Dataflow is actually querying the SQL Analytics Endpoint, you could go to the SQL Analytics Endpoint -> queryinsights -> exec_requests_history and check whether your Dataflow's queries appear there.
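A sketch of such a check, run in the SQL Analytics Endpoint query editor (I'm assuming the standard queryinsights columns here):
-- list the most recent queries the endpoint served; look for the Dataflow's queries
SELECT TOP 20 start_time, login_name, command
FROM queryinsights.exec_requests_history
ORDER BY start_time DESC;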
After reading the link you sent, it seems there is a delay in transferring the data, so I implemented the Wait activity and set the wait to 3 minutes. So when bronze is done, the pipeline waits 3 minutes before executing the silver Dataflow Gen2, and it seems to work.
Thanks for your help, very appreciated!
Thanks very much for that!
It seems like more people are saying the same thing, so I'll try to set a wait timer between the dataflows and see what the sweet spot is for it to transfer from bronze to silver and take all the rows with it.
Do you get the same result if you query the SQL Analytics Endpoint of the silver lakehouse, or if you query the silver lakehouse from Power BI Desktop (use "let Source = Lakehouse.Contents() in Source" in the Advanced Editor in Power BI Desktop)?
Or if you connect a Dataflow to the silver lakehouse and count the rows in the table.
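For a quick check against the endpoint itself, a sketch like this (assuming the table lands in the default dbo schema):
-- T-SQL against the SQL Analytics Endpoint; if this count lags the notebook count, the endpoint sync is behind
SELECT COUNT(*) AS silver_rows FROM dbo.silver_glEntries;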
Just for clarification: it's the same lakehouse, I just named the table silver_*tablename* :).
I get the same result in the notebook and in the SQL query. The dataflow gives me no error; it all seems to work. But it works about 50/50: sometimes when I refresh the dataflow manually it works, and sometimes it doesn't.
I tried waiting to see if it's a delay, but it seems it only works every now and then, which is very weird.
When it doesn't work: is the data in the silver table unchanged after refreshing the dataflow, meaning exactly the same old data as before the dataflow run?
When it works: does it sometimes work after having not worked? That is, sometimes it doesn't work and you just see old data, and then sometimes it works and you get the new data?
Could you run this code in a Notebook cell:
%%sql
DESCRIBE HISTORY Prod_LH.silver_glEntries;
That gives you a list of all the versions of the table. You could then check whether a new version is created at the time of the "failed" dataflow runs, or whether nothing happens to the table at all at those times.
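If a new version does get created, you could also use Delta time travel to check how many rows a given version actually contains; a minimal sketch (the 5 is a placeholder for a version number taken from the history output):
%%sql
-- count the rows as they were at a specific table version
SELECT COUNT(*) AS rows_at_version
FROM Prod_LH.silver_glEntries VERSION AS OF 5;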
Thanks for your answer!
Well, after I ran that, the operation column says "update" on every row, so I guess it's updating the table, but sometimes it just doesn't take all the new rows from bronze to silver.
It's just weird that the scheduled refresh for the dataflow only works sometimes. For instance, yesterday it worked, but today it only updated the bronze table and not the silver one; after a manual refresh, though, I had the same total row count.
On top of that, the dataflow for silver didn't report a failure, just a regular pass.