Anonymous
Not applicable

Dataflow gen 2 doesn't transfer data between lakehouse tables?

Hi community!

I have two different dataflows. In the first one, I connect to the source (Business Central), transform some columns to text, and then use a data destination to load everything into a lakehouse table named bronze_glEntries (screenshot below):

[screenshot: weetom_0-1723186036878.png]


The other Dataflow Gen 2 gets the data from bronze_glEntries, changes some columns to integer, and then dumps the result into another lakehouse table named silver_glEntries.
The destination has the option "replace", which means the previous data in silver_glEntries should be deleted and replaced with the new data. But that isn't what happens: the data just doesn't update, and I don't know why.

[screenshot: weetom_1-1723186237310.png]



I know this because I created a notebook to count all the rows:

[screenshot: weetom_2-1723186434297.png]

 

Does anybody know how to fix it or what the problem is?
Thanks in advance

1 ACCEPTED SOLUTION
Anonymous
Not applicable

[screenshot: weetom_0-1723703881990.png]

After reading the link you sent, it seems there is a delay before the data syncs, so I added a Wait activity set to 3 minutes: when the bronze dataflow finishes, the pipeline waits 3 minutes before executing the silver Dataflow Gen 2. That seems to work.
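A fixed 3-minute Wait works, but the sync delay can vary. As an alternative sketch (pure Python; the commented usage is hypothetical and would run in a Fabric notebook activity instead of the Wait activity), you could poll until the downstream table catches up, with a timeout:

```python
import time

def wait_until(predicate, timeout_s=600, interval_s=30):
    """Poll predicate() until it returns True or timeout_s elapses.

    Returns True if the predicate succeeded, False on timeout.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval_s)
    return False

# Hypothetical usage in a Fabric notebook (names are illustrative):
#   bronze = spark.read.table("Prod_LH.bronze_glEntries").count()
#   ok = wait_until(
#       lambda: spark.read.table("Prod_LH.silver_glEntries").count() >= bronze,
#       timeout_s=600, interval_s=30)
#   if not ok:
#       raise TimeoutError("silver table did not catch up in time")
```

The advantage over a fixed Wait is that the pipeline proceeds as soon as the data is actually there, and fails loudly when it never arrives instead of silently running against stale data.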

Thanks for your help, much appreciated!

 

View solution in original post

10 REPLIES
frithjof_v
Super User

Are the two Dataflows run on a scheduled refresh?

 

Have you checked that the first Dataflow has finished refreshing before the second Dataflow starts refreshing?

 

Or are they run inside a Data Pipeline? If so, how are the activities connected?

Anonymous
Not applicable

I'm trying both to see what happens.
The scheduled refreshes run one after the other with a 30-minute margin, and the bronze dataflow takes about 10 minutes, so it should be fine, but sometimes it still doesn't work, as I said earlier.

The pipeline runs a couple of hours later just to check whether it also works; the activities are connected so that when bronze is done, the silver one runs. It worked yesterday, so I'll see later today whether I get the same result, i.e. it keeps working.

If it still doesn't work, you can also consider creating a support ticket: https://support.fabric.microsoft.com/en/support/

One theory:

 

Your second dataflow is using Lakehouse.Contents() M function (you can verify this in Advanced Editor in your dataflow) to get the content from the Lakehouse.

 

The Lakehouse.Contents() function uses the SQL Analytics Endpoint of the Lakehouse. At least this is true when using Lakehouse.Contents() in Power BI Desktop. I'm guessing it's the same also when using Lakehouse.Contents() in Dataflows Gen2.

 

So if there is a long delay to sync data between your Lakehouse and SQL Analytics Endpoint, this might explain the behaviour you're seeing, because your second dataflow would be querying the SQL Analytics Endpoint.

 

https://www.reddit.com/r/MicrosoftFabric/s/leYLOX5uRK

 

https://www.reddit.com/r/MicrosoftFabric/s/nexpMsKFfx

 

https://learn.microsoft.com/en-us/fabric/data-warehouse/sql-analytics-endpoint-performance

 

To verify whether the Dataflow is actually querying the SQL Analytics Endpoint, you could go to the SQL Analytics Endpoint -> queryinsights -> exec_requests_history and check whether your Dataflow's queries appear there.
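As a sketch of that check (column names are my assumption based on the queryinsights documentation, so adjust them to what you see in your endpoint; the table name matches the notebook snippet later in this thread), something like this run against the SQL Analytics Endpoint should surface the Dataflow's queries:

```sql
-- Recent completed requests recorded by the SQL analytics endpoint;
-- filter for statements that touch the bronze table.
SELECT TOP (50)
    start_time,
    total_elapsed_time_ms,
    command
FROM queryinsights.exec_requests_history
WHERE command LIKE '%bronze_glEntries%'
ORDER BY start_time DESC;
```

If the Dataflow's refresh shows up here at refresh time, that confirms it reads through the SQL endpoint and could therefore be hitting stale, not-yet-synced data.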

Anonymous
Not applicable

Thanks very much for that!
It seems like more people are saying the same thing, so I'll try setting a wait timer between the dataflows and see what the sweet spot is for the transfer from bronze to silver to take all the rows with it.

frithjof_v
Super User

Do you get the same result if you query the SQL Analytics Endpoint of the silver lakehouse, or if you query the silver lakehouse from Power BI Desktop (use "let Source = Lakehouse.Contents() in Source" in the Advanced Editor in Power BI Desktop)?

 

Or if you connect a Dataflow to the silver lakehouse and count the rows in the table.

Anonymous
Not applicable

Just for clarification, it's the same lakehouse; I just named the table silver_*tablename* :).

I get the same result in the notebook and in a SQL query. The dataflow gives me no error; it all seems to work. But it works about 50/50: sometimes when I refresh the dataflow manually it works, and sometimes it doesn't.

I tried waiting to see if it's a delay, but it seems it only works every now and then, which is very weird.

When it doesn't work: is the data then unchanged in the silver table after refreshing the dataflow? Meaning it is still exactly the same old data there as before the dataflow run?

 

When it works: does it sometimes work after it has not worked? So sometimes it is not working and you just see old data, and then sometimes it works and then you get the new data?

 

 

Could you run this code in a Notebook cell:

 

%%sql

DESCRIBE HISTORY Prod_LH.silver_glEntries;

 

This gives you a list of all the versions of the table. You can then check whether a new version is created at the time of the "failed" dataflow runs, or whether nothing happens to the table at those times.

Anonymous
Not applicable

Thanks for your answer! 
Well, the operation column after running that says UPDATE on every row, so I guess it is updating the table, but sometimes it just doesn't take all the new rows from bronze to silver.

It's just weird that the scheduled dataflow only works sometimes; for instance, yesterday it worked, but today it only updated the bronze table and not the silver one. After a manual refresh, though, I had the same total row count.

On top of that, the silver dataflow didn't report a failure, just a regular pass.
