Stefanve
Frequent Visitor

Dataflow Gen2 not reading latest data after Copy Job

We are experiencing an identical issue with two clients: a Dataflow Gen2 in Fabric (Silver layer) does not pick up recent data from the Bronze layer during the nightly pipeline run, even though the preceding Copy job has successfully copied the recent data into Bronze.


The problem:

  • New records written to Bronze by the Copy job are not visible to the Silver Dataflow during the pipeline run.
  • Even with a wait delay of 20 minutes between Bronze and Silver, new records are not visible in the Silver dataflow.
  • In the morning, when we open the dataflow, the Preview does show the recent data. After a manual refresh, the new records are processed.
  • The dataflow contains no filter transformations and refers to the correct Lakehouse tables.

Any assistance, insights, or suggestions to help resolve this issue would be greatly appreciated.  

1 ACCEPTED SOLUTION
Gpop13
Advocate IV

Hi @Stefanve - Have you tried triggering the SQL endpoint refresh before the bronze to silver dataflow execution?


https://blog.fabric.microsoft.com/en-us/blog/refresh-sql-analytics-endpoint-metadata-rest-api-now-in...

I had a similar issue (with notebooks instead of a Dataflow Gen2), and I added an extra activity between Bronze and Silver in the pipeline that calls a notebook using this API to trigger the SQL endpoint refresh of the source lakehouse.


https://github.com/microsoft/fabric-toolbox/blob/main/samples/notebook-refresh-tables-in-sql-endpoin...
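The notebook activity can be a very small script. Below is a minimal sketch of what such a call might look like, assuming the preview REST route described in the blog post above (`POST .../workspaces/{workspaceId}/sqlEndpoints/{endpointId}/refreshMetadata?preview=true`); the workspace ID, endpoint ID, and token acquisition are placeholders you would replace with your own values (the linked fabric-toolbox sample shows a fuller version):

```python
# Sketch: trigger a SQL analytics endpoint metadata refresh from a notebook
# activity placed between the Copy job (Bronze) and the dataflow (Silver).
# The API route is the preview route from the Fabric blog post; IDs and the
# bearer token below are placeholders, not real values.
import json
import urllib.request

API_BASE = "https://api.fabric.microsoft.com/v1"

def refresh_url(workspace_id: str, endpoint_id: str) -> str:
    """Build the metadata-refresh URL for a SQL analytics endpoint."""
    return (f"{API_BASE}/workspaces/{workspace_id}"
            f"/sqlEndpoints/{endpoint_id}/refreshMetadata?preview=true")

def refresh_sql_endpoint(workspace_id: str, endpoint_id: str, token: str) -> int:
    """POST the refresh request and return the HTTP status code.
    The refresh runs asynchronously on the service side."""
    req = urllib.request.Request(
        refresh_url(workspace_id, endpoint_id),
        data=json.dumps({}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In a Fabric notebook you would typically get the token from the runtime (e.g. via `notebookutils`/`sempy`) rather than hard-coding it; the point is simply that the pipeline waits for this activity before starting the Silver step.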


3 REPLIES
Stefanve
Frequent Visitor

Thank you for your answers,

@tayloramy The Dataflow Gen2 source is connected to the bronze lakehouse and not to the SQL endpoint.

@Gpop13 Thanks for the links; I will test the refresh of the SQL endpoint. However, I still don’t understand why this is necessary since I’m not using the SQL endpoint.

The interesting part is that, as a test, I added a notebook with a query to check whether the notebook would pick up the latest changes. This step is placed before the 20-minute delay, and on our latest refresh the dataflow did pick up the most recent data.

tayloramy
Community Champion

Hi @Stefanve


@Gpop13 is on the right track. Are you using the SQL endpoint in the dataflow? 

The SQL endpoint can take a few minutes to sync, so you should force a metadata sync after loading data via the Copy job, and before reading data in the dataflow, to ensure that everything is up to date.


If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution. 

