ilseeb
Advocate I

DataFlow Gen2 runs successfully but data is missing in Data Warehouse

I currently have a pipeline in which I copy data from two different sources and create lakehouse tables. I then added a waiting period of 160 seconds to leave enough time for the lakehouse to refresh, and then run a Dataflow Gen2 in which I append both tables, do some transformations, and add the final table to the Data Warehouse (using Overwrite).

Recently, I have noticed that sometimes all data from one of the two sources, or some new rows, is missing from the final table in the data warehouse. Every time I have noticed this happening, the Dataflow ran successfully, and I also verified that the data is available in the lakehouse tables.

Is there something else I could monitor to make sure that my data warehouse table gets updated successfully?

Has anyone else noticed this? Could it be a bug?

3 REPLIES
FabianSchut
Solution Sage

Hi ilseeb, I have noticed that sometimes 160 seconds is not enough for the SQL analytics endpoint of a lakehouse to update. I found a blog post that contains a Python script to refresh the SQL endpoint programmatically and wait for the lakehouse to be refreshed before moving on. You can find the blog post here: https://www.obvience.com/blog/fix-sql-analytics-endpoint-sync-issues-in-microsoft-fabric-data-not-sh.... You can add this to a notebook and run it after copying the data and before appending and transforming.
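
For anyone who wants the general shape of such a script, below is a minimal sketch, not the blog's exact code. It assumes the semantic-link (sempy) package that ships with Fabric notebooks, placeholder workspace and SQL analytics endpoint item IDs that you would replace with your own, and the preview refreshMetadata REST call plus the standard Fabric long-running-operation polling pattern; the exact API path and status values may differ, so verify them against the blog post or the current Fabric REST API documentation before relying on this.

```python
# Minimal sketch (see assumptions above): trigger a metadata refresh of the Lakehouse
# SQL analytics endpoint from a Fabric notebook and wait for it to finish, instead of
# relying on a fixed 160-second delay. Not the linked blog post's exact script.
import time

import sempy.fabric as fabric  # semantic-link, preinstalled in Fabric notebooks

client = fabric.FabricRestClient()

# Placeholders: substitute your own workspace ID and SQL analytics endpoint item ID.
workspace_id = "<workspace-id>"
sql_endpoint_id = "<sql-analytics-endpoint-id>"

# Ask the SQL analytics endpoint to sync its metadata with the Lakehouse Delta tables.
# Assumption: preview refreshMetadata API; the path may change, verify before use.
response = client.post(
    f"/v1/workspaces/{workspace_id}/sqlEndpoints/{sql_endpoint_id}/refreshMetadata?preview=true",
    json={},
)

if response.status_code == 202:
    # Asynchronous case: poll the operation referenced by the Location header until it
    # finishes (standard Fabric long-running-operation pattern).
    location = response.headers["Location"]
    # Strip the host if the header contains an absolute URL, so the client can resolve it.
    operation_path = location.split("api.fabric.microsoft.com/", 1)[-1]
    while True:
        status = client.get(operation_path).json().get("status")
        if status == "Succeeded":
            break
        if status in ("Failed", "Cancelled"):
            raise RuntimeError(f"SQL analytics endpoint refresh ended with status: {status}")
        time.sleep(10)  # wait a bit before polling again
else:
    response.raise_for_status()

print("SQL analytics endpoint metadata refresh finished.")
```

In the pipeline, a notebook like this would sit between the copy activities and the Dataflow Gen2, replacing the fixed 160-second wait.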

Thanks a lot for this link! I'll try it out

Hi @ilseeb 

 

Thank you very much, FabianSchut, for your prompt reply.

 

Can you tell me if your problem is solved? If so, please accept the reply above as the solution.

 

Regards,

Nono Chen

If this post helps, please consider accepting it as the solution to help other members find it more quickly.
