rafal_c
Frequent Visitor

Querying a table in Lakehouse from Warehouse returns 0 rows

Hello,

We are loading data into a Lakehouse using the Copy activity in pipelines with the Overwrite option. Sometimes, when we query that table from the Warehouse right after the data copy finishes, it returns 0 rows. When we check the same table in the Lakehouse view, it does show the data. Is there any caching involved, or something like that? How can we prevent this situation from happening? We need to be able to quickly pull the incremental load from the data source into the Lakehouse and then use that data for some transformations in the Warehouse, with everything connected together in the pipeline.

Thanks

1 ACCEPTED SOLUTION
AndyDDC
Super User

Hi @rafal_c, you can use an API to trigger a refresh of the metadata sync between the Lakehouse and the SQL analytics endpoint. The how-to is in this video: Fix SQL Analytics Endpoint Sync Issues in Microsoft Fabric – Data Not Showing? Here's the Solution!
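Not part of the original answer, but to make it concrete, here is a minimal Python sketch of triggering such a refresh from a Fabric notebook. It assumes the preview refreshMetadata REST route and placeholder IDs/token; the exact route and payload may differ in your tenant, so confirm against the video or the current Fabric REST API docs.

```python
# Hedged sketch: trigger a metadata sync of the SQL analytics endpoint
# right after the Lakehouse load. WORKSPACE_ID, SQL_ENDPOINT_ID and TOKEN
# are placeholders; the refreshMetadata route is a preview API and may change.
import requests

WORKSPACE_ID = "<workspace-guid>"        # placeholder
SQL_ENDPOINT_ID = "<sql-endpoint-guid>"  # placeholder (the SQL endpoint item's ID)
TOKEN = "<bearer-token>"                 # e.g. notebookutils.credentials.getToken(...) in a Fabric notebook

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/sqlEndpoints/{SQL_ENDPOINT_ID}/refreshMetadata?preview=true"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"}, json={})
resp.raise_for_status()

# A 202 response means the sync was accepted and runs asynchronously;
# poll the operation URL from the Location header before the Warehouse step.
print(resp.status_code, resp.headers.get("Location"))
```

In a pipeline, the same call could be made from a Web activity placed between the Copy activity and the Warehouse transformation step.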

7 REPLIES

Thanks! I found exactly the same article yesterday. I'm going to test it right now. 

Hi @rafal_c, thank you for reaching out to the Microsoft Fabric Community Forum.


I checked the answer provided by @AndyDDC and agree with it. Could you please confirm whether you tried it out and whether it worked? If it didn't, please share the details so that we can work together to solve it.

If it did work, please consider using 'Accept as Solution' so others with similar queries can find it more easily.

Yes, I can confirm it works. My tasks have been executed multiple times over a few days with no issues so far. The data is available right after the sync job completes. Thank you.


nilendraFabric
Community Champion

@rafal_c There can be a brief delay after your pipeline finishes writing to the Lakehouse before the Warehouse recognizes the newly written files. This often happens when a new table version or fresh Delta files are created, yet the Warehouse query still sees a previous state of the metadata. There is also a caching mechanism in Fabric Warehouses that holds onto data and metadata for a short time.

Two possible solutions:


1) Insert a short wait or a separate validation activity right after your data copy completes and before your Warehouse query runs. This gives Fabric time to register the recent Lakehouse files so you're not querying stale metadata (a rough sketch of this polling approach follows the list below).

2) If feasible, drop and recreate the Warehouse table (pointing to the Lakehouse Delta location) or run a refresh step between loads. This forces the Warehouse to scan the new data files, bypassing any stale cached metadata.
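Not from the original reply, but here is a rough sketch of the polling variant of option 1: keep re-checking the row count through the SQL analytics endpoint until it matches what the Copy activity reported. The connection string, table name, and expected count are all placeholders; adjust for your own authentication and schema.

```python
# Hedged sketch: poll the SQL analytics endpoint until the freshly copied
# rows are visible, instead of relying on a fixed wait. CONN_STR,
# dbo.MyTable and EXPECTED_ROWS are placeholders for your environment.
import time
import pyodbc

EXPECTED_ROWS = 12345  # e.g. the rowsRead value from the Copy activity output
CONN_STR = "Driver={ODBC Driver 18 for SQL Server};Server=<your-endpoint>;..."  # placeholder

deadline = time.time() + 15 * 60  # give up after 15 minutes
while time.time() < deadline:
    conn = pyodbc.connect(CONN_STR)  # reconnect each poll to avoid a stale session
    try:
        count = conn.execute("SELECT COUNT(*) FROM dbo.MyTable").fetchone()[0]
    finally:
        conn.close()
    if count >= EXPECTED_ROWS:
        break  # the endpoint has caught up; safe to run the Warehouse step
    time.sleep(30)  # brief pause before re-checking
else:
    raise TimeoutError("SQL analytics endpoint did not catch up in time")
```

The same check can be expressed in a pipeline as an Until activity wrapping a Lookup that runs the COUNT query.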

If this helps, please accept the answer.

Thank you. I tried different things. I tried deleting the contents of the tables and then running VACUUM, plus an UNTIL loop to validate whether the row count in the Lakehouse equals rowsRead from the data copy task, but it takes a very long time before the SQL endpoint shows the right number of rows... It's not a matter of one minute. I even have one pipeline running now that has been waiting 15 minutes and is still showing 0 rows... The data copy took 7 minutes; waiting for the data to become available takes twice as long, though not always, as sometimes everything is ready after 1 minute. I think deleting rows + VACUUM is actually making it worse. I also tried dropping the tables and recreating them via a notebook; that was even worse than delete + VACUUM. Now I will try to trigger a refresh of the metadata after each load.
