TanachaiP
Frequent Visitor

Issue: Copy Activity Processes 0 Records from Warehouse Query on Lakehouse Data

I'm facing an issue with a Copy Activity. I'm using it to copy data from a Fabric Warehouse (by querying data from a Lakehouse) to an Azure SQL Database. Sometimes, the activity doesn't work correctly.

For example, it occasionally reads 0 records, but the actual query result should contain 36 records (and can sometimes be up to 10,000). This happens even though no other processes are modifying the source tables for the query. However, the issue is easily fixed by simply rerunning the pipeline.

 

My team and I are concerned because we don't know the exact reason for this recurring issue. The main problem is that the job doesn't appear as 'failed' and provides no error messages, even when it processes zero records. This requires us to manually check the results of the copy activity and then rerun the pipeline ourselves. This situation makes us hesitant to use this solution in other projects until we are confident that it can be used reliably.

Could you please help explain the potential causes of this issue?

 

By the way, we are aware that using a notebook could solve this issue. However, it would be far simpler and more agile if we could just use the standard Copy Activity.

[screenshots of the pipeline run attached]

 

 

1 ACCEPTED SOLUTION
v-mdharahman
Community Support

Hi @TanachaiP,

Thanks for reaching out to the Microsoft Fabric community forum.

Based on your screenshots and description, you're running a query from a Warehouse that reads data from a Lakehouse, and using that result in a Copy Activity to write to Azure SQL. The fact that the pipeline sometimes succeeds but copies 0 rows, yet works perfectly when rerun, points to a known issue with the SQL analytics endpoint in Fabric, as @smeetsh mentioned earlier.

Basically, there’s a delay between data being ingested into the Lakehouse and that data being available through the SQL endpoint (which the Warehouse uses). This delay is inconsistent: sometimes just a few seconds, but we've observed it stretch up to 10 minutes.

To work around this, add a Wait activity (e.g., 5–10 minutes) between the ingestion step and the Copy Activity. This gives the SQL endpoint enough time to reflect the latest Lakehouse state. If a fixed delay isn't acceptable or you want more control, you can query the Lakehouse directly from a notebook and use the DataFrame API to write to SQL. That approach bypasses the SQL endpoint entirely and gives you stronger consistency guarantees, though you did mention you'd prefer to avoid notebooks.
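If a fixed Wait feels wasteful, the same safeguard can run as a short notebook step that polls the endpoint until the expected rows are visible before the Copy Activity starts. A minimal sketch, assuming you supply a `fetch_count` callable (e.g. a `SELECT COUNT(*)` over the Warehouse query via a connector of your choice); the function name and timing values here are illustrative, not a Fabric API:

```python
import time

def wait_for_rows(fetch_count, expected_min, timeout_s=600, interval_s=30):
    """Poll a row-count callable until it reports at least `expected_min` rows.

    fetch_count: zero-argument callable returning the row count the SQL
                 endpoint currently sees for the source query.
    Returns the observed count once the threshold is met.
    Raises TimeoutError if the endpoint never catches up within timeout_s.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        count = fetch_count()
        if count >= expected_min:
            return count
        if time.monotonic() >= deadline:
            raise TimeoutError(
                f"SQL endpoint still reports {count} rows after {timeout_s}s"
            )
        time.sleep(interval_s)
```

If the endpoint never catches up, this raises an error that fails the notebook step, instead of letting the pipeline silently copy 0 rows.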

 

I'd also like to take a moment to thank @smeetsh for actively participating in the forum and for the solutions you've been sharing. Your contributions make a real difference.

 

If I've misunderstood your needs or you still have problems, please feel free to let us know.

Best Regards,
Hammad.


4 REPLIES
BhaveshPatel
Community Champion

For that, you should use a notebook and write the query yourself (using the Scala or PySpark DataFrame API):

# Create a DataFrame with the rows to insert into the Azure SQL table
df1 = spark.createDataFrame(
    [('WL2456', 'Luke1293', 'Skywalker', 'Null', 'Null', -3,
      'Unknown', 'Unknown', 'Null')],
    ['DeviceID', 'DeviceName', 'DeviceDesc', 'OperatorName', 'OperatorDesc',
     'FacilityID', 'FacilityName', 'FacilityDesc', 'DeviceSerialNo'])

# Append the rows to the Azure SQL table over JDBC
# (user, pswd, sqlserver, port and database are defined elsewhere)
df1.write \
    .option('user', user) \
    .option('password', pswd) \
    .jdbc('jdbc:sqlserver://' + sqlserver + ':' + port + ';database=' + database,
          'dbo.Devices', mode='append')

df1.show()
 
You can use this pattern to write back to the Azure SQL database; just change the table name. Cheers
Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.
v-mdharahman
Community Support

Hi @TanachaiP,

As we haven't heard back from you, I'm just following up on our previous message. I'd like to confirm whether you've resolved this issue or still need further help.

If yes, you're welcome to share your workaround and mark it as a solution so that other users can benefit as well. If you found a reply particularly helpful, you can also mark that as the solution. And if you're still looking for guidance, feel free to give us an update; we're here for you.

 

Best Regards,

Hammad.

smeetsh
Resolver II

Are you using the Lakehouse SQL endpoint to query the data?

 

If so, there is a known issue where the SQL endpoint lags in reflecting the Lakehouse data. We face the same problem and started building a Wait activity in between the ingest step into the Lakehouse and the Copy Activity that queries the SQL endpoint. The delay seems to vary around 5 minutes, but we have seen longer ones as well; our maximum wait is 10 minutes.
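To also address the original complaint that the run never shows as failed, you can make the pipeline fail loudly whenever nothing was copied. A sketch using an If Condition plus a Fail activity after the copy (activity names here are placeholders; `rowsCopied` is the field the Copy activity reports in its run output):

```
If Condition (runs on success of the Copy activity)
  Expression:  @equals(activity('Copy data').output.rowsCopied, 0)
  True branch: Fail activity
    Message:    Copy activity wrote 0 rows from the Warehouse query
    Error code: ZeroRowsCopied
```

With this in place, a stale endpoint shows up as a failed run you can alert on or retry, instead of one you have to catch by manually checking the results.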

Cheers

Hans.

 

(If my answer is correct, please mark it as a solution)
