I've got a Fabric pipeline with the following 3 steps:
1) Copy task: load data from bronze lakehouse files to bronze delta table
2) Stored procedure task: load data from bronze delta table (via sql view) using CTAS to a silver warehouse table
3) Notebook task: refresh the semantic model for the report
I've noticed (and verified repeatedly) that the copy task succeeds and moves on to step 2 while the bronze delta table is not yet populated when queried via the SQL endpoint. That obviously causes downstream issues, but I've been able to work around it with a wait task. Has anyone else experienced this? Is this a bug in the copy task, or is there a delay before the data becomes available through the SQL endpoint of the bronze lakehouse?
Hi @dzav ,
From what I’ve found, this issue is more likely a delay in data availability via the SQL endpoint than a bug in the copy task itself. You can refer to this document for further details: How to copy data using copy activity - Microsoft Fabric | Microsoft Learn
Using a wait task, as you’ve done, is a practical workaround. Another approach could be to implement a polling mechanism in your pipeline to check for data availability before proceeding to the next step. This can help ensure that the data is fully populated and ready for the subsequent tasks.
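In case it helps, here is a rough sketch of what that polling check could look like as a notebook step placed between the copy activity and the stored procedure. This is only an illustration under assumptions: the endpoint address, database, and table name are placeholders, pyodbc and the ODBC driver are available in the notebook environment, and a simple row-count threshold is enough to tell that the data has landed via the SQL endpoint.

```python
import time
import pyodbc

# Placeholder connection details - replace with your workspace's
# SQL analytics endpoint, lakehouse name, and table/view name.
SQL_ENDPOINT = "<your-sql-endpoint>.datawarehouse.fabric.microsoft.com"
DATABASE = "bronze_lakehouse"
TABLE = "dbo.bronze_table"

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={SQL_ENDPOINT};"
    f"Database={DATABASE};"
    "Encrypt=yes;"
    "Authentication=ActiveDirectoryServicePrincipal;"  # adjust auth to your setup
    "UID=<client-id>;PWD=<client-secret>;"
)

def wait_for_rows(min_rows=1, timeout_s=600, poll_s=30):
    """Poll the SQL analytics endpoint until TABLE reports at least min_rows rows."""
    deadline = time.time() + timeout_s
    conn = pyodbc.connect(conn_str)
    try:
        while time.time() < deadline:
            # Query through the SQL endpoint, since that is the layer that lags.
            count = conn.execute(f"SELECT COUNT(*) FROM {TABLE}").fetchval()
            if count >= min_rows:
                return count
            time.sleep(poll_s)
    finally:
        conn.close()
    raise TimeoutError(f"{TABLE} not visible via SQL endpoint after {timeout_s}s")

wait_for_rows()
```

The same idea can also be expressed directly in the pipeline with an Until activity wrapping a Lookup that runs the COUNT(*) query, if you prefer not to add a notebook.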
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
I'm seeing the same thing in an integration scenario.
Pipeline Steps:
1) Invoke Pipeline: multiple Copy Data activities, each loading data from Amazon RDS tables to lakehouse tables
2) Copy Data: query and transform the lakehouse table data via the data warehouse, with the same warehouse as the destination
3) Notebook: API POST to non-Fabric cloud platform
Step 2 is starting before all the data from Step 1 is available in the lakehouse tables. Step 1 has the "Wait on completion" setting checked on the Invoke Pipeline activity, and I have confirmed that Step 2's Start time is indeed after Step 1's completion time per the Monitoring Hub.
It appears that Copy tasks writing to Lakehouse tables get marked as Completed prior to the data being available. When Copy Destination = Lakehouse Table, there should be an activity setting that does the polling built-in, rather than requiring users to add complexity to pipelines with Until activities and custom polling.
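For what it's worth, a variant of the polling idea that fits this scenario is to compare the row count the SQL endpoint reports against the Delta table itself, so Step 2 only starts once the endpoint has caught up. The sketch below assumes a Fabric notebook with the lakehouse attached (so `spark` is available), a hypothetical table name, and a placeholder ODBC connection string like the one in the earlier sketch; a plain row-count comparison is a crude signal and would need adjusting for incremental loads.

```python
import time
import pyodbc

TABLE = "orders_from_rds"  # hypothetical lakehouse table name
ENDPOINT_CONN_STR = "<ODBC connection string for the lakehouse SQL analytics endpoint>"

def wait_until_endpoint_caught_up(timeout_s=900, poll_s=30):
    """Block until the SQL endpoint reports at least as many rows as the Delta table."""
    # The Delta table itself is the source of truth; `spark` is the session
    # Fabric notebooks provide when the lakehouse is attached.
    expected = spark.read.table(TABLE).count()
    deadline = time.time() + timeout_s
    conn = pyodbc.connect(ENDPOINT_CONN_STR)
    try:
        while time.time() < deadline:
            visible = conn.execute(f"SELECT COUNT(*) FROM {TABLE}").fetchval()
            if visible >= expected:
                return visible
            time.sleep(poll_s)
    finally:
        conn.close()
    raise TimeoutError(f"SQL endpoint still behind the Delta table for {TABLE}")

wait_until_endpoint_caught_up()
```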