Hi everyone,
I’m encountering an issue with a pipeline I’ve created in Microsoft Fabric. The pipeline runs successfully and includes the following steps:
1. A notebook that cleans the data and writes it to a lakehouse.
2. A dataflow that reads the cleaned data from the lakehouse and appends it to a warehouse table.
The problem is that although the pipeline reports success, the data is not appended to the warehouse table. However, if I run the dataflow individually afterward, the data is appended.
What could be causing this issue?
Thanks in advance for your help!
Hello @ronan_b6
The dataflow might execute before the upstream notebook finishes writing the cleaned data to the lakehouse, leaving it with an empty or outdated source. A few things to try:
1. In the dataflow’s warehouse destination settings, ensure Append is explicitly selected (not defaulting to Replace).
2. Avoid using “Auto create table,” which can reset destination configurations.
3. Rebuild the warehouse connection or clear cached credentials.
4. Add a 1–2 minute delay (e.g., a Wait activity) after the notebook to allow metadata synchronization; a notebook-side check is sketched after this list.
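If the root cause is timing, you can also make the notebook itself confirm its write before its activity finishes. A minimal PySpark sketch of that idea; `cleaned_df` and the table name are illustrative assumptions rather than details from this thread, and `spark` is the session Fabric notebooks provide:

```python
# Last cell of the cleaning notebook (PySpark). `cleaned_df` and the
# table name are placeholders, not values from this thread.
import time
from pyspark.sql.utils import AnalysisException

TABLE = "CleanedSales"  # hypothetical lakehouse table name

# Write the cleaned data as a managed Delta table in the lakehouse.
cleaned_df.write.mode("overwrite").format("delta").saveAsTable(TABLE)

# Confirm the table is queryable before the notebook activity reports
# success, so the downstream dataflow never starts against a missing
# or empty source.
for _ in range(10):                      # poll for up to ~5 minutes
    try:
        rows = spark.table(TABLE).count()
        if rows > 0:
            print(f"{TABLE} is ready with {rows} rows.")
            break
    except AnalysisException:            # table metadata not visible yet
        pass
    time.sleep(30)
else:
    raise RuntimeError(f"{TABLE} was not readable; failing the activity.")
```

Note this only confirms the Spark-side write; if the dataflow reads the lakehouse through its SQL analytics endpoint, the endpoint’s metadata sync can still lag slightly, which is why the short delay in suggestion 4 helps as well.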
Please see if some of these suggestions help.
Thank you @nilendraFabric
I added a one-minute Wait activity after the notebook, and the Dataflow now appends to the warehouse when the pipeline runs.
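For anyone whose sync delay varies, a fixed wait could in principle be replaced by a short polling step between the notebook and the dataflow. A rough sketch of that idea, assuming the `pyodbc` package is available in a notebook activity; the endpoint, database, table name, and credentials below are all placeholders, not values from this thread:

```python
# Hypothetical polling step, run as a notebook activity between the
# cleaning notebook and the dataflow. Every connection detail below
# is a placeholder.
import time
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<lakehouse-sql-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<LakehouseName>;"
    "Authentication=ActiveDirectoryServicePrincipal;"
    "UID=<client-id>;PWD=<client-secret>"
)
TABLE = "CleanedSales"      # hypothetical cleaned table
MIN_ROWS = 1                # whatever "ready" means for your data

deadline = time.time() + 10 * 60          # give the sync up to ten minutes
while time.time() < deadline:
    try:
        conn = pyodbc.connect(CONN_STR, timeout=30)
        try:
            rows = conn.execute(f"SELECT COUNT(*) FROM {TABLE}").fetchone()[0]
        finally:
            conn.close()
        if rows >= MIN_ROWS:
            print(f"{TABLE} is visible with {rows} rows; safe to run the dataflow.")
            break
    except pyodbc.Error:
        pass                               # endpoint or table not reachable yet
    time.sleep(30)
else:
    raise TimeoutError(f"{TABLE} never became visible on the SQL endpoint.")
```

The fixed Wait is simpler and clearly works here; polling only earns its keep if the sync delay is unpredictable.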
Hi @ronan_b6 ,
Thank you for reaching out to the Microsoft Fabric Community. Special thanks to @nilendraFabric for the valuable insights. Their solution effectively addresses the root cause of the issue: the dataflow executing before the notebook fully writes the cleaned data to the Lakehouse.
To apply the fix, add a Wait activity of about one to two minutes between the notebook and the dataflow activities in the pipeline, as confirmed above.
If you need any additional information, please let us know. If the issue is resolved, kindly mark the reply as the Accepted Solution. This helps others find it more easily.