
ronan_b6
Advocate II

Issue with Data Not Appending to Warehouse Table in Pipeline

 Hi everyone,

I’m encountering an issue with a pipeline I’ve created in Microsoft Fabric. The pipeline runs successfully and includes the following steps:

  1. Extract data from the source and load it into a table in the lakehouse.
  2. Use a notebook to clean and transform the data, then load the resulting data into another table in the lakehouse (a rough sketch of this step follows the list).
  3. Use a dataflow to append the cleaned data to a warehouse table.
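
For context, the notebook step's write is essentially the following. This is only a sketch: the table names and the cleaning rule are placeholders, not my actual ones, and spark is the built-in session a Fabric notebook provides.

    # Rough sketch of step 2 (placeholder names, not the real tables):
    # read the raw table from step 1, apply the cleaning, and write the
    # result back to the lakehouse as a Delta table for step 3 to read.
    from pyspark.sql import functions as F

    raw_df = spark.read.table("raw_sales")  # output of step 1

    cleaned_df = (
        raw_df
        .dropDuplicates()
        .filter(F.col("amount").isNotNull())  # placeholder cleaning rule
    )

    cleaned_df.write.format("delta").mode("overwrite").saveAsTable("cleaned_sales")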

The problem is that while the pipeline runs successfully, the data does not appear to be appended to the warehouse table. However, if I run the dataflow individually afterward, the data does get appended.

What could be causing this issue?

Thanks in advance for your help!

1 ACCEPTED SOLUTION
nilendraFabric
Super User

Hello @ronan_b6 

 

The dataflow might execute before the upstream notebook completes writing the cleaned data to the lakehouse, resulting in an empty or outdated source. A few things to try:

  1. In the dataflow’s warehouse destination settings, ensure Append is explicitly selected (not defaulting to Replace).
  2. Avoid using “Auto create table,” which can reset configurations.
  3. Rebuild the warehouse connection or clear cached credentials.
  4. Add a 1–2 minute delay after the notebook to allow metadata synchronization.
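
If you'd rather guard inside the notebook than add a Wait activity, a minimal sketch like this at the end of the notebook serves the same purpose (the table name is a placeholder, and spark is the notebook's built-in session):

    # End-of-notebook guard (placeholder table name): reading the table
    # back confirms the Delta commit is visible, and the sleep gives the
    # SQL endpoint time to pick up new metadata before the dataflow runs.
    import time

    rows = spark.read.table("cleaned_sales").count()
    print(f"cleaned_sales committed with {rows} rows")

    time.sleep(90)  # 1-2 minute buffer, mirroring suggestion 4 above

A Wait activity between the notebook and dataflow activities achieves the same thing without touching the notebook.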

Please see if some of these suggestions are helpful.


REPLIES
ronan_b6
Advocate II

Thank you @nilendraFabric 

I added a one-minute Wait activity after the notebook, and now the Dataflow appends to the warehouse when the pipeline runs.

V-yubandi-msft
Community Support

Hi @ronan_b6 ,

Thank you for reaching out to the Microsoft Fabric Community. Special thanks to @nilendraFabric for the valuable insights. Their solution effectively addresses the root cause of the issue: the dataflow executing before the notebook fully writes the cleaned data to the Lakehouse.

To apply the fix, follow these steps.

  1. Set the dataflow’s warehouse destination to Append (avoid Replace).
  2. Disable Auto create table to prevent unexpected resets.
  3. Rebuild the warehouse connection and clear cached credentials to ensure proper metadata synchronization.
  4. Introduce a 1–2 minute delay after the notebook execution to allow the data to be fully written before the dataflow runs.

If you need any additional information, please let us know. If the issue is resolved, kindly mark it as Accepted. This helps others find the solution more easily.

