DCELL
Resolver I

Dataflow in pipeline succeeds but does nothing

Hello,

 

I've noticed that when I manually trigger my dataflow, which fetches OData and appends it to a Fabric data lake, the dataflow succeeds and the append works.

But when I put that same dataflow in a pipeline and then trigger or schedule the pipeline, the dataflow reports success, yet it doesn't actually append any data. The dataflow activity is not disabled in the pipeline.

This is not due to the SQL endpoint lagging, because I checked the max date and row count with a notebook.
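A minimal sketch of that notebook check (the table name my_table and the date column LoadDate are placeholders, not names from this thread). Reading the Delta table through Spark bypasses the SQL analytics endpoint entirely, which is why this check rules out endpoint sync lag:

```python
# Minimal sketch of the notebook check described above.
# "my_table" and "LoadDate" are placeholder names, not from this thread.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in Fabric notebooks

df = spark.read.table("my_table")  # reads the Delta table directly, no SQL endpoint

stats = df.agg(
    F.count("*").alias("row_count"),      # total rows in the destination table
    F.max("LoadDate").alias("max_date"),  # should advance after every append
).collect()[0]

print(f"rows={stats['row_count']}, max date={stats['max_date']}")
```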

1 ACCEPTED SOLUTION
DCELL
Resolver I

I figured out what the issue was: it's the generally unclear behavior of Dataflows around closing versus publishing.

When you exit the Dataflow editor, you assume it has saved your latest changes, because when you go back you see the most recent state. In fact, what got published is the previous state, and you can't save a Dataflow without also running it.

This means that if you are doing a fetch-and-append on a weekly or monthly basis, which is what I am doing, you have to wait until the last day of your time window to publish your Dataflow, and only then can you enable the auto-run schedule.


5 REPLIES

v-bmanikante
Community Support

Hello @DCELL,

 

As per my understanding, you have a dataflow that successfully appends data to your Fabric data lake when you run it manually, but when you trigger the same dataflow through a pipeline (scheduled or manual), it reports success yet no data is actually appended. You have also already confirmed this isn't due to SQL endpoint delays by checking the max date and row counts.

 

There might be a few possible reasons causing the issue:

  • The dataflow might be running too early inside the pipeline, before the data it is supposed to fetch is ready. Try adding a Wait activity to delay the dataflow by 1–2 minutes, and make sure the dataflow activity only runs after the required upstream steps complete.
  • The dataflow might be overwriting or skipping the append because of how it is configured. Please double-check that the destination setting is "Append", not "Replace". Also try turning off the "Auto create table" option, which sometimes causes the dataflow to create a new table instead of appending to the existing one.
  • Sometimes, when dataflows are run through pipelines, they don't use the latest or correct credentials. Please try reconnecting or reauthenticating your data source, then clear the cached credentials and refresh.
  • If the incoming data doesn't match the existing table's schema, the append can fail silently. Make sure the columns and data types match exactly (a quick way to compare them is sketched after this list), and avoid special characters or spaces in table names.
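A minimal sketch of that schema comparison in a Fabric notebook; my_table is a placeholder destination and incoming_df is a stand-in sample of the data the dataflow would append (neither name comes from this thread):

```python
# Hedged sketch: compare the destination table's schema with incoming data.
# "my_table" is a placeholder; replace incoming_df with rows shaped like
# whatever your dataflow actually appends.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

dest_schema = {f.name: f.dataType for f in spark.read.table("my_table").schema}

incoming_df = spark.createDataFrame([("2025-07-01", 1)], ["LoadDate", "Value"])
incoming_schema = {f.name: f.dataType for f in incoming_df.schema}

missing = set(incoming_schema) - set(dest_schema)  # columns the table lacks
mismatched = {
    name: (incoming_schema[name], dest_schema[name])
    for name in incoming_schema.keys() & dest_schema.keys()
    if incoming_schema[name] != dest_schema[name]  # same name, different type
}

print("columns missing from destination:", missing or "none")
print("type mismatches (incoming vs destination):", mismatched or "none")
```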

 

If this post helps, please consider Accepting it as a solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!

Regards,

B Manikanteswara Reddy

Hi @DCELL,

 

We wanted to kindly follow up to check whether the solution provided resolved your issue. Let us know if you need any further assistance.

If our response addressed it, please mark it as "Accept as solution" and click "Yes" if you found it helpful.

Please don't forget to give a "Kudos" – I'd truly appreciate it!

 

Regards,

B Manikanteswara Reddy

Hi @DCELL,

 

As we haven't heard back from you, we wanted to kindly follow up to check whether the solution provided resolved your issue. Let us know if you need any further assistance.

If our response addressed it, please mark it as "Accept as solution" and click "Yes" if you found it helpful.

Please don't forget to give a "Kudos" – I'd truly appreciate it!

 

Regards,

B Manikanteswara Reddy

Hello, I've gone through the list, and the issue must have been something else.

- The data was already ready.

- The dataflow configuration is correct; running the dataflow outside the pipeline worked. Only when that exact same dataflow is inside a pipeline does it 'succeed' without any effect.

- Credentials: wouldn't the dataflow in the pipeline simply fail in that case, if the credentials weren't up to date?

- The schema is correct. Again, the dataflow works outside of a pipeline, but not inside one.
