
anon97242
Advocate II
DataPipelines using DataflowGen2 using old code after DataflowGen2 update

Good Day
I am using a Data Pipeline with a Dataflow Gen2 (CI/CD and parameters). When I update the dataflow and rerun the pipeline, the new code is not executed; instead, whatever code was present at pipeline build time runs.

As a workaround I have been rebuilding my pipeline every time I make a Dataflow change. Is this intended behaviour?

Thanks!

1 ACCEPTED SOLUTION
v-lgarikapat
Community Support

Hi @anon97242 ,

Thanks for reaching out to the Microsoft Fabric community forum.

Yes, to some extent this is currently expected, due to how pipeline activity references work: when you use a Dataflow Gen2 activity in a pipeline, it binds to the Dataflow definition at build/deploy time, not dynamically at runtime. This is especially relevant when you're using CI/CD with JSON templates and deploying pipeline definitions from source control.

Recommended practices:

1. Use parameters dynamically in the Dataflow
Ensure that the Dataflow Gen2 is parameterized and that you're passing values from the pipeline. This way, changes in the logic can be triggered by different parameter sets.

2. Re-publish/re-deploy pipelines after Dataflow changes
When you update a Dataflow:
- Re-open the pipeline in your workspace
- Make a minor change (such as adding whitespace) and re-save or re-publish
This forces the pipeline to re-bind to the latest Dataflow version.

3. Rebuild the pipeline as part of CI/CD
In your CI/CD process (for example, Azure DevOps), include a step to re-export and re-deploy pipelines whenever a Dataflow is updated. This ensures pipeline definitions stay aligned with the updated Dataflow logic.

4. Use separate deployment pipelines for Dataflows and Pipelines
If you're managing Dataflows and Pipelines separately in source control, you can:
- Track changes to Dataflows
- Automatically trigger a pipeline build/redeploy when a Dataflow changes
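The last practice (detect which Dataflows changed in a commit and redeploy only the pipelines bound to them) can be sketched as a small script run from a CI step. This is a minimal sketch in Python; the repository layout, the item names, and the dataflow-to-pipeline mapping are illustrative assumptions, not a Fabric convention:

```python
# Illustrative sketch: given the file paths changed in a commit, work out
# which pipelines should be re-exported and re-deployed so they re-bind
# to the latest Dataflow Gen2 definition.

# Hypothetical mapping of source-controlled Dataflow folders to the
# pipelines that reference them (maintain this alongside your repo).
PIPELINES_BY_DATAFLOW = {
    "dataflows/SalesIngest": ["pipelines/DailyLoad", "pipelines/BackfillLoad"],
    "dataflows/CustomerClean": ["pipelines/DailyLoad"],
}

def pipelines_to_redeploy(changed_paths):
    """Return the sorted set of pipelines affected by the changed paths."""
    affected = set()
    for path in changed_paths:
        for dataflow, pipelines in PIPELINES_BY_DATAFLOW.items():
            # A change anywhere under the Dataflow's folder counts.
            if path.startswith(dataflow + "/"):
                affected.update(pipelines)
    return sorted(affected)
```

A CI step would feed this the commit's diff (e.g. `git diff --name-only`) and then re-deploy only the returned pipelines, rather than rebuilding everything on every commit.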

Use public parameters in Dataflow Gen2 (Preview) - Microsoft Fabric | Microsoft Learn

Dataflow activity - Microsoft Fabric | Microsoft Learn

CI/CD for pipelines in Data Factory - Microsoft Fabric | Microsoft Learn

If this post helped resolve your issue, please consider giving it Kudos and marking it as the Accepted Solution. This not only acknowledges the support provided but also helps other community members find relevant solutions more easily.

We appreciate your engagement and thank you for being an active part of the community.

Best regards,
LakshmiNarayana


3 REPLIES 3

@v-lgarikapat 

Thank you for the confirmation, as well as the detailed recommended practice. Much appreciated!

Hi @anon97242 ,

If your issue has been resolved, please consider marking the most helpful reply as the accepted solution. This helps other community members who may encounter the same issue to find answers more efficiently.

If you're still facing challenges, feel free to let us know, and we'll be glad to assist you further.

Looking forward to your response.

Best regards,
LakshmiNarayana.
