djbc1986
Frequent Visitor

How to make a Dataflow pipeline generic in Microsoft Fabric

Hi everyone,

I have a question regarding making a pipeline more generic.

Currently, I have a pipeline that executes a specific Dataflow Gen2 (CI/CD) as part of a larger process. The settings for the Dataflow activity are currently hardcoded, and everything works as expected.

[Screenshot: hardcoded Dataflow activity settings in the pipeline]

When the pipeline runs successfully, we can see the corresponding workspaceId and dataflowId in the input section of the Dataflow activity (these values are just examples):
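In plain text, the relevant part of that input looks something like this (a simplified sketch with example values only; the real JSON contains more properties and the exact names and nesting may differ):

    {
        "workspaceId": "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb",
        "dataflowId": "cccccccc-4444-5555-6666-dddddddddddd",
        "dataflowtype": "..."
    }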

[Screenshot: Dataflow activity input from a successful run, showing workspaceId and dataflowId]

Our goal is to make the pipeline generic. To achieve this, the team has created a process that provides both the workspaceId and dataflowId as parameters. However, the execution fails every time.
Even when we manually trigger the pipeline using the same parameter values from a successful run, we receive the following error:

 

[Screenshots: failed Dataflow activity run and the error details]

Unfortunately, the error code link does not provide useful information.

We also noticed that when using parameters, the dataflowtype property is missing from the Dataflow activity input. We're not sure why this happens:
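For comparison with the hardcoded run, the parameterized input comes out roughly like this (again only a sketch): the IDs resolve to the same example values, but there is no dataflowtype entry at all:

    {
        "workspaceId": "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb",
        "dataflowId": "cccccccc-4444-5555-6666-dddddddddddd"
    }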

[Screenshot: Dataflow activity input when using parameters, with the dataflowtype property missing]

Is it actually possible to achieve something like this?
We assume we're missing something, but we haven't been able to identify what or where the issue is.

Any help or guidance would be greatly appreciated!

Thanks in advance.

 

 

1 ACCEPTED SOLUTION
v-veshwara-msft
Community Support

Hi @djbc1986 ,
Thanks for posting in Microsoft Fabric Community.

Parameterizing the DataflowId in the Dataflow activity settings currently supports only the legacy Dataflow Gen2 version without CI/CD support; that is, you cannot invoke a CI/CD-enabled Dataflow Gen2 by parameterizing the DataflowId. This is a temporary limitation until all Dataflows are converted to the new CI/CD-enabled version.

Reference: Parameters 

I can confirm that running a Dataflow activity within a pipeline using parameterization works as expected for Dataflows without CI/CD enabled.
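For reference, the test was set up roughly as follows (an illustrative sketch; the parameter names are just examples and the settings fields may be labelled slightly differently in the activity UI):

    Pipeline parameters:           workspaceId (String), dataflowId (String)
    Workspace (dynamic content):   @pipeline().parameters.workspaceId
    Dataflow (dynamic content):    @pipeline().parameters.dataflowId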

[Screenshots: parameterized Dataflow activity settings and a successful run with a non-CI/CD Dataflow Gen2]

Because this is a temporary limitation for CI/CD-enabled Dataflows, Microsoft is actively working on enabling support for parameterization of the DataflowId in pipeline activities. This enhancement will be available once all Dataflows are converted to the new CI/CD-enabled version.

 

Hope this helps. Please reach out if you need further assistance.
If this post helps, please consider giving a kudos and accepting it as the solution to help other members find it more quickly.


Thank you.

 

 

 


7 REPLIES
Stinkys
Advocate II

@djbc1986 did you end up getting this resolved?

v-veshwara-msft
Community Support

Hi @djbc1986 ,

May I ask if the solution provided has addressed your needs? If so, please consider marking it as Accepted Solution to help others with similar queries.

If you need any further assistance, feel free to reach out.

Thank you.

v-veshwara-msft
Community Support

Hi @djbc1986 ,

Following up to see if your query has been resolved. If any of the responses helped, please consider marking the relevant reply as the 'Accepted Solution' to assist others with similar questions.

If you're still facing issues, feel free to reach out.

Thank you.

v-veshwara-msft
Community Support


Hi @djbc1986 ,

Just checking in to see if your query is resolved and if any responses were helpful. If so, kindly consider marking the helpful reply as 'Accepted Solution' to help others with similar queries.

Otherwise, feel free to reach out for further assistance.

Thank you.

burakkaragoz
Community Champion

Hi @djbc1986 ,

 

Thanks for the detailed explanation and the screenshots — that really helped clarify the issue.

From what you described, it looks like the pipeline fails when you switch from hardcoded values to parameterized ones, and the key difference is how the dataflow property is being passed.

When using hardcoded values, the dataflow input is a full resource path, like:

"dataflow": "/subscriptions/xxxx/resourceGroups/xxxx/providers/Microsoft.PowerPlatform/dataflows/xxxx"

But when using parameters, you're only passing the GUID:

"dataflow": "4d2a7c3b-8a31-4b79-9e4e-98e94f9a7ae2"

This is likely the root cause. The Dataflow activity expects the full resource path, not just the ID. So even though the GUID is correct, the activity doesn't know how to resolve it without the full context.

What you can try:

Update your parameterized pipeline to construct the full resource path dynamically using an expression like this:

@concat('/subscriptions/<your-subscription-id>/resourceGroups/<your-rg>/providers/Microsoft.PowerPlatform/dataflows/', pipeline().parameters.dataflowId)

Do the same for the workspace if needed.

Make sure the final input looks like the hardcoded version, just built dynamically.

Also double-check that the Dataflow activity is actually using the parameter values in the correct fields; it's easy to pass them into the pipeline but forget to bind them properly in the activity's dynamic content (see the sketch below).
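Putting the suggestion together, the bindings could look roughly like this (a sketch only, not a verified configuration; substitute your own subscription and resource group values, and adjust the path format if your activity input shows a different one):

    Pipeline parameters:
        workspaceId (String)
        dataflowId  (String)

    Workspace (dynamic content):
        @pipeline().parameters.workspaceId

    Dataflow (dynamic content):
        @concat('/subscriptions/<your-subscription-id>/resourceGroups/<your-rg>/providers/Microsoft.PowerPlatform/dataflows/',
                pipeline().parameters.dataflowId)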

Let me know if you want help building that expression or testing it out.


If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.

Hi @burakkaragoz 

 

Thank you very much for your reply, and I apologize for the delay in providing feedback. We attempted to implement the solution, but we were advised to stop using this feature due to the information in the following article:

https://learn.microsoft.com/en-us/fabric/data-factory/dataflow-parameters#considerations-and-limitat... 

 

Additionally, we considered the response provided in this discussion:
https://community.fabric.microsoft.com/t5/Dataflow/Re-The-dataflow-has-never-been-published-error-wh... 

[Screenshot: the relevant reply from the linked discussion]

 
