
RSebastiani74
Frequent Visitor

Dataflow Gen2 accepting parameter values from a Notebook - HELP

Hello,

 

With the implementation of Dataflow Gen2 (CI/CD), I need to pass SQL Server hostnames to a set of dataflows that extract data from different tables on the same SQL Servers, which are located in different AWS environments.  I was able to create a notebook with the hostname variables and bring it into a data pipeline, where I set a variable and then call a Dataflow execution step whose parameter uses the variable value.

 

Once in the dataflow, when I try to create a new parameter under Public parameters: one, I don't see the parameter I created in the pipeline's Dataflow activity settings; and two, I can't create a new parameter that uses the variable value I set in the data pipeline.  Is this even possible?

 

The goal of this exercise is to provide a set of variables holding the SQL hostnames for all the different AWS environments, so that every dataflow I have (10 of them, each pulling different tables from the same environments) can use the variable values instead of hardcoding the hostnames.  If I have to update the hostnames, I can do it once in the notebook rather than in every dataflow.
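
For reference, the notebook is essentially just a lookup along these lines (the hostnames below are placeholders, not the real ones):

```python
# Fabric notebook: map each AWS environment to its SQL Server hostname.
# Update the hostnames here once instead of in every dataflow.
hostnames = {
    "dev":  "sqldev.example.aws.internal",
    "test": "sqltest.example.aws.internal",
    "prod": "sqlprod.example.aws.internal",
}
```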

 

Any help is much appreciated.  Thank you.

 

[screenshot: RSebastiani74_0-1746382174498.png]

1 ACCEPTED SOLUTION
v-karpurapud
Community Support

Hi @RSebastiani74 

 

Thank you for reaching out to the Microsoft Fabric Community Forum.

 

In Dataflow Gen2, public parameters must be explicitly defined within the dataflow before they can be overridden via a pipeline or REST API. If you're unable to see the parameter in the pipeline’s Dataflow activity settings, it likely means either the parameter hasn't been marked as public or its usage is not supported for override.

 

To make parameters available for pipeline integration, open your Dataflow Gen2 in Power Query, create the required parameters under "Manage Parameters" (e.g., SQLHostName), use them in your queries such as in the SQL Server connection string, and mark them as public so they can be overridden externally.

 

To pass parameters via a pipeline, add a Dataflow activity, select the appropriate Dataflow Gen2, and under the Parameters section, bind the listed public parameters to pipeline variables or notebook outputs.
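
For example, if the hostname comes from a notebook, the notebook can return it as its exit value and the pipeline can bind the dataflow's public parameter to that output. A minimal sketch (the activity and parameter names are illustrative):

```python
# Final cell of the notebook that resolves the hostname for the target
# environment: return it to the pipeline as the notebook's exit value.
import notebookutils  # utility module available in Fabric notebooks

sql_hostname = "sqldev.example.aws.internal"  # placeholder; resolved per environment

# On the Dataflow activity, bind the public parameter (e.g. SQLHostName) to the
# notebook output with a pipeline expression along the lines of:
#   @activity('Get hostname').output.result.exitValue
# ('Get hostname' is an assumed notebook-activity name.)
notebookutils.notebook.exit(sql_hostname)
```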

 

Currently, Dataflow Gen2 does not support overriding the resource path of a data source (e.g., dynamically changing the SQL Server hostname). Even if you pass a different value, the connection string remains fixed to what was authored in the dataflow. This is a known limitation in the public preview of parameter support.

[screenshot: vkarpurapud_0-1746438328362.png]

 

To work around this limitation, create a generic dataflow template with a placeholder hostname and use notebooks or pipelines to copy and deploy the dataflow with the correct hostname injected into the query or connection string. Alternatively, perform the data extraction directly in notebooks and write the results to a Lakehouse.
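
For the notebook route, a minimal sketch of the extraction (hostname, database, table, and credentials are illustrative; in practice the hostname would come from a pipeline parameter and the password from a secret store):

```python
# Fabric notebook sketch: extract one SQL Server table over JDBC and land it
# in the Lakehouse. `spark` is the session Fabric provides in notebooks.
hostname = "sqldev.example.aws.internal"  # placeholder; pass in per environment
database = "SalesDb"                      # assumed database name
table    = "dbo.Orders"                   # assumed source table

jdbc_url = f"jdbc:sqlserver://{hostname}:1433;databaseName={database};encrypt=true"

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", table)
    .option("user", "svc_extract")                      # assumed SQL login
    .option("password", "<fetch from a secret store>")  # never hard-code secrets
    .load()
)

# Overwrite keeps repeated runs idempotent; append or merge are alternatives.
df.write.mode("overwrite").saveAsTable("orders_raw")
```

Since each of the ten dataflows pulls different tables from the same servers, the same pattern can loop over a table list, writing one Lakehouse table per source table.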

 For more details, refer to the official documentation on Dataflow Gen2 parameters.

 

If this response resolves your query, kindly mark it as the Accepted Solution to help other community members. A Kudos is also appreciated if you found the response helpful.

 

Thank you!

 


4 REPLIES
v-karpurapud
Community Support

Hi @RSebastiani74 

I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.

Thank you.

v-karpurapud
Community Support

Hi @RSebastiani74 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.

Thank you.

 

v-karpurapud
Community Support

Hi @RSebastiani74 

We have not received a response from you regarding the query and were following up to check if you have found a resolution from the information provided. If you find the response helpful, please mark it as the accepted solution and provide kudos, as this will help other members with similar queries.

Thank You!
