Hi All
I have created 3 workspaces to act as our Development, Test, and Production environments. I have an issue with setting either the on-premises Copy data activity destination connection or the Notebook activity Workspace setting in the pipeline. Screenshot below.
I can only select the Connection or Workspace manually using the drop-down box. However, I need these activities to have an option to dynamically set the connection/workspace based on the workspace the pipeline is in. Otherwise I have to manually amend each activity in each pipeline to the correct workspace once it is deployed to that environment/workspace. Can anyone help with this? Is this amendment part of any future Fabric releases?
Thanks
Neil
@MisterSmith, I see that parameterization of pipeline connections is on the release plan, but in the interim you should be able to use the Use Dynamic Content option (assuming no limitations from your on-premises setup). Once you input a value, it will reveal a variety of other fields depending on your connection type.
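As an illustration (the parameter names here are hypothetical, not from this thread), the dynamic content boxes for a Notebook activity's workspace and notebook fields could simply reference pipeline parameters:

```
Workspace ID:  @pipeline().parameters.WorkspaceId
Notebook ID:   @pipeline().parameters.NotebookId
```

That way only the parameter values need to change per workspace, rather than re-pointing each activity by hand.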
In my scenario, I need to deploy these pipelines across multiple workspaces, so I created a separate "Environment Variables" pipeline that has a set of return values with the IDs needed for the current workspace. My triggering pipeline then invokes that "global" pipeline to get the IDs for my lakehouses, workspaces, etc., and can pass these values on to other child/downstream pipelines through parameters.
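A rough sketch of that pattern, assuming the child "Environment Variables" pipeline sets its IDs with Set Variable activities configured as pipeline return values, and that the parent's Invoke Pipeline activity exposes them on its output (activity and key names below are made up for illustration):

```
Child "Environment Variables" pipeline:
  Set Variable (Pipeline return value)  LakehouseId = <GUID of this workspace's lakehouse>
  Set Variable (Pipeline return value)  WorkspaceId = <GUID of this workspace>

Parent pipeline, after an Invoke Pipeline activity named "Get Environment Variables":
  @activity('Get Environment Variables').output.pipelineReturnValue.LakehouseId
  @activity('Get Environment Variables').output.pipelineReturnValue.WorkspaceId
```

Those expressions can feed the dynamic content fields above or be passed to downstream pipelines as parameters.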
Today you can use the dynamic connector to parameterize Lakehouse/Data Warehouse connections, and support for general connection parameterization is on the roadmap.
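For example (the field and parameter names depend on the destination type and are illustrative here), a Lakehouse destination in a copy activity could take the same kind of expressions once dynamic content is enabled:

```
Workspace ID:  @pipeline().parameters.WorkspaceId
Lakehouse ID:  @pipeline().parameters.LakehouseId
```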
Hi @MisterSmith
Can you tell me if your problem is solved? If yes, please accept it as the solution.
Regards,
Nono Chen
I tried this approach but I am getting the below error. What is the solution for this issue?
I wanted to know how we can provide the source connection dynamically, rather than manually selecting it from the dropdown, for copy activities and lookups, as we have to deploy the pipeline to other workspaces that use different connections.
Brilliant, thanks. I will try this and let you know 😀