
DanielAmbler
Helper II

Any alternative solutions for Deploying Data Pipeline Connections?

Currently looking at a POC for Fabric and having a few issues with deployment/CI-CD early on. I've probably spent a couple of hours this morning coming to the conclusion that Deployment Pipelines are next to useless for Data Pipelines under certain circumstances.

 

I have a Data Warehouse in DEV, TEST and PROD that is populated from a Data Lake via a pipeline. The Data Warehouse connection does not update between workspaces, and deployment rules are not yet supported for Data Pipelines.

 

When building Azure solutions, a common model would be a resource group each for DEV, TEST and PROD, with plenty of deployment options, typically via DevOps. It seems the equivalent functionality is not yet present in Fabric for deploying a Data Pipeline between DEV, TEST and PROD workspaces. My question is: has anybody come across an alternative solution for deploying a pipeline between workspaces (e.g. a parameterised notebook, or external tools) before I waste any more of my time? Deployment pipelines are not ready, connections cannot be parameterised like linked services, the REST APIs don't seem to have an 'update connection' function, and without deploying a full capacity I can't see how this could easily be done via Azure DevOps.

 

I've come across similar queries and the answer seems to be 'just wait', which is fine, if a little frustrating. At the moment I can't in all honesty say that Fabric is ready for us to use, but I'd like to at least feel like I've done a full investigation.

 

Thanks in advance.

1 ACCEPTED SOLUTION
DanielAmbler
Helper II

I found a workaround that allows me to proceed. It's not quite as elegant as having connections at the workspace level, but it has the desired end result.

 

On the destination connection, select 'Dynamic Input' as the connection type. This exposes the workspace ID and SQL connection string so that they can be parameterised. These values can then be stored as config in a control database (a schema within the Data Warehouse, or a file in the Lakehouse) and looked up in an earlier step of the pipeline. The end result is the same as in the Azure Resource Group based solution mentioned above.
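To make the pattern concrete, here's a minimal sketch of the control-table lookup described above. All names (the `pipeline_config` table, its columns, and the example values) are illustrative assumptions, not a real Fabric API; sqlite3 stands in for the Warehouse control schema, and the query is the kind of thing a Lookup step at the start of the pipeline would run.

```python
# Illustrative sketch of the control-table pattern (hypothetical names).
# In Fabric, a Lookup activity early in the pipeline would run a query
# like this against the control schema, and the returned values would
# feed the Dynamic Input workspace ID / SQL connection string.
import sqlite3

def get_connection_config(conn, environment):
    """Return (workspace_id, sql_connection_string) for an environment."""
    row = conn.execute(
        "SELECT workspace_id, sql_connection_string "
        "FROM pipeline_config WHERE environment = ?",
        (environment,),
    ).fetchone()
    if row is None:
        raise KeyError(f"No config for environment: {environment}")
    return row

# In-memory stand-in for the control database with one row per environment.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE pipeline_config ("
    "environment TEXT PRIMARY KEY, "
    "workspace_id TEXT, "
    "sql_connection_string TEXT)"
)
conn.executemany(
    "INSERT INTO pipeline_config VALUES (?, ?, ?)",
    [
        ("DEV",  "dev-workspace-guid",  "dev-wh.example.com"),
        ("TEST", "test-workspace-guid", "test-wh.example.com"),
        ("PROD", "prod-workspace-guid", "prod-wh.example.com"),
    ],
)

workspace_id, sql_cs = get_connection_config(conn, "TEST")
# workspace_id == "test-workspace-guid", sql_cs == "test-wh.example.com"
```

The pipeline itself stays identical across workspaces; only the row it reads changes per environment, which is what makes it deployable without deployment rules.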

 

DanielAmbler_0-1718030624935.png

 


2 REPLIES

Anonymous
Not applicable

Hi @DanielAmbler ,

Glad to know that you were able to get to a resolution. Please continue using Fabric Community on your further queries.
