DanielAmbler
Helper II

Any alternative solutions for Deploying Data Pipeline Connections?

Currently looking at a POC for Fabric and having a few issues with deployment/CI/CD early on. I've spent a couple of hours this morning coming to the conclusion that Deployment Pipelines are next to useless for Data Pipelines under certain circumstances.

 

I have a Data Warehouse in DEV, TEST and PROD that a pipeline populates from a Data Lake. The connection for the Data Warehouse does not update between workspaces, and deployment rules are not yet supported.

 

When building Azure solutions, a common model is to have one resource group each for DEV, TEST and PROD, with plenty of deployment options, typically via DevOps. That functionality does not yet seem to be present in Fabric for deploying a Data Pipeline between DEV, TEST and PROD workspaces. My question is: has anybody come across an alternative solution for deploying a pipeline between workspaces (e.g. a parameterised notebook, external tools) before I waste any more of my time? Deployment pipelines are not ready, connections cannot be parameterised the way linked services can, the REST APIs don't seem to have an 'update connection' function, and without deploying a full capacity I can't see how this could easily be done via Azure DevOps.

 

I've come across similar queries and the answer seems to be 'just wait', which is fine, if a little frustrating. At the moment I can't honestly say that Fabric is ready for us to use, but I'd like to at least feel I've done a full investigation.

 

Thanks in advance.

1 ACCEPTED SOLUTION
DanielAmbler
Helper II
Helper II

I found a workaround that lets me proceed. It is not as elegant as having connections at the workspace level, but it achieves the desired end result.

 

On the destination connection, select 'Dynamic Input' as the connection type; this exposes the workspace ID and SQL connection string so that they can be parameterised. These values can then be stored as config in a control database (a schema within the Data Warehouse, or a file in the Lakehouse) and looked up in an earlier step of the pipeline. The end result is the same as in the Azure resource-group-based solution mentioned above.
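As a rough sketch of the control-table approach described above (all table, column, and connection names here are illustrative, not from Fabric documentation), each environment maps to its own warehouse coordinates, and a first pipeline step resolves them before the copy runs:

```python
# Hypothetical sketch of the control-table lookup. In the actual pipeline
# this would be a Lookup activity against a config table in the control
# database; a plain dict stands in for that table here.

CONTROL_TABLE = {
    # environment -> values exposed via the 'Dynamic Input' connection type
    "DEV":  {"workspace_id": "wsid-dev",  "sql_connection_string": "dev-warehouse.example"},
    "TEST": {"workspace_id": "wsid-test", "sql_connection_string": "test-warehouse.example"},
    "PROD": {"workspace_id": "wsid-prod", "sql_connection_string": "prod-warehouse.example"},
}

def resolve_connection(environment: str) -> dict:
    """Return the dynamic-input parameters for the given environment."""
    try:
        return CONTROL_TABLE[environment]
    except KeyError:
        raise ValueError(f"No connection config for environment {environment!r}")

# A later activity would receive these as its Dynamic Input parameters.
params = resolve_connection("TEST")
```

In the real pipeline, the lookup step's output would feed the Dynamic Input fields through pipeline expressions rather than Python, but the shape of the indirection is the same: nothing environment-specific is hard-coded in the pipeline itself.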

 

(Screenshot attached: DanielAmbler_0-1718030624935.png)

 


2 REPLIES

Anonymous
Not applicable

Hi @DanielAmbler ,

Glad to know that you were able to get to a resolution. Please continue using the Fabric Community for your further queries.
