Currently looking at a POC for Fabric and hitting a few issues with deployment/CI/CD early on. I've probably spent a couple of hours this morning coming to the conclusion that Deployment Pipelines are next to useless for Data Pipelines under certain circumstances.
I have a Data Warehouse in DEV, TEST and PROD that is populated from a Data Lake by a pipeline. The connection for the Data Warehouse does not update between workspaces, and deployment rules are not yet supported.
When building Azure solutions, a common model is to have a resource group each for DEV, TEST and PROD, with plenty of deployment options, typically via Azure DevOps. The equivalent functionality to deploy a Data Pipeline between DEV, TEST and PROD workspaces does not yet seem to be present in Fabric. My question is: has anybody come across an alternative solution for deploying a pipeline between workspaces before I waste any more of my time? (e.g. a parameterised notebook, or external tools). Deployment pipelines are not ready, connections cannot be parameterised in the way linked services can, the REST APIs don't seem to have an 'update connection' function, and without deploying a full capacity I can't see how this could easily be done via Azure DevOps.
I've come across similar queries, and the answer seems to be 'just wait', which is fine, if a little frustrating. At the moment I can't honestly say that Fabric is ready for us to use - I'd like to at least feel that I've done a full investigation.
Thanks in advance.
I found a workaround that allows me to proceed - not quite as elegant as having connections at the workspace level, but it has the desired end result.
On the destination connection, select 'Dynamic Input' as the connection type - this exposes the workspace ID and SQL connection string, which can then be parameterised. These values can be stored as config in a control database (a schema within the Data Warehouse, or a file in the Lakehouse) and read in an earlier step of the pipeline. The end result is the same as in the Azure resource group based solution mentioned above.
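For anyone attempting the same thing, here's a minimal sketch of the control-table side, assuming the config lives in a Data Warehouse schema. All of the names here (config.EnvironmentSettings, the column names, the LookupConfig activity) are hypothetical placeholders, not anything Fabric prescribes:

```sql
-- Hypothetical control table in a config schema of the Data Warehouse.
-- One row per environment, holding the values the pipeline needs to
-- build the destination connection dynamically at run time.
CREATE TABLE config.EnvironmentSettings (
    EnvironmentName     VARCHAR(10)  NOT NULL,  -- 'DEV', 'TEST' or 'PROD'
    WorkspaceId         VARCHAR(64)  NOT NULL,  -- target workspace GUID
    SqlConnectionString VARCHAR(256) NOT NULL   -- warehouse SQL endpoint
);

INSERT INTO config.EnvironmentSettings VALUES
    ('DEV',  '<dev-workspace-guid>',  '<dev-sql-endpoint>'),
    ('TEST', '<test-workspace-guid>', '<test-sql-endpoint>'),
    ('PROD', '<prod-workspace-guid>', '<prod-sql-endpoint>');
```

A Lookup activity at the start of the pipeline then reads the row for the current environment, and the dynamic fields on the destination reference its output with pipeline expressions along the lines of @activity('LookupConfig').output.firstRow.WorkspaceId and @activity('LookupConfig').output.firstRow.SqlConnectionString.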
Hi @DanielAmbler ,
Glad to know that you were able to reach a resolution. Please continue to use the Fabric Community for any further queries.