Hi fellow Fabricators,
The set-up:
Currently you can't use a deployment pipeline to deploy data pipelines. We have three workspaces, basically dev, test, and prod - we set it up this way to make use of the deployment pipeline, but we are encountering some issues.
I have a data pipeline that executes a Notebook and then executes some stored procs to process data from the Lakehouse to the Warehouse. Each workspace has its own Lakehouse/Warehouse with the same name.
The challenge:
I don't want to have to re-create the pipeline multiple times, so I thought of creating it once and using variables/scope variables instead.
For a Notebook it's easy enough to make use of a variable - I know I could use @pipeline().DataFactory for the Workspace object ID instead, and this seems to work fine.
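To be concrete, in the Notebook activity's Workspace setting I switch to "Add dynamic content" and use the built-in expression (in Fabric pipelines this returns the object ID of the workspace the pipeline is running in):

    @pipeline().DataFactory

so the activity always targets whichever workspace holds the pipeline.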
For a stored proc however I would need the Warehouse ID and I am not sure how to go about getting that.
Has someone found a solution to this? Or is there a better way to apply changes across workspaces?
Thanks!
Regards,
Matt
Hi @Matthew_G
Thanks for using Fabric Community.
You need to create a new parameter for the stored procedure activity, then select the "Add dynamic content" option within the drop-down and pass in the GUID.
The parameter value should be the Lakehouse/Data Warehouse/KQL Database object ID. To get it, open your Lakehouse/Data Warehouse/KQL Database in your workspace; the ID is the GUID after /lakehouses/, /datawarehouses/, or /databases/ in your URL.
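For example, a Data Warehouse URL has roughly this shape (placeholders shown where the GUIDs appear - your host/path may differ slightly):

    https://app.fabric.microsoft.com/groups/<workspace-id>/datawarehouses/<warehouse-id>

The GUID in the <warehouse-id> position is the object ID to pass into the parameter.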
Please refer to this link for more information: Link1
Hope this helps. Please let me know if you have any further questions.
Thanks - it looks a bit different for the Stored Procedure activity.
When you add dynamic content to the Warehouse field, it also requests a SQL connection string.
I then added two parameters, the connection string and the warehouse GUID, and set each as dynamic content.
This now works.
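In case it helps anyone else, the end state is two pipeline parameters - I'll call them WarehouseId and WarehouseConnectionString here (the names are just illustrative, use whatever you like) - referenced as dynamic content in the Stored Procedure activity's Warehouse and connection string fields:

    @pipeline().parameters.WarehouseId
    @pipeline().parameters.WarehouseConnectionString

Each workspace's copy of the pipeline then only differs in its parameter default values.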
@Matthew_G, @v-nikhilan-msft,
I have added the same parameters (Warehouse ID and SQL connection string) and use them in the Stored Procedure activity. This works well in the current Dev workspace. But when I use the release pipeline to deploy it to the next-stage workspace (UAT), the parameters still point to the Dev workspace warehouse. How can we change these parameters during deployment?
Thanks
jerry
Hi Jerry, yeah, you will need to change the parameters in each pipeline, though each time you deploy they will be overwritten. Unfortunately, even in the deployment pipeline you can't set the parameters. What you could do is have two pipelines in each workspace, where pipeline 1 kicks off pipeline 2 and passes the parameters; a rough sketch is below. That way, when you deploy changes you can just deploy pipeline 2 and it shouldn't break anything. I'm sure there is a better solution out there.
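Something like this for pipeline 1's invoke step (just a sketch - the property names here are simplified and may not match the exact Fabric JSON schema):

    {
      "name": "Invoke worker pipeline",
      "type": "InvokePipeline",
      "typeProperties": {
        "pipelineId": "<pipeline-2-object-id>",
        "parameters": {
          "WarehouseId": "<warehouse-guid-for-this-workspace>",
          "WarehouseConnectionString": "<connection-string-for-this-workspace>"
        }
      }
    }

Pipeline 2 declares WarehouseId and WarehouseConnectionString as parameters and references them as @pipeline().parameters.WarehouseId etc., so only pipeline 1 ever needs workspace-specific edits.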
I would like to do something similar for referencing my local lakehouse. I want to have lakehouses of the same name in each of my testing, staging, and prod workspaces. How can I reference the lakehouse named "My_Lakehouse" in the local workspace? Something like My_Lakehouse.@pipeline().DataFactory?
Found an example for Lakehouse here: Parameters - Microsoft Fabric | Microsoft Learn
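For anyone else landing here: the pattern in that doc boils down to feeding the activity's Lakehouse reference an object ID and workspace ID via expressions rather than a name. Roughly (a sketch from memory - check the linked doc for the exact fields):

    "linkedService": {
      "properties": {
        "type": "Lakehouse",
        "typeProperties": {
          "artifactId": {
            "value": "@pipeline().parameters.LakehouseId",
            "type": "Expression"
          },
          "workspaceId": {
            "value": "@pipeline().DataFactory",
            "type": "Expression"
          }
        }
      }
    }

So it's still the lakehouse GUID being passed around, not the name "My_Lakehouse" itself.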
Hi, have you found a way to handle upgrades to the pipeline(s)? Although you can copy across workspaces, you can't overwrite an existing pipeline, so between this and not being able to 'release' Dataflows we're currently sticking with ADF/Synapse for all bar the simplest requirements (we have some customer integrations with >50 complex pipelines).
Unfortunately not really. For the moment I make the changes to the pipeline in dev, then in the test (etc.) environments I rename/delete the previous version, save the new pipeline, and update its variables. So there is still quite a bit of admin needed. Ideally you'd be able to deploy data pipelines, or at least create workspace-level environment variables and use them in the pipelines.
I think it's going to stay caught in the trap of referencing by ID/GUID rather than by name, so there'll always be an issue when moving between workspaces. They may need to take a leaf out of the Power Apps book and have some kind of 'solution' package, so that related items are managed together to maintain referential 'integrity'.