Hi everyone,
I'm currently working with deployment pipelines in Microsoft Fabric, and I'm encountering an issue with dynamically setting blob container names for different environments. I have three stages in my pipeline: Development, Stage, and Production.
Each stage is connected to the same blob storage account:
But I have three different containers:
Here is a screenshot of my deployment pipeline setup:
Has anyone successfully managed to configure such a setup? Any guidance or best practices would be greatly appreciated!
Thank you!
Data Factory pipelines do not support taking parameters from deployment pipelines.
A suitable workaround, if the container name can be parameterized in the Copy activity, is to store a metadata file with the same file name in each workspace.
The Copy activity then reads the container name from the same column of that Excel metadata sheet, and you can set the value in each workspace separately:
In Dev, metadata.xlsx has the value dev in the [container] column.
The same applies for Staging and for Prod.
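A minimal sketch of that metadata lookup from a Fabric notebook, assuming the file sits in each workspace's default lakehouse under Files/ and that pandas with openpyxl is available; the file and column names come from the approach above, everything else is a placeholder:

import pandas as pd

# Each workspace carries its own copy of metadata.xlsx at the same
# relative path, differing only in the value of the [container] column.
METADATA_PATH = "/lakehouse/default/Files/metadata.xlsx"  # assumed location

def resolve_container() -> str:
    """Return the blob container name configured for this workspace."""
    metadata = pd.read_excel(METADATA_PATH)  # reading .xlsx needs openpyxl
    return str(metadata["container"].iloc[0])

container_name = resolve_container()  # "dev" in Dev, "staging" in Staging, "prod" in Prod

Because deployment pipelines copy the pipeline definition but not the lakehouse data, each workspace keeps its own metadata values and the same Copy activity resolves a different container in each stage.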
Can anyone please respond to my issue? I really need to get this working.
Update: I tried creating a notebook that directly accesses and copies the file from blob storage:
from azure.storage.blob import ContainerClient

# The {…} values are placeholders for the real account, container, and token
sas_token = "{sas_token}"
container_url = "https://{storage_account}.blob.core.windows.net/{container_name}"
container_client = ContainerClient.from_container_url(container_url + "?" + sas_token)
However, when I deployed the notebook to the three environments ("Dev", "Test", "Production"), the URL in all of them automatically points to the dev storage container. Is there a way to pass parameters to the notebook dynamically, so that in each environment it fetches data from the respective container?
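One way to get per-environment values is a parameter cell: Fabric notebooks let you mark a cell as a parameter cell, and its values can be overridden per run, for example by a pipeline Notebook activity's base parameters in each stage. A minimal sketch, with placeholder names and the dev values as defaults:

# Parameter cell — toggle "parameter cell" on this cell in the Fabric UI.
# These defaults apply in Dev; the Test and Production runs override them
# (e.g. via a pipeline Notebook activity's base parameters).
storage_account = "mystorageaccount"  # placeholder
container_name = "dev-container"      # placeholder
sas_token = "<sas-token>"             # supply per run; don't hard-code secrets

from azure.storage.blob import ContainerClient

container_url = f"https://{storage_account}.blob.core.windows.net/{container_name}"
container_client = ContainerClient.from_container_url(container_url + "?" + sas_token)

# List blobs to confirm the right container was picked up in this environment
for blob in container_client.list_blobs():
    print(blob.name)

Alternatively, the metadata-file approach described in the accepted solution works from a notebook as well, since each workspace reads its own copy of the file.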
Hi @usmanf07
Thanks for using Microsoft Fabric Community.
Apologies for the inconvenience.
For deeper insights and potential solutions, we recommend reaching out to our support team; their expertise will help identify the most appropriate approach.
Please raise a support ticket to reach them:
https://support.fabric.microsoft.com/support
After creating a support ticket, please share the ticket number here so we can track it for more information.
Thank you.
Hello, I submitted the ticket and my ticket number is: 2406120010002765