surbhi
Helper I

Power BI deployment pipeline

I am creating a deployment pipeline with only two stages, Dev and Prod. In the Dev workspace I publish my report, with Azure Blob Storage (behind private endpoints) as the source; this storage receives a new CSV about once a month. I want the Prod workspace, which is for end users, to refresh automatically on a schedule whenever the storage account gets a new CSV. Do I need to set the connection manually in the Prod workspace too? Please help.

2 ACCEPTED SOLUTIONS
rohit1991
Super User

Hi @surbhi 

No, you don’t have to reconnect every month. After the first deploy, set the data source credentials once in the Prod workspace (they don’t copy from Dev). Keep your blob path in parameters and use deployment rules so Dev values switch to Prod automatically. Because you’re on private endpoints, make sure Prod uses the correct gateway (Managed VNet or on-prem/VNet). For refresh, either keep a normal schedule or use Power Automate to trigger a refresh when a new CSV lands in the storage. Optional: turn on Incremental Refresh if the file appends data to speed things up.


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!


v-hashadapu
Community Support

Hi @surbhi , Thank you for reaching out to the Microsoft Community Forum.

 

@rohit1991 is correct: data source credentials do not carry over from Dev to Prod, so you'll need to set them once in the Prod workspace. The best practices he mentions, using parameters for your blob path, ensuring the correct gateway is configured for private endpoints, and handling refresh either through a schedule or with Power Automate when a new file lands, are all sound.

 

In your specific case though, since both Dev and Prod point to the same blob storage account, you don’t actually need to set up deployment rules for different parameter values because the path stays the same across environments. You can still keep parameters defined for flexibility, but the values won’t change between stages. Practically, all you need to do is configure the credentials in Prod once, make sure the dataset there is using the right gateway and set up either a scheduled refresh or an automated trigger when new files are added to storage.
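If you'd rather script the "automated trigger when new files are added" part instead of building a Power Automate flow (for example, from an Azure Function with a blob trigger), the Power BI REST API exposes a refreshes endpoint for datasets. A minimal Python sketch; the workspace/dataset IDs and the AAD bearer token acquisition are placeholders, not details from this thread:

```python
import json
import urllib.request

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(workspace_id: str, dataset_id: str) -> str:
    # "Refresh Dataset In Group" endpoint of the Power BI REST API
    return f"{POWER_BI_API}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(token: str, workspace_id: str, dataset_id: str) -> int:
    # POST queues an asynchronous refresh; HTTP 202 means it was accepted
    req = urllib.request.Request(
        refresh_url(workspace_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The token needs a Power BI scope such as Dataset.ReadWrite.All; if you use a service principal, the tenant and workspace must allow service principal access.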

 

Thank you @rohit1991  for your continuous contributions to the community.


4 REPLIES

Thanks @rohit1991. What rules exactly do I have to set up for the Prod workspace? I don't get this part: "Keep your blob path in parameters and use deployment rules so Dev values switch to Prod automatically". How do I do this? Note: we don't have separate Dev and Prod blob storage accounts; there is only one storage account.

Hi @surbhi 

Could you please try the steps below:

1. In Power Query, create parameters: pAccountUrl, pContainer, pFolder. Use them in your Source step (no hard-coded path).

2. In Deployment pipeline >> your dataset >> … >> Deployment rules >> Parameters set:

  • pAccountUrl >> same in Dev & Prod (one storage account)

  • pContainer >> same in Dev & Prod

  • pFolder >> Dev: dev/files >> Prod: prod/files (or the same value if you don’t have two folders)

3. Click Deploy, then in Prod >> Dataset settings set Data source credentials once.

4. Turn on Scheduled refresh (or Power Automate trigger on new blob).
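As a sketch of step 1, the Source query in Power Query M might look like this once the hard-coded path is replaced by the three parameters (the container navigation and the pick of the latest CSV are illustrative; adjust to your actual file layout):

```m
let
    // pAccountUrl, pContainer, pFolder are Power Query parameters
    Source = AzureStorage.Blobs(pAccountUrl),
    Container = Source{[Name = pContainer]}[Data],
    Scoped = Table.SelectRows(Container, each Text.StartsWith([Name], pFolder)),
    Latest = Table.Sort(Scoped, {{"Date modified", Order.Descending}}){0},
    Csv = Csv.Document(Latest[Content], [Delimiter = ","])
in
    Csv
```

Because the path comes only from the parameters, the deployment rules in step 2 are the single place where Dev and Prod values could differ.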


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!
