

Eric7
Resolver II

Deployment Pipeline, staging

Hi, a question about best practices when using Deployment Pipelines. When deploying a dataset with a report from Test to Production, it seems that only the metadata is copied, not the refreshed data in the dataset. The production version of the dataset is overwritten, so it needs to be refreshed before the deployed content can be used by anyone. In my case this refresh takes an hour, so the report based on the dataset is effectively unavailable to users for all that time. It would be better if the original dataset in production were not overwritten until the refresh finished, as is the case when publishing directly from Power BI Desktop to the Power BI Service. Is there a better way to do this?

 

[Attached screenshot: Eric7_0-1645022610239.png]

 

1 ACCEPTED SOLUTION

Ok. I will check out the pipeline extension you referred to. For now I have "solved" the issue by creating alternate workspaces, each with its own pipeline, which I use for staging and swapping when deploying into production.


4 REPLIES
Nimrod_Shalit
Power BI Team

@Eric7,

to reiterate @edhans' point: as long as there are no breaking changes, you should be able to deploy into the Production dataset without any downtime for the users. When there are breaking changes, you will get a notification and can decide to stop the deployment.
Another option is to use the pipeline's Azure DevOps (ADO) extension to schedule deployments at night, and run a refresh right after that.
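The "deploy at night, then refresh" approach above can also be scripted directly against the Power BI REST API, using the Pipelines "Deploy All" operation followed by a dataset refresh. A minimal sketch (the GUIDs are placeholders, and authentication via an Azure AD bearer token is omitted; the actual POSTs would be sent with a library such as `requests`):

```python
import json

# Placeholder IDs -- replace with your own pipeline/workspace/dataset GUIDs.
PIPELINE_ID = "11111111-1111-1111-1111-111111111111"
GROUP_ID = "22222222-2222-2222-2222-222222222222"
DATASET_ID = "33333333-3333-3333-3333-333333333333"
BASE = "https://api.powerbi.com/v1.0/myorg"

def deploy_all_request(pipeline_id: str):
    """Build the 'Pipelines - Deploy All' call: deploy the Test stage
    (sourceStageOrder=1) into the next stage, Production."""
    url = f"{BASE}/pipelines/{pipeline_id}/deployAll"
    body = {
        "sourceStageOrder": 1,  # 0 = Development, 1 = Test
        "options": {
            "allowCreateArtifact": True,
            "allowOverwriteArtifact": True,
        },
        "note": "Nightly scheduled deployment",
    }
    return url, body

def refresh_request(group_id: str, dataset_id: str):
    """Build the 'Datasets - Refresh Dataset In Group' call, to run
    right after the deployment has finished."""
    url = f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    body = {"notifyOption": "MailOnFailure"}
    return url, body

if __name__ == "__main__":
    for url, body in (deploy_all_request(PIPELINE_ID),
                      refresh_request(GROUP_ID, DATASET_ID)):
        print("POST", url)
        print(json.dumps(body, indent=2))
```

Scheduling this script (e.g. from an ADO pipeline or a cron job) gives the same effect as the extension: the hour-long refresh still happens, but at night when no users are affected.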


edhans
Super User

My understanding is that the data will not be overwritten unless you have changed the model, so that the model of the production dataset no longer matches the new metadata coming from Test.

 

The assumption is (right or wrong) that by the time you get to Production, the model is pretty stable.

Can you give more specifics on the changes you are making?




Ok thanks. So what I am looking for is a way to deploy from Test to Production (using the Deployment Pipeline), with changes to the model, where the target dataset is not overwritten until the very moment the refresh is finished, so that the report stays accessible and functional for users the whole time. That is how I assume it works when I deploy from Power BI Desktop to the Power BI Service. An hour of downtime for every such deployment is not really acceptable for our customers.

 

Is it perhaps possible to achieve this some other way, e.g. by using the XMLA endpoints?
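(For readers wondering what an XMLA-based refresh looks like: the workspace's XMLA endpoint, available on Premium/Premium Per User capacity, accepts TMSL commands, so a refresh can be triggered and controlled outside the Service UI. A minimal sketch that just builds the TMSL `refresh` command; the dataset name is a placeholder, and the command would be executed against `powerbi://api.powerbi.com/v1.0/myorg/<workspace>` with a client such as SSMS or the `Invoke-ASCmd` PowerShell cmdlet:)

```python
import json

def tmsl_full_refresh(database_name: str) -> str:
    """Build a TMSL 'refresh' command for the given dataset (a dataset is
    exposed as a database over the XMLA endpoint)."""
    command = {
        "refresh": {
            "type": "full",  # reload data and recalculate dependents
            "objects": [{"database": database_name}],
        }
    }
    return json.dumps(command, indent=2)

# "SalesDataset" is a placeholder dataset name.
print(tmsl_full_refresh("SalesDataset"))
```

Note that this only gives finer control over the refresh itself; on its own it does not prevent the pipeline from overwriting the production dataset before the refresh completes.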

 

It would also be useful to be able to schedule the deployment for nighttime in some way.

 

E
