
msam1977_SM
Frequent Visitor

Editing a Large Semantic Model (LSM) Dataset with Incremental Refresh in Power BI Service

Hi Community,

I have a Power BI dataset published to the service that uses the large semantic model storage format and has incremental refresh enabled.

 

When I try to download the PBIX from the Service, I get the error:
"This file can't be downloaded yet."

I understand this is by design for large models, but my challenge is that I need to edit the model because one of the tables stores data in JSON format, and I need to make updates.

 

Questions:

  1. Is there any way to temporarily disable Large Semantic Model Storage Format, download the PBIX, make changes, and then re-enable it?
  2. If I instead re-publish from a local PBIX, will that overwrite and break the existing dataset (which is being used actively in reports)?
  3. What is the best practice for making structural changes to datasets that are already in Large Model format with incremental refresh enabled?

The goal is to make sure I can edit the model without disrupting the existing reports and usage.

Any guidance from people who have faced this would be very helpful.

Thanks in advance!

1 ACCEPTED SOLUTION
vivien57
Impactful Individual

Hello @msam1977_SM ,

Question 1:

No, it is not possible to temporarily disable the large semantic model storage format just to get the PBIX back. Once a model uses this format, or once incremental refresh has created partitions in the service, downloading the PBIX file is blocked because the service-side model can no longer be packaged back into a PBIX.

The only way to retrieve an editable file is to work from the original PBIX file used for publishing. If you no longer have it, you will need to rebuild the model manually or use tools such as Tabular Editor to make changes directly to the model in the service.

Question 2:

Yes, republishing a local PBIX file will overwrite the existing dataset in the Power BI service. This includes:

  • The model structure
  • Incremental refresh settings
  • Relationships and measures
  • Data that has already been loaded (the partitions built up by incremental refresh are replaced, so historical data has to be reloaded)

This may disrupt existing reports that depend on this dataset, especially if fields or tables are modified or deleted.
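
Before republishing, it can help to check what is bound to the dataset. The snippet below is a hypothetical sketch using semantic-link in a Fabric notebook (the workspace name is a placeholder); it simply lists the reports and semantic models in the workspace so you can gauge the impact of an overwrite.

import sempy.fabric as fabric

# List the reports and semantic models in the workspace to see what a republish could affect.
reports = fabric.list_reports(workspace='Your Workspace Name')
datasets = fabric.list_datasets(workspace='Your Workspace Name')
print(reports)
print(datasets)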

Question 3:

 

  • Use Tabular Editor: Tabular Editor lets you edit the model directly in the Power BI service via the XMLA endpoint. You can edit measures, calculated columns, DAX expressions, and even some table properties. This avoids republishing the PBIX file and preserves existing data and reports (see the first sketch after this list).

  • Work on a local copy: if you have the original PBIX file, make the changes locally, publish to a test workspace, check that everything works (refresh, reports, etc.), and then publish to the production workspace outside of business-critical hours.

  • Use dynamic parameters and sources: for JSON data, consider pre-processing the data in Power Query or via an external pipeline (Azure Data Factory, Fabric, etc.). This keeps the model simpler and avoids frequent structural changes (see the second sketch after this list).
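
To make the first bullet concrete: the same kind of edit can be scripted from a Fabric notebook instead of the Tabular Editor UI. The snippet below is a minimal, hypothetical sketch using the semantic-link-labs TOM wrapper over the XMLA endpoint; the dataset, workspace, table, and measure names are placeholders, and it assumes the workspace is on a Premium/Fabric capacity with the XMLA endpoint set to read/write.

%pip install semantic-link-labs
from sempy_labs.tom import connect_semantic_model

# Open a read/write connection to the published model over the XMLA endpoint.
# Changes made inside the block are written back to the service when it exits.
with connect_semantic_model(dataset='Your Dataset Name', workspace='Your Workspace Name', readonly=False) as tom:
    # Example change: add a measure without republishing the PBIX or touching the loaded data.
    tom.add_measure(table_name='Sales', measure_name='Total Amount', expression='SUM ( Sales[Amount] )')

Changes applied this way do not require republishing; anything that alters a table's source query will still need a refresh afterwards to take effect.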

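To make the third bullet concrete, here is a small, hypothetical sketch of flattening a JSON column outside the model (for example in a pipeline step) so that Power Query only receives flat columns; the file and column names are made up for illustration.

import json
import pandas as pd

# Source table where one column ('payload') holds raw JSON documents as strings.
raw = pd.read_csv('source_with_json.csv')

# Parse each JSON string and expand the nested fields into ordinary columns.
parsed = raw['payload'].apply(json.loads)
flat = pd.json_normalize(parsed.tolist())

# Write a flat file that the semantic model can load without any JSON handling.
result = pd.concat([raw.drop(columns='payload'), flat], axis=1)
result.to_parquet('preprocessed_for_powerbi.parquet')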

Don't hesitate to give a kudo and validate as a solution if my answer helped you.

Have a nice day,

Vivien

 




5 REPLIES
Jai-Rathinavel
Super User

Hi @msam1977_SM, if you have a Fabric capacity-enabled workspace, you can extract the model's .bim file from the dataset by running the code below in a Fabric notebook:

%pip install semantic-link-labs==0.11.3
import sempy_labs as labs

# Retrieve the Model.bim definition of the published dataset
my_bim = labs.get_semantic_model_bim(dataset='Your Dataset Name', workspace='Your Workspace Name')

# Create a copy of the dataset in the target workspace from that definition
labs.create_semantic_model_from_bim(dataset='Dataset-Copy', workspace='Your Workspace Name', bim_file=my_bim)

 

Once the copy appears in the target workspace, you can download it, refresh it locally, and work on it.

[Screenshot attached: JaiRathinavel_0-1756536742487.png]

 

Thanks,

Jai Rathinavel

 




Did I answer your question? Mark my post as a solution!

Proud to be a Super User!





msam1977_SM
Frequent Visitor

Hi Vivien, thank you so much. The issue I'm facing now is this: as you suggested, I opened the semantic model in Tabular Editor via the XMLA endpoint, and that part works. However, when I try to modify any of the queries, everything appears disabled, such as the source and the source type.
How would you suggest making changes in this scenario? Thanks.

Hello @msam1977_SM ,

Can you send screenshots to visualize the problem?

Have a nice day,

Vivien

[Screenshot attached: msam1977_SM_0-1756822595962.png]

Above is a screenshot from my Tabular Editor; you can see that the Source part is disabled. Thanks.
