pstauduhar
Frequent Visitor

Edit Model in Service automatically converts to Large Semantic Model in Pro Workspace

We recently enabled the preview feature that allows editing models in the service. This had been working fine for us.

 

However, beginning yesterday, when we went to edit one of our semantic models, the service automatically converted the dataset to the large semantic model storage format.

[Screenshot: pstauduhar_0-1741273104145.png]

 

After this conversion, the editing page doesn't load and shows an error message that our dataset size has exceeded the allowed size for our Pro capacity. The reported amount is double the current dataset size. This was a fairly large dataset (809 MB) to begin with.

 

This semantic model is now unable to refresh, and the report pages will not load for end users. 

 

Has anybody run into this issue and been able to resolve it? All the documentation around large semantic models is for Premium capacity. There is no documentation that I can find around the preview feature that indicates this behavior, and it was working fine until yesterday.

 

Thanks

 

1 ACCEPTED SOLUTION
dkdata
Frequent Visitor

Hi, I am seeing the same issue, and it's happening on all datasets.

The only way I was able to resolve this was to upload a smaller version of the same model by filtering the data, convert it back to the small dataset storage format, and then upload the full model again.

This was not the case previously; I was able to edit the model without any conversion. I'm not sure whether this is the intended functionality going forward or just a bug. If it's intended, this feature will become useless to Pro users who have large models.

The service does not allow you to do anything to a model once its size exceeds 1 GB on a Pro license.

This is a preview feature, so things are likely to break. Hopefully this is just a bug and they revert to the previous experience.
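Before re-uploading, it can help to confirm which storage format the service currently reports for the model. Below is a minimal sketch against the Power BI REST API's Get Dataset In Group endpoint, assuming you already have an Azure AD access token with dataset read permissions; the workspace and dataset IDs are placeholders, and the `targetStorageMode` values (`Abf` for small, `PremiumFiles` for large) are taken from the Datasets API documentation:

```python
# Sketch: read a semantic model's storage format via the Power BI REST API.
# Assumes an Azure AD access token with dataset read scope; workspace and
# dataset IDs are placeholders you substitute with your own.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def dataset_url(workspace_id: str, dataset_id: str) -> str:
    """Build the Get Dataset In Group endpoint URL."""
    return f"{API}/groups/{workspace_id}/datasets/{dataset_id}"

def get_storage_mode(token: str, workspace_id: str, dataset_id: str) -> str:
    """Return 'Abf' (small storage format) or 'PremiumFiles' (large)."""
    req = urllib.request.Request(
        dataset_url(workspace_id, dataset_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Some datasets omit the field; 'Abf' is the small-format default.
    return body.get("targetStorageMode", "Abf")
```

Checking this before and after the filtered re-upload confirms whether the conversion actually stuck.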


5 REPLIES
Anonymous
Not applicable

Hi @pstauduhar ,

Thank you for reaching out to the Microsoft Fabric Community, and thank you @dkdata for your valuable input. As already mentioned, since this is a preview feature, there may be some limitations. Please refer to the documentation below:

Edit data models in the Power BI service (preview) - Power BI | Microsoft Learn

 

Additionally, please try these troubleshooting steps, which might help resolve the issue:

1. Check your Workspace Settings (Power BI Service > Workspace Settings) to see whether Large Semantic Model is enabled. If it is, try disabling it and republishing your dataset.
2. Republish the dataset from Power BI Desktop under a different name to keep it as a standard semantic model.
3. If the issue persists, publish the dataset to a different Pro workspace to see if the conversion is workspace-specific.
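For step 1, the workspace's default storage format can also be read programmatically. A hedged sketch using the Get Groups endpoint follows; the `defaultDatasetStorageFormat` property (`Small` or `Large`) only appears on capacity-backed workspaces, and the token handling here is an assumption, not a full auth flow:

```python
# Sketch: look up a workspace's default semantic model storage format.
# Assumes an Azure AD token with workspace read scope; the
# 'defaultDatasetStorageFormat' property ('Small'/'Large') may be absent
# on workspaces that are not on a capacity.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def pick_workspace(groups: list, workspace_id: str) -> dict:
    """Find one workspace by id in a Get Groups response payload."""
    for g in groups:
        if g.get("id") == workspace_id:
            return g
    raise ValueError(f"workspace {workspace_id} not found")

def get_default_storage_format(token: str, workspace_id: str) -> str:
    req = urllib.request.Request(
        f"{API}/groups",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        groups = json.load(resp)["value"]
    # Absent property means the workspace uses the small-format default.
    return pick_workspace(groups, workspace_id).get(
        "defaultDatasetStorageFormat", "Small"
    )
```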

 

As per Microsoft’s official documentation, Large Semantic Models are meant for Premium capacity, so if this is an intentional change, Pro users with large datasets may need to either optimize dataset size or consider Premium capacity.

 

I hope my suggestions give you a good idea of where to start. If you need any further assistance, feel free to reach out.

If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you. 


This is exactly what I did prior to seeing your comment, and it worked. It was very frustrating. I really hope that it is a bug during the preview period.

That's great news. I tried editing my data model yesterday and it seemed to be resolved, so it looks like it was a bug.

pstauduhar
Frequent Visitor

I would add that we are also unable to revert the model to the small dataset size; we receive this message every time we try:

[Screenshot: pstauduhar_0-1741273448487.png]
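When the UI refuses the conversion, it may be worth attempting the same change through the REST API. Below is a sketch (not guaranteed to succeed where the UI fails) that PATCHes the dataset's `targetStorageMode` back to `Abf`, the small storage format, via the Update Dataset In Group endpoint; the token and IDs are placeholders:

```python
# Sketch: attempt to revert a semantic model to the small storage format
# by PATCHing targetStorageMode via Update Dataset In Group.
# Assumes an Azure AD token with dataset write scope; if the service rejects
# the change (as the UI does here), urlopen raises an HTTPError.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def revert_body() -> bytes:
    """JSON payload: 'Abf' = small storage format, 'PremiumFiles' = large."""
    return json.dumps({"targetStorageMode": "Abf"}).encode("utf-8")

def revert_to_small(token: str, workspace_id: str, dataset_id: str) -> None:
    req = urllib.request.Request(
        f"{API}/groups/{workspace_id}/datasets/{dataset_id}",
        data=revert_body(),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)  # returns without error on HTTP 200
```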

 
