
Matthew_G
Advocate I

Change semantic model size from small to large

Hi,

 

When creating a Warehouse, a semantic model is created by default for that warehouse. I ran into a problem where the semantic model was no longer refreshing because it was too large. In the workspace settings our admin changed the default storage format to "Large semantic model storage format", but it looks like this only applies to models created going forward.

[screenshot: workspace settings, default storage format]

 

How can I change an existing semantic model's storage format? I don't see any option in the semantic model settings.

[screenshot: semantic model settings]
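One programmatic route that may work here, assuming the default warehouse model behaves like a regular dataset: the Power BI REST API lets a dataset owner read and patch the dataset's targetStorageMode ("Abf" for small, "PremiumFiles" for large). A minimal Python sketch; the workspace ID, dataset ID, and token are placeholders:

import requests

# Placeholders: supply your workspace (group) ID, dataset ID, and an
# Azure AD access token with Dataset.ReadWrite.All permission.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
TOKEN = "<aad-access-token>"

headers = {"Authorization": f"Bearer {TOKEN}"}
base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"

# Read the current storage format first.
current = requests.get(base, headers=headers)
current.raise_for_status()
print("targetStorageMode:", current.json().get("targetStorageMode"))

# "PremiumFiles" is the large storage format; "Abf" is the small one.
resp = requests.patch(base, headers=headers,
                      json={"targetStorageMode": "PremiumFiles"})
resp.raise_for_status()
print("Update accepted:", resp.status_code)

Whether Fabric exposes a warehouse's default semantic model through this endpoint is an assumption; it is worth trying on a non-production model first.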

 

Thanks!

Kind Regards,

Matthew


4 REPLIES
AndyDDC
Most Valuable Professional

Hi @Matthew_G, what is the error you're getting on the default semantic model? I wouldn't have thought it would fill up, as it should only be connected via DirectQuery or Direct Lake. Are there also import tables?

Hi @AndyDDC, I noticed that although new data was being imported into the fact table in the warehouse, the data was not being refreshed/updated in the PBI report. It is a Direct Lake connection with no other imports. When I go to the refresh history under the semantic model settings I get the following error:

An error has occurred while framing the dataset ########-####-####-####-########### error: Microsoft.AnalysisServices.OperationException: Failed to save modifications to the server. Error returned: 'Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 0 MB, memory limit 25600 MB, database size before command execution 33078 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more. '. at Microsoft.AnalysisServices.Tabular.Model.SaveChangesImpl(SaveFlags flags, Int32 maxParallelism) at Microsoft.ASWL.Service.Engine.SeethruAutoSync.SeethruAutoSyncManager.<InvokeFramingAsync>d__32.MoveNext() in /_/ASWL.Service/Engine/SeethruAutoSync/SeethruAutoSyncManager.cs:line 0.
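The same refresh history, including the full error text above, can also be pulled with the REST API's Get Refresh History endpoint, assuming the model is reachable there. A short Python sketch with placeholder IDs and token:

import requests

GROUP_ID = "<workspace-guid>"      # placeholder
DATASET_ID = "<dataset-guid>"      # placeholder
TOKEN = "<aad-access-token>"       # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes?$top=10")
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Each entry has a status ("Completed"/"Failed") and, on failure,
# a serviceExceptionJson string carrying the engine error.
for refresh in resp.json()["value"]:
    print(refresh["startTime"], refresh["status"])
    if refresh["status"] == "Failed":
        print("  error:", refresh.get("serviceExceptionJson"))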

I created a new warehouse that has the large semantic model format by default, copied the data over to the new warehouse, and was then able to connect and refresh the reports without issue. I assume the small semantic model could not be refreshed due to a size limitation: the error shows the database size before the command (33,078 MB) already exceeded the 25,600 MB memory limit.
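For the copy step, one option is CTAS over the new warehouse's SQL endpoint, since Fabric Warehouse supports cross-warehouse queries within a workspace. A rough Python sketch via pyodbc, where the server name, warehouse names, and table are all placeholders and the authentication mode will vary by environment:

import pyodbc

# Placeholder connection string to the new warehouse's SQL endpoint;
# ActiveDirectoryInteractive is one of several supported auth modes.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace-sql-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=NewWarehouse;"
    "Authentication=ActiveDirectoryInteractive;"
)

# CTAS copies schema and data in one statement; the three-part name
# references the old warehouse in the same workspace (assumed names).
ctas = """
CREATE TABLE dbo.FactSales
AS SELECT * FROM [OldWarehouse].dbo.FactSales;
"""

with pyodbc.connect(conn_str, autocommit=True) as conn:
    conn.execute(ctas)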

AndyDDC
Most Valuable Professional

OK, glad it's working now. So yes, it seems the Direct Lake paging limit comes into play, and you've spotted an issue that occurs when the small dataset storage format is used.

 

I reckon the docs here should point that out

 

https://learn.microsoft.com/en-us/power-bi/enterprise/directlake-overview

Anonymous
Not applicable

Hi @Matthew_G 

Glad to know your query got resolved. Please continue to use the Fabric Community for any further queries.
