Hi,
When creating a Warehouse, a semantic model is created by default for that warehouse. I ran into a problem where the semantic model was no longer refreshing because it was too large. In the workspace settings, our admin changed the default storage format to "Large semantic model storage format", but it looks like this only applies to models created from that point forward.
How can I change the storage format of an existing semantic model? I don't see any option for it in the semantic model settings.
Thanks!
Kind Regards,
Matthew
Hi @Matthew_G, what is the error you're getting on the default semantic model? I wouldn't have thought it would get full, as it should only be connected via DirectQuery or Direct Lake. Are there also import tables?
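One quick way to check is to list the model's partitions and their storage modes from a Fabric notebook. A sketch using the semantic-link (sempy) package; "MyWarehouse" is a placeholder for the default semantic model's name, and the DAX INFO functions need a reasonably recent engine:

```python
import sempy.fabric as fabric

# Dump the model's partitions; the Mode column distinguishes Import,
# DirectQuery and Direct Lake, so a pure Direct Lake model should have
# no Import rows. "MyWarehouse" is a placeholder for the default
# semantic model's name.
partitions = fabric.evaluate_dax("MyWarehouse", "EVALUATE INFO.PARTITIONS()")
print(partitions)
```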
Hi @AndyDDC, I noticed that although new data was being imported into the fact table in the warehouse, the data was not being refreshed/updated in the PBI report. It is a Direct Lake connection with no other imports. When I go to the refresh history under the semantic model settings, I get the following error:
An error has occurred while framing the dataset ########-####-####-####-########### error: Microsoft.AnalysisServices.OperationException: Failed to save modifications to the server. Error returned: 'Resource Governing: This operation was canceled because there wasn't enough memory to finish running it. Either reduce the memory footprint of your dataset by doing things such as limiting the amount of imported data, or if using Power BI Premium, increase the memory of the Premium capacity where this dataset is hosted. More details: consumed memory 0 MB, memory limit 25600 MB, database size before command execution 33078 MB. See https://go.microsoft.com/fwlink/?linkid=2159753 to learn more. '. at Microsoft.AnalysisServices.Tabular.Model.SaveChangesImpl(SaveFlags flags, Int32 maxParallelism) at Microsoft.ASWL.Service.Engine.SeethruAutoSync.SeethruAutoSyncManager.<InvokeFramingAsync>d__32.MoveNext() in /_/ASWL.Service/Engine/SeethruAutoSync/SeethruAutoSyncManager.cs:line 0.
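The numbers in the message line up with a hard memory cap. A quick sanity check using the values copied straight from the error (the 25,600 MB limit appears to be the capacity's max-memory guardrail):

```python
# Values taken verbatim from the refresh error above.
memory_limit_mb = 25_600    # "memory limit 25600 MB"
database_size_mb = 33_078   # "database size before command execution 33078 MB"

overshoot_mb = database_size_mb - memory_limit_mb
print(f"Model is {overshoot_mb} MB (~{overshoot_mb / 1024:.1f} GB) over the limit, "
      f"so resource governing cancels the framing operation.")
```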
I created a new warehouse that has the large semantic model format by default, copied the data over to the new warehouse, and was then able to connect and refresh the reports without issue. I assume the small-format semantic model could not be refreshed due to the size limitation.
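For anyone who would rather not rebuild the warehouse: it may also be possible to flip an existing model to the large storage format programmatically. A hedged sketch using the Power BI REST API's Update Dataset In Group call, which documents a targetStorageMode property for converting Premium datasets to the large format; I haven't verified it against a warehouse's default semantic model, and the IDs and token below are placeholders:

```python
import requests

# Sketch: ask the service to convert an existing semantic model to the
# large storage format. "PremiumFiles" is the documented value for the
# large format; whether the endpoint accepts it for a Fabric warehouse's
# default semantic model is unverified.
workspace_id = "<workspace-guid>"        # placeholder
dataset_id = "<semantic-model-guid>"     # placeholder
token = "<aad-access-token>"             # placeholder, e.g. acquired via MSAL

resp = requests.patch(
    f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"targetStorageMode": "PremiumFiles"},
)
resp.raise_for_status()  # 200 on success
```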
OK, glad it's working now. So yes, it seems the Direct Lake paging/memory limit comes into play, and you've spotted an issue that occurs when the small dataset storage format is used.
I reckon the docs here should point that out:
https://learn.microsoft.com/en-us/power-bi/enterprise/directlake-overview
Hi @Matthew_G,
Glad to know your query got resolved. Please continue using the Fabric Community for any further queries.