
FWPBIDev
Frequent Visitor

Quantifying LSM format vs SSM format with P2 Capacity - Same Model; Separate Workspaces

We have a semantic model approaching 10GB and growing. It's been in small semantic model format since initial development. The report utilizes RLS and has hundreds of subscriptions. We get irregular capacity spikes (we surmise they're due to certain subscriptions kicking off, since that's what we've traced them back to so far, but it's not a definitive conclusion yet).

We made a copy of the same model and report and placed them in another P2 workspace (both workspaces are on P2 capacity), but converted the copy to large semantic model format. Alongside continued troubleshooting of our spikes, we'd like to quantify, from a capacity utilization standpoint, the performance gains we've received from moving to LSM format.

Does anyone have any idea how best to achieve this? Looking for any and all suggestions. Thanks in advance.


1 ACCEPTED SOLUTION
Poojara_D12
Super User

Hi @FWPBIDev 

To quantify the performance gains from moving a Power BI semantic model from small semantic model (SSM) format to large semantic model (LSM) format, especially in terms of capacity utilization and query performance, there are several steps you can take:

  1. Monitor capacity metrics – Use the Power BI Admin Portal to track CPU usage, memory consumption, and query performance in both workspaces. Comparing query durations and response times before and after the switch to LSM lets you directly assess the impact of the format change.
  2. Break down query execution – Tools like Power BI Query Diagnostics let you see the time spent in each phase of query execution and identify performance bottlenecks.
  3. Run load tests – Simulate real-world usage, focusing on the scenarios involving hundreds of subscriptions that you suspect cause the spikes. By triggering the same set of subscriptions in both the SSM and LSM environments, you can compare query execution times and system resource utilization under high demand.
  4. Analyze subscription activity – This can reveal whether the spikes are caused by specific report refreshes or by concurrent subscriptions.
  5. Use the Power BI Premium Capacity Metrics app – It gives a more granular view of resource consumption on your P2 capacity and lets you correlate usage patterns with report activity.
  6. Gather end-user feedback – Subjective report responsiveness complements the data-driven analysis.

Following these steps, you'll be able to quantify the improvements in terms of reduced resource consumption, better handling of concurrent loads, and overall system efficiency after migrating to LSM format.
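The load-test step can be sketched in a few lines of Python. This is a hedged example, not a finished tool: the Execute Queries REST endpoint is real, but the dataset IDs, bearer token, sample DAX query, and concurrency level below are all placeholders you would replace with your own values. Firing the same query from many threads roughly mimics a burst of subscriptions hitting the model at once.

```python
"""Load-test sketch: time the same DAX query against the SSM and LSM copies.

Assumptions (replace with real values): dataset IDs, the AAD bearer token,
the sample query, and the concurrency settings.
"""
import json
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

API = "https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries"
DAX_QUERY = "EVALUATE ROW(\"rows\", COUNTROWS('Sales'))"  # placeholder query


def run_query(dataset_id: str, token: str, dax: str = DAX_QUERY) -> float:
    """Execute one DAX query via the REST API; return wall-clock seconds."""
    body = json.dumps({"queries": [{"query": dax}]}).encode()
    req = urllib.request.Request(
        API.format(dataset_id=dataset_id), data=body, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    start = time.perf_counter()
    with urllib.request.urlopen(req) as resp:
        resp.read()
    return time.perf_counter() - start


def summarize(durations: list[float]) -> dict:
    """Pure helper: median and p95 of a list of query durations."""
    ordered = sorted(durations)
    p95_index = max(0, int(round(0.95 * (len(ordered) - 1))))
    return {"median": statistics.median(ordered),
            "p95": ordered[p95_index],
            "n": len(ordered)}


def load_test(dataset_id: str, token: str, workers: int = 10,
              calls: int = 50) -> dict:
    """Fire `calls` identical queries with `workers` concurrent threads."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(run_query, dataset_id, token)
                   for _ in range(calls)]
        return summarize([f.result() for f in futures])


if __name__ == "__main__":
    token = "<your-AAD-token>"  # placeholder
    print("SSM:", load_test("<ssm-dataset-id>", token))
    print("LSM:", load_test("<lsm-dataset-id>", token))
```

Running the same burst against both workspaces and comparing the median and p95 durations gives you a concrete number for the LSM gain under concurrent load.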

 

 

Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"

Kind Regards,
Poojara - Proud to be a Super User
Data Analyst | MSBI Developer | Power BI Consultant
Consider subscribing to my YouTube for Beginner/Advanced concepts: https://youtube.com/@biconcepts?si=04iw9SYI2HN80HKS


10 REPLIES
rohit1991
Super User

Hi @FWPBIDev ,
To quantify performance gains from LSM vs. SSM in P2 capacity:

  1. Use Fabric Capacity Metrics App – Monitor CPU, memory, and query performance.
  2. Load Testing – Simulate user interactions and subscriptions, compare response times.
  3. Refresh Performance – Track refresh times between models.
  4. Query Performance – Use DAX Studio to analyze and compare execution times.

This will highlight LSM's efficiency in handling larger models and concurrent processes.
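Step 3 above (refresh performance) can be automated rather than eyeballed. The sketch below pulls refresh history for both models from the real GET .../refreshes REST endpoint and compares average refresh duration; the dataset IDs and token are placeholders, and only completed refreshes are counted.

```python
"""Sketch: compare average refresh duration between the SSM and LSM models.

Assumptions (replace with real values): the dataset IDs and bearer token.
"""
import json
import urllib.request
from datetime import datetime
from statistics import mean


def fetch_refreshes(dataset_id: str, token: str, top: int = 20) -> list[dict]:
    """Return the most recent refresh history entries for a dataset."""
    url = (f"https://api.powerbi.com/v1.0/myorg/datasets/"
           f"{dataset_id}/refreshes?$top={top}")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]


def mean_refresh_seconds(refreshes: list[dict]) -> float:
    """Pure helper: average duration of completed refreshes, in seconds.

    Timestamps are ISO 8601 strings like '2025-01-05T09:00:00Z'.
    """
    durations = []
    for r in refreshes:
        if r.get("status") != "Completed":
            continue  # skip failed or in-progress refreshes
        start = datetime.fromisoformat(r["startTime"].replace("Z", "+00:00"))
        end = datetime.fromisoformat(r["endTime"].replace("Z", "+00:00"))
        durations.append((end - start).total_seconds())
    return mean(durations) if durations else 0.0


if __name__ == "__main__":
    token = "<your-AAD-token>"  # placeholder
    for label, ds in [("SSM", "<ssm-dataset-id>"), ("LSM", "<lsm-dataset-id>")]:
        print(label, mean_refresh_seconds(fetch_refreshes(ds, token)), "s")
```

Because both copies refresh the same data, a sustained difference in these averages is directly attributable to the model format.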


Did it work? ✔ Give a Kudo • Mark as Solution – help others too!
Thank you very much

FWPBIDev
Frequent Visitor

So on the model we've been using that's under SSM format, the model size is just south of 10GB. The model we copied to the secondary workspace and opted for LSM is now showing 22GB model size. 

We're thinking the 22GB figure reflects the uncompressed (pre-shrink) size, while the 10GB figure is the compressed size.

Is our assumption reasonable? 

Anonymous
Not applicable

Hi @FWPBIDev,

 

Thank you for reaching out to Microsoft Fabric Community Forum.

 

Monitor capacity utilization: use the Power BI Premium Capacity Metrics app to track metrics such as CPU and memory usage, query durations, and query counts across both workspaces, and compare these metrics before and after the migration to LSM.

https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-install?tabs=1st 
https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app 

 

This application will enable you to directly compare the metrics between the two workspaces (SSM vs. LSM) and quantify the impact of the transition on your capacity.
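If you export the metrics-app data (e.g. to CSV), the workspace-to-workspace comparison can be scripted. This is a minimal sketch assuming hypothetical column names (`ItemName`, `CPU (s)`); adjust them to whatever your export of the Capacity Metrics app actually contains.

```python
"""Sketch: compare CPU seconds per item between two Capacity Metrics exports.

Assumptions: the CSV paths and the column names 'ItemName' and 'CPU (s)'
are placeholders -- rename them to match your actual export.
"""
import csv
from collections import defaultdict


def cpu_by_item(rows: list[dict], item_col: str = "ItemName",
                cpu_col: str = "CPU (s)") -> dict:
    """Pure helper: sum CPU seconds per item from exported metric rows."""
    totals: dict = defaultdict(float)
    for row in rows:
        totals[row[item_col]] += float(row[cpu_col])
    return dict(totals)


def compare(ssm: dict, lsm: dict) -> dict:
    """Per-item CPU delta; a negative value means the LSM copy used less CPU."""
    return {item: lsm.get(item, 0.0) - ssm.get(item, 0.0)
            for item in set(ssm) | set(lsm)}


def load_csv(path: str) -> list[dict]:
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


if __name__ == "__main__":
    delta = compare(cpu_by_item(load_csv("ssm_metrics.csv")),   # placeholder
                    cpu_by_item(load_csv("lsm_metrics.csv")))   # placeholder
    for item, d in sorted(delta.items(), key=lambda kv: kv[1]):
        print(f"{item}: {d:+.1f} CPU s")
```

Summing per item rather than per row smooths out individual subscription spikes and shows the sustained capacity cost of each format.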

 

If this post helps, then please consider Accepting as solution to help the other members find it more quickly, don't forget to give a "Kudos" – I’d truly appreciate it!

 

Regards,

Vinay Pabbu

Thank you, Vinay. 

Anonymous
Not applicable

Hi @FWPBIDev,

 

As we haven’t heard back from you, we wanted to kindly follow up to check if the solution provided for the issue worked? or Let us know if you need any further assistance?
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.

 

Regards,
Vinay Pabbu

lbendlin
Super User

This depends. It depends on a lot (A LOT) of factors, not least the compressibility of your data, including the sort order in which the original data is presented.

 

I don't think you will have much of a choice, as the SSM format will most likely be sunset sooner rather than later.

Interesting. Wasn't aware SSM will sunset soon. Sounds like I may be spinning my wheels then and spending time that would be better devoted to capacity utilization monitoring. We're willing to increase capacity but like others don't want to just jump without exploring other possible optimization alternatives. 

Man, this makes me realize how much I love just writing DAX! Thanks for the reply.

And you haven't even dabbled in query caching yet!
