We have a semantic model approaching 10GB and growing. It's been in small semantic model format since initial development. The report uses RLS and hundreds of subscriptions. We get irregular spikes (we surmise they're due to certain subscriptions kicking off, since that's what we've traced them back to so far, but it's not a definitive conclusion yet).
We made a copy of the same model and report and placed them in another P2 workspace (both workspaces are P2), but converted the copy to large semantic model format. Along with continued troubleshooting of our spikes, we'd like to quantify, from a capacity utilization standpoint, the performance gains we've received by moving to LSM format.
Does anyone have any idea how best to achieve this? Looking for any and all suggestions. Thanks in advance.
Hi @FWPBIDev
To quantify the performance gains from moving a Power BI semantic model from small semantic model (SSM) format to large semantic model (LSM) format, especially in terms of capacity utilization and query performance, there are several steps you can take:

1. Monitor capacity metrics. Use the Power BI Admin Portal to track CPU usage, memory consumption, and query performance in both workspaces. Comparing metrics like query duration and response times before and after switching to LSM lets you directly assess the impact of the format change.
2. Diagnose queries. Power BI Query Diagnostics breaks down the time spent in the different phases of query execution, helping you identify performance bottlenecks.
3. Run load tests. Simulate real-world usage, focusing on the scenarios involving hundreds of subscriptions that are suspected of causing the spikes. By triggering the same set of subscriptions in both the SSM and LSM environments, you can track query execution times and system resource utilization to measure the difference under high demand (see the sketch below).
4. Analyze subscription activity. This can reveal whether the spikes are caused by specific report refreshes or by concurrent subscriptions.
5. Use the Power BI Premium Capacity Metrics app. It tracks resource consumption on your P2 capacity at a more granular level and lets you correlate usage patterns with report activity.
6. Gather feedback from end-users on report responsiveness; subjective user experience complements the data-driven analysis.

By following these steps, you'll be able to quantify the improvements in terms of reduced resource consumption, better handling of concurrent loads, and overall system efficiency after migrating to LSM.
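As a starting point for step 3, here is a minimal sketch that times the same DAX query against both models via the Power BI executeQueries REST endpoint. The token, workspace/dataset IDs, and the sample query are placeholders you'd need to swap in; this is a rough comparison harness, not a full load-test tool.

```python
import time
import requests

# Placeholders - substitute your own tenant values (these IDs are not real).
TOKEN = "<azure-ad-access-token>"
DATASETS = {
    "SSM": ("<ssm-workspace-id>", "<ssm-dataset-id>"),
    "LSM": ("<lsm-workspace-id>", "<lsm-dataset-id>"),
}
# Any representative query your subscriptions would trigger (placeholder table name).
DAX_QUERY = "EVALUATE TOPN(100, 'Sales')"

def run_query(workspace_id: str, dataset_id: str) -> float:
    """Execute one DAX query via the executeQueries endpoint and
    return the wall-clock duration in seconds."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/executeQueries")
    body = {"queries": [{"query": DAX_QUERY}],
            "serializerSettings": {"includeNulls": True}}
    start = time.perf_counter()
    resp = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    return time.perf_counter() - start

# Run the same query several times against each model and compare.
for label, (ws, ds) in DATASETS.items():
    timings = [run_query(ws, ds) for _ in range(5)]
    print(f"{label}: avg {sum(timings) / len(timings):.2f}s, "
          f"min {min(timings):.2f}s, max {max(timings):.2f}s")
```

Since your report uses RLS, you may also need to add an "impersonatedUserName" field to the request body so the query executes under a representative user's roles, which keeps the comparison closer to what your subscriptions actually experience.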
Hi @FWPBIDev ,
To quantify performance gains from LSM vs. SSM on P2 capacity, compare capacity metrics (CPU, memory, query durations) across the two workspaces under the same workload. This will highlight LSM's efficiency in handling larger models and concurrent processes.
Thank you very much
So on the model we've been using that's under SSM format, the model size is just south of 10GB. The model we copied to the secondary workspace and opted for LSM is now showing 22GB model size.
We're thinking the 22GB figure is the size before compression, while the 10GB figure is the compressed size.
Is our assumption reasonable?
Hi @FWPBIDev,
Thank you for reaching out to Microsoft Fabric Community Forum.
Monitor capacity utilization: use the Power BI Premium Capacity Metrics app to track metrics such as CPU and memory usage, query durations, and query counts across both workspaces, and compare these metrics before and after the migration to LSM.
https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app-install?tabs=1st
https://learn.microsoft.com/en-us/fabric/enterprise/metrics-app
This application will enable you to directly compare the metrics between the two workspaces (SSM vs. LSM) and quantify the impact of the transition on your capacity.
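To complement the Metrics app and tie the spikes back to subscription activity, a sketch along these lines pulls events from the admin Activity Events REST API (requires a Power BI admin role; the token is a placeholder, and the exact activity names your tenant emits are an assumption you should verify against real payloads):

```python
from datetime import datetime, timedelta
import requests

TOKEN = "<azure-ad-access-token-with-admin-scope>"  # placeholder

# The activity events API accepts at most one UTC day per request.
day = datetime.utcnow().date() - timedelta(days=1)
start = f"'{day}T00:00:00.000Z'"
end = f"'{day}T23:59:59.999Z'"

url = ("https://api.powerbi.com/v1.0/myorg/admin/activityevents"
       f"?startDateTime={start}&endDateTime={end}")
headers = {"Authorization": f"Bearer {TOKEN}"}

events = []
# The API pages results via a continuation URI.
while url:
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    payload = resp.json()
    events.extend(payload.get("activityEventEntities", []))
    url = payload.get("continuationUri")

# Keep events plausibly tied to subscriptions, then bucket by hour to see
# whether they line up with the capacity spikes. (Filtering on the Activity
# field this way is an assumption; inspect your own payloads for the exact
# activity names.)
sub_events = [e for e in events
              if "subscription" in str(e.get("Activity", "")).lower()]
by_hour = {}
for e in sub_events:
    hour = e.get("CreationTime", "")[:13]  # e.g. '2024-05-01T09'
    by_hour[hour] = by_hour.get(hour, 0) + 1
for hour, count in sorted(by_hour.items()):
    print(hour, count)
```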
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!
Regards,
Vinay Pabbu
Thank you, Vinay.
Hi @FWPBIDev,
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided resolved your issue, or let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Vinay Pabbu
This depends. It depends on a lot (A LOT) of factors, not the least being the compressibility of your data, including the sort order in which the original data is presented.
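If you want to see what actually drives the footprint, one option is to pull the model's per-column storage statistics. A hedged sketch, assuming the newer DAX INFO functions are available in your engine version (the same data is also reachable via the classic $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS DMV over XMLA) and that the column names in the result may vary:

```python
import requests

TOKEN = "<azure-ad-access-token>"    # placeholder
WORKSPACE_ID = "<workspace-id>"      # placeholder
DATASET_ID = "<dataset-id>"          # placeholder

# INFO.STORAGETABLECOLUMNS exposes per-column storage details such as
# dictionary size; ranking columns by it hints at low-compressibility data.
# (Availability of this INFO function is an assumption about your engine.)
dax = "EVALUATE INFO.STORAGETABLECOLUMNS()"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
       f"/datasets/{DATASET_ID}/executeQueries")
resp = requests.post(
    url,
    json={"queries": [{"query": dax}]},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()

rows = resp.json()["results"][0]["tables"][0]["rows"]
if rows:
    # Row keys come back bracketed, e.g. '[DICTIONARY_SIZE]'; match loosely.
    key = next((k for k in rows[0] if "DICTIONARY_SIZE" in k), None)
    if key:
        for row in sorted(rows, key=lambda r: r.get(key, 0), reverse=True)[:10]:
            print(row)
```

Comparing the output across the SSM and LSM copies can show whether the 10GB vs. 22GB gap is genuinely different storage or just a different way of reporting size.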
I don't think you will have much of a choice, as SSM will most likely be sunset sooner rather than later.
Interesting. I wasn't aware SSM would be sunset soon. Sounds like I may be spinning my wheels, then, and spending time that would be better devoted to capacity utilization monitoring. We're willing to increase capacity but, like others, we don't want to just jump without exploring other possible optimization alternatives first.
Man, this makes me realize how much I love just writing DAX! Thanks for the reply.
And you haven't even dabbled in query caching yet!