Hi guys,
I joined a team that is very SQL heavy.
Basically, they have a SQL query that runs in a stored procedure, building a table in the format or structure each report needs.
Power BI then uses this table as its source.
If there are 50 reports, a table is built for each one, using the same data presented differently.
I think this was just because DAX skills were limited at the time.
Another reason was that the gateway kept failing when refreshing 50 models off the same source.
My thought is to have one semantic model with the data, have all the reports use that model, and do any presentation changes in DAX.
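For example, a rough sketch of what I mean (illustrative only; the table and column names are made up):

-- One shared base measure in the central model
Total Sales = SUM ( FactSales[SalesAmount] )

-- A report-specific view of the same data, still defined once in the model
Sales Prior Year =
CALCULATE ( [Total Sales], DATEADD ( 'Date'[Date], -1, YEAR ) )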
Are there any PROS and CONS?
PROS
Only one model to refresh, taking pressure off the gateway.
Less technical debt.
CONS
Performance hampered by 50 reports/users accessing the same semantic model at once?
Any other suggestions?
cheers
Ian
Hi @IanCockcroft, if your tables are pre-built in SQL and the only thing you need to do is a simple SELECT, this should not overload the gateway. However, consolidating into one model is not a bad idea, especially if you currently need to rebuild the same DAX in every report.
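For instance, a measure like this (hypothetical names) would live once in the consolidated model rather than being rebuilt in each of the 50 reports:

-- Defined once in the shared model, reused by every report
Sales YTD =
TOTALYTD ( SUM ( FactSales[SalesAmount] ), 'Date'[Date] )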
It really depends on how complex the semantic model will be as well as how big it will be.
Remember that with Power BI Pro you have a 1 GB limit per semantic model; with Power BI Premium it is 10 GB.
Hello @IanCockcroft.
Here's my view:
Pros: Having one semantic model means less maintenance, fewer refresh failures, and a single source of truth for all reports. It also takes the load off the gateway since only one model needs to refresh, improving reliability. (Using DAX instead of SQL for data shaping also gives more flexibility; a small sketch follows below.)
Cons: With many users and reports pulling data from the same model, performance can slow down, especially during heavy usage. Since the team is SQL-heavy, learning DAX might be a challenge, requiring training. In some cases, having a few smaller models for specific reports could help keep things running smoothly.
If performance becomes an issue with a single model, consider one core model for common data, plus a few smaller specialised models where needed.
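As a minimal sketch of that DAX flexibility (table and column names are made up), one filtered measure can replace what would otherwise be a dedicated report-specific SQL table:

-- Hypothetical: one filtered measure instead of a report-specific SQL table
West Region Sales =
CALCULATE (
    SUM ( FactSales[SalesAmount] ),
    FactSales[Region] = "West"
)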
Hi @IanCockcroft
I think the best way to go is to consolidate everything into a single semantic model. This way, all your reports can pull from the same data, and any formatting changes can be done with DAX. It’ll help cut down on repetition and save a lot of time in the long run since you won’t need separate tables for every single report. It also makes it easier to update things in the future, and you’ll be working with one central source of truth.
From a performance point of view, having just one model to refresh is way better than refreshing separate semantic models for each report, and it will save a lot of time. It’ll take the pressure off the gateway and reduce the chance of refresh failures since you’re only dealing with one model instead of 50. Plus, DAX gives us more flexibility compared to SQL, especially for calculations and transformations, so you can handle that directly in Power BI without needing to rely on SQL as much.
However, keep an eye on the model's size and complexity. If the dataset grows too big or if you have a ton of people hitting the reports at once, it could cause some slowdowns. In that case, you might need to think about aggregating data or organizing it better to avoid overloading the system.
Going for one unified semantic model and doing the transformations in DAX is the best route. It’ll make your life easier in terms of maintenance, help performance, and make report updates more straightforward.
To set this up, you can follow the steps below:
Step 1: Create a unified semantic model containing all the tables your reports need.
Step 2: Once the ETL is done in the semantic model, publish it to the Power BI service.
Step 3: Create reports on top of it.
NOTE: You only need to refresh the semantic model; all the reports created on top of it will reflect the refreshed data automatically.
For better performance, you can ingest the SQL data into a Power BI dataflow (essentially online Power Query) and fetch the transformed tables directly into the semantic model. Then you only need to add the data modelling part: relationships, calculated columns, and measures as the scenario requires (see the sketch after the notes below).
The rest of the process is the same after deploying the semantic model to the Power BI service.
NOTE: In this case, you first need to refresh the dataflow and then the semantic model.
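As an illustration of the kind of calculated column you might add at that stage (table and column names are hypothetical):

-- Hypothetical calculated column on the fact table, added after the dataflow lands the data
Order Size Band =
SWITCH (
    TRUE (),
    FactSales[SalesAmount] >= 10000, "Large",
    FactSales[SalesAmount] >= 1000, "Medium",
    "Small"
)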
Best regards,
Ray Minds
http://www.rayminds.com
https://www.linkedin.com/company/rayminds/
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.