Semantic model architecture best practice
Hi guys,
I joined a team that is very SQL-heavy. Basically, they have a SQL query that runs in a stored procedure, building a table in exactly the format and structure needed. Power BI then uses this table as its source. If there are 50 reports, a table is built for each one, using the same data presented differently.
I think this was partly because DAX skills were limited at the time. Another reason was that the gateway kept failing when refreshing 50 models off the same source.
My thought is to have one semantic model holding the data, have all the reports use that model, and handle any presentation changes in DAX.
What are the pros and cons? So far I have:
Pros:
- Only one model to refresh, taking pressure off the gateway
- Less technical debt
Cons:
- Performance possibly hampered by 50 reports/users hitting the same semantic model at once?
Any other suggestions?
Cheers,
Ian

Hi @IanCockcroft, if your tables are pre-built in SQL and all Power BI needs to do is a simple SELECT, that alone should not overload the gateway. That said, consolidating is not a bad idea, especially if you currently have to rebuild the same DAX in every report.
It really depends on how complex the semantic model will be, as well as how big it will get.
Remember that with Power BI Pro you have a 1 GB limit per semantic model, versus 10 GB with Power BI Premium.

Hello @IanCockcroft.
Here's my view:
Pros: Having one semantic model means less maintenance, fewer refresh failures, and a single source of truth for all reports. It also takes the load off the gateway, since only one model needs to refresh, which improves reliability. Using DAX instead of SQL for presentation-level shaping also gives you more flexibility.
Cons: With many users and reports pulling data from the same model, performance can slow down, especially during heavy usage. And since the team is SQL-heavy, learning DAX might be a challenge and will require some training. In some cases, having a few smaller models for specific reports could help keep things running smoothly.
If performance becomes an issue with a single model, consider keeping one core model for the common data and adding a few specialised models where needed.
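As a rough sketch of what "shaping in DAX" can look like (all table, column, and measure names here are hypothetical, not from your model), this is the kind of per-report logic that can live once in the shared model as measures instead of 50 SQL tables:

```
-- Hypothetical measures defined once on a shared Sales fact table;
-- each report reuses them instead of having its own SQL table.
-- Assumes a related 'Date' dimension table exists in the model.
Total Sales = SUM ( Sales[Amount] )

Sales YTD =
TOTALYTD ( [Total Sales], 'Date'[Date] )

EU Sales =
CALCULATE ( [Total Sales], Sales[Region] = "EU" )
```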
Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos" 🙂
Kind Regards,
Sahir Maharaj

Hi @IanCockcroft
I think the best way to go is to consolidate everything into a single semantic model. That way, all your reports can pull from the same data, and any presentation changes can be done with DAX. It'll cut down on repetition and save a lot of time in the long run, since you won't need a separate table for every single report. It also makes future updates easier, and you'll be working with one central source of truth.
From a performance point of view, having just one model to refresh is far better than refreshing a separate semantic model for each report; it saves a lot of time, takes the pressure off the gateway, and reduces the chance of refresh failures, since you're dealing with one model instead of 50. Plus, DAX gives you more flexibility than SQL for calculations and presentation-level transformations, so you can handle those directly in Power BI without relying on SQL as much.
However, please keep an eye on the model's size and complexity. If the dataset grows too big, or a lot of people hit the reports at once, you could see slowdowns. In that case, you might need to think about aggregating the data (see the sketch below) or organising the model better to avoid overloading the system.
Overall, going for one unified semantic model and doing the transformations in DAX is the best route. It'll make your life easier in terms of maintenance, help performance, and make report updates more straightforward.
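If aggregation does become necessary, one common pattern is a pre-summarised calculated table inside the same model. A minimal sketch, assuming a hypothetical Sales fact table and Date dimension (none of these names come from your setup):

```
-- Hypothetical: a pre-aggregated calculated table for visuals that
-- only need daily regional totals, keeping them off the large fact table.
Daily Sales Summary =
SUMMARIZECOLUMNS (
    'Date'[Date],
    Sales[Region],
    "Daily Amount", SUM ( Sales[Amount] )
)
```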
To set this up, you can follow the steps below:
Step 1: Create a unified semantic model
- Get the data/tables from SQL as your scenario requires.
- Do the data cleaning and shaping (transformations) in Power Query.
- Build the relationships in Model view.
- Add the required calculated tables, calculated columns, and measures (a small DAX sketch follows this step).
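To make that last bullet concrete, here is a minimal sketch of the kind of items you might define once in the unified model; the table and measure names are invented purely for illustration:

```
-- Hypothetical: a shared calculated date table, built once so every
-- report uses the same calendar.
Date =
CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2030, 12, 31 ) )

-- Hypothetical: a measure defined once and reused across all reports.
Order Count = COUNTROWS ( Orders )
```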
Step 2: Once the ETL is done in the semantic model, publish it to the Power BI service.
Step 3: Create reports on top of it
- Create a blank report, click Get Data, and choose Power BI semantic model (dataset) to connect to the tables and defined items (calculated tables, calculated columns, and measures) in the deployed semantic model.
- After building the report and adding the required visuals, publish it to the Power BI service.
NOTE: You only need to refresh the semantic model; all the reports built on top of it will pick up the refreshed data automatically.
For better performance, you can ingest the SQL data into a Power BI dataflow (essentially Power Query online) and pull the transformed tables directly into the semantic model. The model itself then only needs the modelling work: relationships, plus the calculated columns and measures your scenario requires.
The remaining steps are the same once the semantic model is deployed to the Power BI service.
NOTE: In this case, you first need to refresh the dataflow, then the semantic model.
Best regards,
Ray Minds
http://www.rayminds.com
https://www.linkedin.com/company/rayminds/
If this post helps, please consider accepting it as the solution to help other members find it more quickly.