What started as a reasonably sized model has grown significantly over the past year. It now takes an hour to refresh due to the number of transformations. That's not a problem for the nightly refresh, but it really slows down my development productivity.
Here is the architecture: the data is contained in several Excel files (think a dozen different formats, with multiple files of each format) and 50+ SharePoint lists.
This data is read into dataflows. The dataflows are read into a Power BI semantic model, where various transformations create a core model (think 10 facts, 100 dimensions, snowflake schema). This core model is used to generate one report used by the executives. This worked fine.
Over the year, as interest grew and the number of business questions increased, I have spun off another 50 queries to answer specific questions (with aggregations, groupings, and transformations built on the core model). These 50 queries generate 10 reports for 10 different operational audiences.
The core model is the foundation for the 10 reports. I don't want to repeat that logic in the different reports, and I don't want to do it in a dataflow if I can avoid it, because the logic is very complex. Yet having the core model plus the child models for each operational report has really bloated the model and makes it difficult for me to work with. (Again, there is no lag for the users; this is all about making it easier for me to do my work.)
Is it possible to have one semantic model depend on another? Is it possible to copy or reference queries used in the core model and transform them in a child model, so that I don't need all transformations in the same model? How do I accomplish this? Thank you for your guidance.
Hi @Steph439 ,
Totally understand where you're coming from; we've all been there. One approach that might help: think of your semantic model as the place to host your core logic and complex measures, especially the ones that are hard to maintain across multiple reports. Keep those in a single semantic model.
Then, based on report area (users' requirements), build smaller, lightweight reports that connect back to this central model. You can add any report-specific measures there, which helps distribute the processing load and keeps things more modular.
Now, about your main concern: the model taking over an hour to refresh. It's good that you are already using dataflows! To make things more efficient, you might consider organizing your dataflows like this:
Bronze Layer: Ingest raw data from sources like Excel files or SharePoint lists.
Silver Layer: Handle the core transformation logic — sounds like this is your current dataflow model.
Gold Layer: Create report-specific dataflows that reference the Silver Layer — keeping logic targeted and clean.
Presentation Layer: Build thin semantic models here, just focusing on relationships and final measures.
This kind of layered approach gives you reusability, better performance, and makes everything easier to manage over time.
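To make the Gold Layer concrete, here is a minimal Power Query M sketch of a report-specific Gold dataflow query that references a Silver-layer entity. All names here (Silver_Sales, Region, Month, Amount) are hypothetical; in practice the Silver entity would be a linked entity pointing at your existing Silver dataflow, so the core transformation logic is never repeated:

```
// Gold-layer query: a monthly summary for one operational report.
// Silver_Sales is assumed to be a linked entity referencing the
// Silver-layer dataflow output.
let
    Source = Silver_Sales,
    // Keep only the report-specific aggregation here; the heavy
    // cleansing and shaping already happened in the Silver layer.
    Grouped = Table.Group(
        Source,
        {"Region", "Month"},
        {{"TotalAmount", each List.Sum([Amount]), type number}}
    )
in
    Grouped
```

Each of your 50 report-specific queries could become a small Gold query of this shape, and the thin semantic models would then load only the Gold entities they need.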
Although I was trying to avoid converting the model into a series of dataflows, that is what I have done. I already had the Bronze layer; now I'm introducing the Silver layer. Thanks for pushing me in the right direction.
- Steph
Composite models are an option (having semantic models depend on other semantic models is sometimes called "golden datasets"), but they come with a lot of baggage:
- dependencies you may not want (upstream RLS, for example)
- users need to have access to all participating models
- chaining limited to three deep
- join cardinality limitations (any dimension over 50K rows is problematic at best and catastrophic at worst)
Kudos for trying to eliminate dataflows from the equation; always follow the maxim of Occam's Razor when making architecture design decisions.
You bring up the point of the business questions, and that is very, very important. A data model needs to follow the business question, not the other way round. And yes, that sometimes means the answer is to have separate, not-quite-duplicate data models.
I'm already using composite models and blending data between them. I went with the dataflows to leverage the scheduling, speed, and parallel activity possible on the server. Thanks for the suggestion.