Hi!
I don't have an actual "solve this for me" type of question. More just looking for info. We expect to have a number of different semantic models such as "sales", "finance", "inventory", etc. which recycle a number of the same facts and dims into different models. Instead of trying to maintain individual semantic models for each, we're looking at developing a "master model" that will contain every fact and dimension and then deploy only those facts and dims that should be included in a given sub-model.
I've done a bit of Googling, and the best I've found are these two items:
If anyone is aware of other resources that could provide some more insight into the what and how of Master Models, I would certainly appreciate the assistance. Thank you!
Also known as "Golden dataset". Mostly an exercise in futility, especially in larger organizations. cf "Expanded Tables"
I agree. Quite often, while it seems great to create a golden dataset, in reality it doesn't happen as expected due to the complexities of different business areas. I would recommend tackling one area at a time; once you've got a handle on all the different areas, then see whether it is possible to build a golden dataset.
I'd think this would be very difficult if you were pulling data from multiple source systems with different grains, keys, etc. But all the data feeding the master model would be sourced from the enterprise data warehouse (EDW), so the complexities of getting data to work well together should already have been solved during ingestion into the EDW (maybe?).
The master model (along with the subs) would act almost like "wrappers", so users could access the data from the EDW with measures and relationships already defined for them. In practice it would be very similar to MDX cubes.
Look at it from both sides:
Microsoft vastly underestimates the complexity of any large enterprise and its insane number of silos.
Large enterprises are unwilling or unable to enforce best practices around data stewardship and data warehouse design, or to reduce that insane number of silos.
But if data stewardship and sound EDW design were in place...? Like the semantic models were almost the equivalent of an MDX cube pulling data from EDW it might work?
Ahahaha, good one... Yes, that would be great.
I see what you did there!!! 😅
Developing a master model that encompasses all shared facts and dimensions, and then selectively deploying subsets of these into specific sub-models is a smart approach.
Here are some links for reference:
Universal and Timeless Database Design Patterns for 2024 and Beyond | Vertabelo Database Modeler
Tabular Optimization (elegantbi.com)
Design and build tabular models - Training | Microsoft Learn
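The selective-deployment idea can be illustrated in miniature. In practice this is usually done with perspectives or annotations in Tabular Editor and scripted against the Tabular Object Model, but the core logic is just tagging each table in the master model with the sub-models it belongs to and filtering on that tag at deploy time. The sketch below uses invented table and sub-model names purely for illustration; it is not a real deployment API.

```python
# Hypothetical sketch of a "master model" table registry.
# Each table (fact or dim) is tagged with the sub-models that
# should include it; shared dims carry multiple tags.
MASTER_MODEL = {
    "FactSales":     {"Sales"},
    "FactGL":        {"Finance"},
    "FactInventory": {"Inventory"},
    "DimDate":       {"Sales", "Finance", "Inventory"},  # conformed dim
    "DimProduct":    {"Sales", "Inventory"},
    "DimAccount":    {"Finance"},
}

def tables_for(sub_model: str) -> list[str]:
    """Return the facts and dims to deploy for one sub-model."""
    return sorted(
        table
        for table, sub_models in MASTER_MODEL.items()
        if sub_model in sub_models
    )

print(tables_for("Sales"))  # e.g. the Sales sub-model gets its fact plus shared dims
```

In a real implementation the tags would live as perspectives or annotations on the tabular model itself, and a deployment script would strip out untagged objects before publishing each sub-model.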
Best Regards
Zhengdong Xu