MiquelPBI
Helper I

Executive Semantic Model

Thank you for taking the time to read the situation, background, and problem we're trying to solve. I'm eager to hear suggestions before we create tech debt and persist data in a database.

 

Situation

Our organization wants to create an executive reporting solution that collects data from other Power BI models.  The semantic models are published across different workspaces because they serve different operational domains.  The semantic model for the executive reporting solution will be published to an executive workspace. 

All the workspaces are in the same capacity.  We also use deployment pipelines.  Our workflow is to publish PBIX files to the development workspace which is plugged into a development stage in a pipeline. We use the pipeline to move artifacts from development to test and from test to production. 

 

Background

Here are the things we’ve tried.

First…

We created a live connection to each model.

The problem is that the fact tables and measures from each model don't relate to each other in a clean star-schema way. Some of the models also have RLS or OLS, which overcomplicates the security that needs to be set up.

Second…

We created a connection in Power Query using the Analysis Services connector, which runs a DAX query to import the data required for the fact table in the executive semantic model.

The problem is the service wants us to create a gateway connection for each workspace we’re connecting to.  That turns into a lot of gateways. 
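For reference, the query from this second attempt looks roughly like the sketch below (the workspace, model, and measure names are placeholders, not our real ones):

// Power Query (M): import from an operational model over its XMLA endpoint using a DAX query
let
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/Sales Operations",   // source workspace XMLA endpoint (placeholder)
        "Sales Model",                                              // source semantic model (placeholder)
        [Query = "EVALUATE SUMMARIZECOLUMNS ( 'Date'[Month], ""Total Sales"", [Total Sales] )"]
    )
in
    Source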

Third…

Same connection approach as the second. In addition, I created a parameter with a list of options (Development, Test, Production) and a function that builds the workspace XMLA endpoint based on the selected option.

The problem is the service says the model isn't refreshable because it's a parameterized data connection; the service doesn't know what the authentication method is until runtime, so it cannot configure a refresh schedule.
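For reference, the parameter and function from this third attempt look roughly like this (the workspace names and query names are placeholders):

// Power Query (M): parameter query "Environment" (advanced editor contents)
"Development" meta [
    IsParameterQuery = true,
    Type = "Text",
    List = {"Development", "Test", "Production"}
]

// Power Query (M): function query "GetXmlaEndpoint" (advanced editor contents)
(environment as text) as text =>
    let
        WorkspaceName =
            if environment = "Production" then "Sales Operations"
            else if environment = "Test" then "Sales Operations [Test]"
            else "Sales Operations [Dev]",
        Endpoint = "powerbi://api.powerbi.com/v1.0/myorg/" & WorkspaceName
    in
        Endpoint

// The source step in the fact table query then becomes:
// Source = AnalysisServices.Database(GetXmlaEndpoint(Environment), "Sales Model", [Query = "..."])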

 

Problem

We are seeking an approach that lets us get data from operational semantic models across different workspaces into a single executive-level semantic model.

We are seeking a way to update the workspaces referenced in the executive semantic model as the model is promoted through a deployment pipeline.

We are seeking an efficient approach, so we are not wasting capacity resources.

We are seeking an approach that can scale as more “executive-level” reporting needs are requested.

1 ACCEPTED SOLUTION
parry2k
Super User

@MiquelPBI Can you use option 2 (DAX queries) with semantic link labs in a notebook to write the data to a lakehouse, and then create an exec semantic model from the lakehouse tables? You can schedule the execution of the notebook and the refresh of the semantic model using a pipeline.

 

 




9 REPLIES
parry2k
Super User

@MiquelPBI Sounds good. Keep us posted on how it goes. Good luck!




v-veshwara-msft
Community Support

Hi @MiquelPBI ,

Thanks for reaching out to the Microsoft Fabric Community.
Just checking in to see if the inputs shared address your query. If further assistance is needed, please reach out.
Thank you.

Praful_Potphode
Solution Sage

Hi @MiquelPBI ,

You can go with composite models in Power BI Desktop; the only challenge there is OLS. When we combine multiple semantic models into one composite model, RLS is enforced from the base model. However, OLS is only enforced when a user tries to drag and drop from a table, so the tables themselves are visible to end users.

https://tabulareditor.com/blog/composite-models-in-power-bi-and-fabric 

 

To overcome this, you can create two types of semantic models:

1. Executive Dashboard Semantic Model (reload all the data into a lightweight semantic model that contains only the columns used in report creation)

2. Self-Service BI Models (a separate semantic model for business users, containing all the columns the business uses for analysis)

This strategy has worked out well in certain cases.

 

Please give Kudos or mark it as solution once confirmed.

 

Thanks and Regards,

Praful


parry2k
Super User

@MiquelPBI That would have made life super easy. How large is the DAX query result going to be? Maybe run the DAX queries using Power Automate, store the output somewhere as files, and then use that for the semantic model.




Thank you for the suggestion; I've made a note of it. I'm curious whether there are options to accomplish our second or third attempt without putting the data somewhere outside Power BI before bringing it back into Power BI.

You can use a DirectQuery connection to the semantic models, but RLS and OLS will be inherited, so if users don't have the right permissions or don't have access to the semantic models, they won't be able to see the contents of the reports. So I think, as @parry2k has suggested, storing the result of a DAX query outside Power BI is your only option.

Aside from running a DAX query with Power Automate, you can also export the data from a visual in a report that isn't on a live connection, select the option to connect live to the semantic model, and store the Excel file in a private SharePoint folder. You can automate the refresh by creating a scheduled Power Automate flow that adds a row to a table in the same Excel file as the live connection export. You can have as many live connection exports in the same file as you like, so you only need one flow to refresh the queries. The caveat is that the flow won't fire if multiple users have the file open at the same time.


 





Dane Belarmino | Microsoft MVP | Proud to be a Super User!

parry2k
Super User

@MiquelPBI Can you use option 2 (DAX queries) with semantic link labs in a notebook to write the data to a lakehouse, and then create an exec semantic model from the lakehouse tables? You can schedule the execution of the notebook and the refresh of the semantic model using a pipeline.

 

 




We will design an option similar to this suggestion. 

Thank you for the reply, @parry2k. We are not on a Fabric SKU. We have plans to migrate, but other Fabric items are not an option at this time.
