RonRicotta
Frequent Visitor

Dynamic Report Data Source

I manage over 20 clients, each with their own database. All databases share the same structure. I’ve connected each database to a gateway. I’m considering storing this data in data warehouses. I don’t want to create over 20 separate reports because if I make a change to one, I don’t want to have to update all 20+ reports individually. Is it possible to create reports that dynamically change the data source based on the user’s role or group?

1 ACCEPTED SOLUTION
RonRicotta
Frequent Visitor

For those looking to do the same thing: I think I've figured it out.

I'm using multiple Dataflow Gen2s (one for each database). In each dataflow, I add new columns for the primary and foreign keys (<database name>_<key name>). I connect all of the dataflows to one Warehouse and store everything in the same tables. One dataflow's destination is set to 'Replace' so it clears the warehouse tables and loads fresh data; all the other dataflows are set to 'Append' so their data is added on top.
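Not from the post itself, but here is a minimal Power Query M sketch of that key-prefixing step, using a made-up 'ClientA' database and an Orders table with OrderId/CustomerId columns as stand-ins (the real Source would be the gateway query against each client database):

let
    // Stand-in for one client's source table (assumed shape)
    Source = #table(
        type table [OrderId = Int64.Type, CustomerId = Int64.Type, Amount = number],
        {{1, 10, 99.95}, {2, 11, 14.50}}
    ),
    // Prefix the primary key with the database name so rows stay unique
    // after every client's dataflow is appended into the same warehouse table
    AddPK = Table.AddColumn(Source, "ClientA_OrderId",
        each "ClientA_" & Text.From([OrderId]), type text),
    // Do the same for each foreign key so relationships still line up
    AddFK = Table.AddColumn(AddPK, "ClientA_CustomerId",
        each "ClientA_" & Text.From([CustomerId]), type text)
in
    AddFK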

Then I created a pipeline that first runs the 'Replace' dataflow and, once that succeeds, runs the 'Append' dataflows.

I also created a separate semantic model from the warehouse to make use of the roles feature; the warehouse's default semantic model doesn't allow you to define roles. There I created two roles: one called 'Full Access' with no restrictions, and one called 'Limited Access' with the DAX filter [Email] == USERPRINCIPALNAME() on the table where I store my users' email addresses. This ensures each user only sees the data connected to the logged-in account. (Make sure to create the relationships in the new model, otherwise it won't work.)
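For concreteness, a minimal Power Query M stand-in for the users table that the 'Limited Access' role filters on; the column names and rows below are made up:

let
    // Made-up rows; in practice this table is loaded into the warehouse
    // and related to the client data so the role filter propagates
    Users = #table(
        type table [Email = text, Client = text],
        {
            {"a.user@companyA.com", "CompanyA"},
            {"b.user@companyB.com", "CompanyB"}
        }
    )
in
    Users

With the relationships in place, the role filter [Email] == USERPRINCIPALNAME() on this table trims every related table down to the logged-in user's client.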


4 REPLIES

Tutu_in_YYC
Super User

This is possible if you combine all 20+ clients' data into one semantic model and then apply RLS on the semantic model based on the username, e.g. when A@companyA.com logs in, they will only see CompanyA's data.

https://radacad.com/dynamic-row-level-security-with-power-bi-made-simple

RonRicotta
Frequent Visitor

Thank you for your answer. We are using integers as primary keys. Wouldn't this cause collisions if you merged everything into one semantic model?

How do you suggest we handle this? Looking forward to your thoughts.

Tutu_in_YYC
Super User

A surrogate key can be created in Power Query. But the solution you came up with is good too!
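As an illustration of that suggestion (assumed table and column names, not an exact recipe): in Power Query the surrogate key can combine a client tag with the original integer key, much like the prefix columns in the accepted solution, so values can't collide across databases:

let
    // Stand-in for one client's table; OrderId is the integer primary key
    Source = #table(type table [OrderId = Int64.Type], {{1}, {2}}),
    // Tag the rows with their source database...
    AddClient = Table.AddColumn(Source, "Client", each "ClientA", type text),
    // ...then combine the tag with the integer key into a collision-free surrogate key
    AddKey = Table.AddColumn(AddClient, "SurrogateKey",
        each [Client] & "_" & Text.From([OrderId]), type text)
in
    AddKey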
