Hey all,
I've been reading about dataflows lately and I'm wondering whether they could really benefit our BI structure. First I'll briefly explain what the current structure looks like, and then how I "hope" dataflows can help.
Current structure
We have built a program that is used by more than 20 customers (and growing). I created a report based on the data from our program to give customers insight into how effectively they are using it. I have one PBIX file (a template, essentially) with all the transformations and parameters needed to load the data and build the report. Every time I release an update, or need to launch the report for a new customer, I open that template, fill in the required parameters (SQL instance, CustomerID, CustomerName, etc.), refresh the data, and publish the report to the existing or new customer. When I release an update to the existing dashboard, I have to repeat this 20 times. It can take a very long time, because some customers have a lot of data. Another problem is that the PBIX file can run into performance issues when loading large amounts of data for certain customers.
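To give an idea, each query in my template follows roughly this pattern (a simplified sketch; SqlInstance, Database, CustomerIdParam, and the dbo.UsageLog table are placeholder names, not my real ones):

```
let
    // SqlInstance, Database and CustomerIdParam are text parameters defined
    // under Manage Parameters in the template PBIX (placeholder names here)
    Source = Sql.Database(SqlInstance, Database),
    UsageLog = Source{[Schema = "dbo", Item = "UsageLog"]}[Data],
    // keep only the rows for the customer this copy of the report is for
    CustomerRows = Table.SelectRows(UsageLog, each [CustomerId] = CustomerIdParam)
in
    CustomerRows
```

Switching customers is then just a matter of editing those parameter values before refreshing and publishing.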
Now, I thought I was handling this pretty well (your opinion is more than welcome, by the way). But with dataflows, I get the feeling I could do the ETL inside the dataflow and connect to it from my template file, so all that heavy processing no longer happens inside the PBIX itself. That way I can focus on building the model, measures, and report in the PBIX file.
I have created a dataflow and edited the existing queries in the template file to connect to it. But now I'm wondering: how can I change parameters to switch between customers when publishing the report with their data?
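For reference, after rewiring, my queries use the navigation pattern the dataflows connector generates, roughly like this (a sketch; the GUIDs and the Usage entity name are placeholders):

```
let
    // standard navigation produced by the Power BI dataflows connector;
    // the GUIDs and the "Usage" entity name are placeholders
    Source = PowerPlatform.Dataflows(null),
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    Workspace = Workspaces{[workspaceId = "<workspace guid>"]}[Data],
    Dataflow = Workspace{[dataflowId = "<dataflow guid>"]}[Data],
    Usage = Dataflow{[entity = "Usage", version = ""]}[Data],
    // as far as I can tell, the customer parameter can now only be applied
    // as a filter on the dataflow's output, since the dataflow loads all customers
    CustomerRows = Table.SelectRows(Usage, each [CustomerId] = CustomerIdParam)
in
    CustomerRows
```

Is filtering the dataflow's output by a parameter like this the right way to switch customers, or is there a better pattern?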
Or are dataflows actually meant more for centralizing lookup tables, like a date calendar, or a customer list that is shared across the organization?
Thanks for your help. Any thoughts/recommendations on my current Power BI structure are also more than welcome.
Hi @DeBIe,
If all of the data is the same, only broken down by customer, then my thinking is that you load all of the data and do as much of the ETL as you can in the dataflow. Dataflows can be quite powerful and I use them all the time (here is an article with some "when to use" guidance): Introduction to dataflows and self-service data prep - Power BI | Microsoft Learn.
Once your dataflow is as close as you can get it to the final result, I would create a Desktop report that uses the dataflow, and build any remaining ETL steps the dataflow couldn't handle in that file. This keeps the "heavy lift" in the dataflow.
Finally, I would use RLS (row-level security) to differentiate between your customers as much as possible, so you don't need multiple reports. Once RLS is set up, you should only need a very small number of reports (hopefully just one) to update and reconfigure, which you can then share with customers based on your RLS roles. A sketch of such a rule is below.
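For example, a dynamic RLS rule on the customer table might look like this (a sketch; CustomerUsers, a table mapping login emails to customer IDs, is a hypothetical helper table you would add to the model):

```
-- DAX table filter expression on the Customers table, for a role
-- such as "CustomerViewer"; CustomerUsers (Email -> CustomerId) is
-- a hypothetical mapping table
[CustomerId]
    = LOOKUPVALUE (
        CustomerUsers[CustomerId],
        CustomerUsers[Email], USERPRINCIPALNAME ()
    )
```

Note that this assumes each customer's users sign in with accounts (members or guests) that your tenant can identify via USERPRINCIPALNAME().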
Hey @collinq,
Thank you very much for your feedback.
I think I understand the architecture you are describing. I'm going to test this with two customers and see if it works. I will let you know.
Thoughts/feedback, anyone? 😄