I'm a business analyst at a multi-national company. My role sits within a dedicated country and division. Even though we have central teams supporting all countries and departments, we mostly work in "silos" and there is not much communication between peers.
My team supports multiple departments within our "silo". We have been using Power BI for 3-4 years now, and it has all been self-taught, which means the whole architecture and setup is a big patchwork. Most of the data is currently served by our central BI team through a SQL database. We do not control write privileges for the views, so we have channeled most of the data through Power BI dataflows, where we join, append and clean up the data before using it further. This also helps prevent overloading the SQL server (which is used by all departments and all countries, especially in the mornings).
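For illustration, a typical dataflow query of ours looks roughly like the Power Query (M) sketch below. The server, database, view and column names are made up; the real ones differ.

```
let
    // Connect to the central SQL database (placeholder server/database names)
    Source    = Sql.Database("central-sql-server", "SalesDb"),
    Orders    = Source{[Schema = "dbo", Item = "vw_Orders"]}[Data],
    Customers = Source{[Schema = "dbo", Item = "vw_Customers"]}[Data],

    // Join customer attributes onto the orders view
    Joined   = Table.NestedJoin(
        Orders, {"CustomerId"},
        Customers, {"CustomerId"},
        "Customer", JoinKind.LeftOuter),
    Expanded = Table.ExpandTableColumn(Joined, "Customer", {"CustomerName", "Segment"}),

    // Basic clean-up before the result is reused by the datasets
    Cleaned = Table.SelectRows(Expanded, each [OrderDate] <> null)
in
    Cleaned
```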
So I guess we have 10-20 flows with multiple "tables" that we have tried to organize by data type and data source.
These flows mainly feed 4 huge datasets: 3 are department-specific and the fourth is an "umbrella" that tries to capture the most important data across all departments. This means a lot of the same data is stored in the different datasets (bad practice). On top of this, we have multiple reports connecting to the different datasets, partly to show different data, but in many cases showing the same data in different reports just to control access. Most of the measures are also written directly in the reports, so changing a KPI definition has to be done in multiple places. To summarize: it's a spaghetti mess where a "single source of truth" is nowhere to be seen.
The company has now decided to move away from the SQL server we currently have and move most of the data to Snowflake. This will improve capacity immensely. Our local SQL person plans to rewrite most of the views to optimize them. This means we have to change our setup anyway, and I'm thinking of doing a complete makeover of the current setup. And here is my question: how should we structure the solution?
With Snowflake in mind, is it advisable to use dataflows to "pre-load" data into Power BI before creating the dataset? What about a datamart, or is it better to create more traditional datasets? Should we have one enormous dataset, or should we create reports that connect to multiple datasets using DirectQuery? How can we build a solution that fits our needs but also supports sharing information with other departments (and sharing only what they need)?
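To make the question concrete: I assume a future dataset (import or DirectQuery) would connect to Snowflake with something like the sketch below. The account, warehouse, database, schema and table names are placeholders.

```
let
    // Native Snowflake connector; placeholder account URL and warehouse
    Source   = Snowflake.Databases("myaccount.snowflakecomputing.com", "ANALYTICS_WH"),
    Database = Source{[Name = "SALES_DB", Kind = "Database"]}[Data],
    Schema   = Database{[Name = "REPORTING", Kind = "Schema"]}[Data],
    Orders   = Schema{[Name = "ORDERS", Kind = "Table"]}[Data]
in
    Orders
```

Whether queries like this should live in a dataflow, a datamart, or directly in one or more shared datasets is exactly what I'm unsure about.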
Thanks in advance!
Datamart would have been an answer of sorts a while back, but now it is all but abandoned in favor of the new Fabric offerings. Recommend you check those out.