This might seem like a basic question, but I'm curious about when to use Power BI dataflows, datamarts, or just the semantic model by itself. For the last eight years I have been developing Power BI dashboards without issue, building everything I need in Power BI Desktop and publishing it to our Power Platform. Inside my semantic model (or "tabular model") I have had the creative freedom to build out all the tables my model needs; within each table in Power BI Desktop I can use Power Query to connect to and transform data, and then model it with relationships in my dataset. The semantic model I have been creating for years already has dataflow-style Power Query capabilities and table creation built in. I never needed a separate dataflow outside of my semantic model (in the cloud) to prep the data, since all of my prep work was done inside the semantic model, behind the scenes of each table in the .pbix file.

Since upgrading to Premium capacity I can now see dataflow and datamart options, but I'm curious why I would need either of them, given that Power Query is already baked into the semantic model/dataset and I am already querying disparate data sources into the tables of my tabular model. It seems like duplicative work to build a dataflow outside of my semantic model only to feed a datamart that then pushes to a semantic model, when I already have ETL and table-creation capability within the semantic model itself.
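For context, this is the kind of prep I mean, living behind a single table in the .pbix; just a minimal sketch, with the file path and column names made up for illustration:

```
// A typical table inside the semantic model: connect to a flat file,
// promote headers, and set types -- all done inside the .pbix itself.
// (File path and column names are illustrative only.)
let
    Source = Csv.Document(
        File.Contents("C:\Data\Sales.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None]
    ),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    Typed = Table.TransformColumnTypes(
        Promoted,
        {{"OrderDate", type date}, {"Amount", type number}}
    )
in
    Typed
```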
Please help explain the differences and why I would need either a dataflow or a datamart when, for years, these tools have effectively been part of the semantic model. I also used to develop tabular models in Visual Studio and deploy them to Analysis Services; even there I could use Power Query and never needed a separate ETL solution, because I could prep the data inside the semantic model (the .bim file) and use Tabular Editor to connect and update the metadata. In addition, now that my semantic model is hosted in a Premium workspace, other users can take my workspace URL and connect to the semantic model over an Analysis Services connection to pull in the tables they need for other reporting.
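For reference, that Analysis Services connection looks roughly like this from Power Query; the workspace and model names below are placeholders:

```
// Connect to a semantic model hosted in a Premium workspace over the
// XMLA endpoint, the same way colleagues reuse my tables today.
// (Workspace and model names are placeholders.)
let
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/Finance Workspace",
        "Finance Model"
    )
in
    Source
```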
Just curious where a dataflow by itself, or a datamart by itself, makes sense, since both capabilities already seem to exist within semantic model development in Power BI Desktop. Power Query is already accessible from Power BI Desktop, so why would I create a dataflow outside of my tabular model when that feature is already enabled in the application?
Given that I am just building data models and dashboards in Power BI Desktop, does it ever make sense to use a datamart or a cloud dataflow when the sources I currently connect to are a mix of on-premises systems, cloud-based systems, and flat files?
Does it make a difference, and is it best practice (or faster) to do everything I described a different way? Why use datamarts at all if I can prep and store everything in a semantic model, or even prep and transform everything in a dataflow and have that dataflow feed the semantic model directly, without ever needing a datamart as the middleman? Any thoughts, recommendations, or best-practice tips would be greatly appreciated.
Thanks!
T
Hey @OptalyticBI ,
you will not miss out on "best practices" by sticking to your working method. You extract, transform, and load data, then you create your model, and finally you visualize the data and share the report with your colleagues. In our organization there are many teams developing Power BI solutions targeting large audiences and many teams creating apps for their departments or sections; we have ~2.4k Pro licenses installed.
Regarding dataflows
We recommend dataflows to everyone who asks (not everyone does, and no one has to). The reasoning behind this is the different compute architecture: dataflows perform much faster than the equivalent Power Query queries running inside a dataset refresh. Every compute second saved saves us money, because we do not have to upgrade our capacities, at least not today.
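To make this concrete, here is roughly what the dataset-side query looks like once the transformation logic lives in a dataflow. This is only a sketch; the GUIDs and entity name are placeholders that the connector's navigator normally fills in for you:

```
// Dataset-side query when the heavy transformation work lives in a
// dataflow: the dataset simply reads the already-prepared entity.
// (The GUIDs and the entity name are placeholders the navigator
// fills in when you pick the dataflow.)
let
    Source = PowerBI.Dataflows(null),
    Workspace = Source{[workspaceId = "00000000-0000-0000-0000-000000000001"]}[Data],
    Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000002"]}[Data],
    Entity = Dataflow{[entity = "Customers", version = ""]}[Data]
in
    Entity
```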
Regarding datamarts
Currently we do not use datamarts, and most probably we never will. With the advent of Microsoft Fabric there are other ways to share "raw" data (not the semantic model, and also not the data visualizations) with other teams or with external guest users. A datamart, or a SQL endpoint as we now call it in Microsoft Fabric, makes data more accessible; it can be shared with a larger audience, because the lingua franca of data is SQL, not Python, and "unfortunately" not DAX.
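For illustration, a SQL-first colleague could also read the same tables back through the endpoint from plain Power Query. A sketch only; the server string, database, schema, and table names are placeholders (the real connection string is copied from the datamart's settings):

```
// Reading a datamart through its SQL endpoint from Power Query.
// (Server, database, schema, and table names are placeholders;
// copy the real connection string from the datamart's settings.)
let
    Source = Sql.Database(
        "abc123.datamart.pbidedicated.windows.net",
        "SalesDatamart"
    ),
    Customers = Source{[Schema = "model", Item = "Customers"]}[Data]
in
    Customers
```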
From my perspective, you do not need to change your working method, but you should keep an eye on where Microsoft Fabric is heading.
Hopefully this helps you make up your mind.
Regards,
Tom
Hi @OptalyticBI ,
Another benefit of dataflows is that they use Power BI service credentials to connect to your data sources, which can be critical in some configurations. For example, user credentials might work from Power BI Desktop but not from the Power BI service, in which case the dataset refresh won't happen.
Much appreciated. I figured that was the case, because I could not work out what the value of a datamart is if I can connect to everything in the Power Platform with dataflows. My semantic model can connect directly to a dataflow in Power BI, and an Excel file can connect to dataflows too. So a datamart serves no new purpose for me, other than when I need to share information with developers who prefer SQL Server Management Studio and need a SQL connection to query the data. The main value I see in dataflows is being able to schedule different data loads at different times of day, independently of each other; otherwise it makes the most sense to build and connect everything with the semantic model alone if I am pulling everything together at once. Thanks!
Hi @OptalyticBI
Based on your description it sounds like you are the only developer in your organization building the semantic models (the source of truth) that other devs reuse. If so, keep up the good work. Dataflows wouldn't have been a good option before you upgraded anyway, because refresh is not possible for 10+ dataflows across workspaces with a Pro license (shared workspaces). Now that you have moved to Premium capacity, though, you might reconsider the other alternatives and scenarios.
In addition to what you mentioned, dataflows are stored in ADLS Gen2, which is accessible to the whole Power Platform family if you want to build apps, flows, and AI capabilities on top of them. There is a sketch of this after the link below.
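As a minimal sketch, assuming you have attached your own ADLS Gen2 storage account to dataflows (bring-your-own-storage), the underlying CDM folders can be browsed directly from the lake; the account URL and container name below are placeholders:

```
// Browsing a dataflow's CDM folder when dataflow storage is attached
// to your own ADLS Gen2 account. (The endpoint URL and container name
// are placeholders; dataflows write model.json plus CSV snapshots
// into the lake.)
let
    Source = AzureStorage.DataLake(
        "https://mystorageaccount.dfs.core.windows.net/powerbi"
    )
in
    Source
```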
More details here:
https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-introduction-self-ser...
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook