
lszymk97
Frequent Visitor

Need help: Merging multiple reports' data into one central database in the Power BI service

Recently I joined a company as a data analyst. My task is to merge many published reports with separate databases into one central database. We would then like to connect other reports in the Power BI service to this central database.

Do you have any information on best practices for 1) creating one central data model from separate reports; 2) how to efficiently model the central data model, i.e. multiple fact tables and dimensions; 3) how to connect reports so they pull data from another Power BI report or dataset (semantic model?) in the Power BI service?


5 REPLIES
Anonymous
Not applicable

Hi all,
Firstly, thank you Tutu_in_YYC for your solution!
And @lszymk97, if you want to create a central data model, you can try dataflows in the Power BI service. A dataflow provides a centralized ETL (Extract, Transform, Load) platform that lets you integrate data from different reports into one central data structure. A dataflow can then be used as a data source by multiple Power BI reports in the Power BI service, so multiple reports can share the same central data model by connecting to the same dataflow. This avoids duplicate data extraction and processing and ensures that all reports use the same, uniform data.


Introduction to dataflows and self-service data prep - Power BI | Microsoft Learn

You can refer to the figure below: we first consolidate all the data into a central dataflow and clean it up, then connect from Power BI Desktop to define the relationships between the different tables (e.g., joining a fact table to a dimension table), and finally publish the model as a shared dataset to be used by other reports.

[screenshot: vxingshenmsft_1-1728624965708.png]
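To make the consolidation step concrete, here is a minimal Power Query (M) sketch of a dataflow query that appends a SQL Server table and an Excel sheet into one fact table. The server, database, file path, table, and column names are hypothetical placeholders, and an on-premises data gateway would be needed for local sources; adapt everything to your own environment.

let
    // Hypothetical SQL Server source - replace server and database names with your own
    SalesDb    = Sql.Database("your-sql-server", "SalesDb"),
    SqlSales   = SalesDb{[Schema = "dbo", Item = "FactSales"]}[Data],

    // Hypothetical Excel source with the same column layout (local files need a gateway)
    Workbook   = Excel.Workbook(File.Contents("C:\Data\RegionalSales.xlsx"), true),
    ExcelSales = Workbook{[Item = "Sales", Kind = "Sheet"]}[Data],

    // Append both sources into a single fact table and enforce data types
    Combined   = Table.Combine({SqlSales, ExcelSales}),
    Typed      = Table.TransformColumnTypes(
        Combined,
        {{"OrderDate", type date}, {"Amount", type number}, {"CustomerKey", Int64.Type}}
    )
in
    Typed

Each query you save in the dataflow becomes an entity (table) that the central semantic model can then consume.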

I hope my answer helps you solve the problem. If you have further questions, feel free to contact me at any time and I will reply as soon as I receive your message!

Hope it helps!

Best regards,
Community Support Team_ Tom Shen

If this post helps then please consider Accept it as the solution to help the other members find it more quickly.


Thank you for your reply, I think I will flag it as a solution. But I have a couple of questions:
1) Would a dataflow require Azure Data Lake Gen2 access?
2) Do I understand correctly that data from the dataflow would go into one semantic model, where we could bulk-update our DAX measures for the different reports that pull data from that semantic model?
3) Also, our sources are mainly a MS SQL database and Excel files; do you think a dataflow is the right choice for these, bearing in mind the additional costs of a Gen2 dataflow in Azure?

 

Thanks, 

Lukasz

Anonymous
Not applicable

Hi @lszymk97 ,

1. Dataflows use Power BI's internal storage by default and do not require Azure Data Lake Storage Gen2 access. For Gen1 dataflows you only need Pro or PPU access; you only need to bring your own ADLS Gen2 account for scenarios such as storing very large amounts of data or integrating with other Azure services. For most scenarios, the built-in storage is sufficient.


Configure and consume a dataflow - Power BI | Microsoft Learn


2. Regarding DAX calculations on the dataflow data after it enters the semantic model, your understanding is absolutely correct! The dataflow is responsible for providing the cleaned and processed data to the semantic model. In the semantic model you can centrally create and manage DAX measures, and these measures can be reused by multiple reports. If you update a DAX calculation in the semantic model, all reports that rely on the model automatically pick up the change, which simplifies management.
3. Dataflows are well suited to handling data from SQL Server and Excel files, and because they can process this data in one place they are a good fit for building a centralized data model. If you don't need Azure Data Lake Gen2, a dataflow comes at no additional cost and you can use the storage built into Power BI.
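Once the dataflow is published, reports (or a shared semantic model built in Power BI Desktop) can read its entities through the Power BI dataflows connector. The Power Query (M) sketch below uses hypothetical workspace, dataflow, and entity names; in practice the Get Data navigator in Power BI Desktop generates this navigation for you (using IDs rather than names).

let
    // List the dataflows you can access in the Power BI service
    Source    = PowerBI.Dataflows(null),
    // Hypothetical workspace, dataflow, and entity names - replace with your own
    Workspace = Source{[workspaceName = "Central Data"]}[Data],
    Dataflow  = Workspace{[dataflowName = "Sales Dataflow"]}[Data],
    FactSales = Dataflow{[entity = "FactSales"]}[Data]
in
    FactSales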

Hope it helps!

Best regards,
Community Support Team_ Tom Shen

If this post helps then please consider Accept it as the solution to help the other members find it more quickly.


Tutu_in_YYC
Super User

Are you trying to consolidate databases or semantic models? You have to be really specific and careful with the terms, as the approach is going to be totally different.

Hi, I'm trying to merge different semantic models into one. We have many reports with separate data sources, so I'd like to merge all of the underlying data sources into one central semantic model/dataflow.
