Don't miss out! 2025 Microsoft Fabric Community Conference, March 31 - April 2, Las Vegas, Nevada. Use code MSCUST for a $150 discount. Prices go up February 11th. Register now.

arunbyc
Helper II

Basic question about properly planning reports and Data Refresh - how it is done in the industry

I started building reports while learning Power BI and DAX, so publishing and the Power BI service were not areas I focused on initially. My reports basically involve two database tables that serve as fact tables and about six or seven database tables that serve as dimension tables. So far I have designed about seven reports. Each report uses a few of the columns from fact table 1 and fact table 2 and some of the dimension tables, depending on the report.

 

Because each report contains different columns from these tables (some may be common among all of them, but not all), I wrote a separate SQL view to set up the fact table content of each report, then connected it to the dimension tables in the data model. I saved each report as a different .pbix. So far I have done about eight reports, with more to go.

 

I am now learning about the service and publishing these. My question is this: do I now have to refresh each report separately at each scheduled time? Would you have done anything differently from what I have done, to minimize the refreshes needed? I am sure each refresh puts load on the database. I would like to follow industry practice and plan the additional reports properly (redoing the existing ones if necessary) so that I am not doing twenty refreshes at each scheduled time against the same tables, just because I wrote twenty different views for the twenty reports I designed.

 

I don't know if my question makes sense, but I have a gut feeling that I could have done this differently if I had gotten your input much earlier. Can you please let me know how you set up your reports and refreshes to minimize load on the database?

Thanks

AR

1 ACCEPTED SOLUTION

10 REPLIES
SaiTejaTalasila
Super User

Hi @arunbyc ,

 

You can bring the data into a dataflow and import the data from the dataflow into your reports.

You can use Power Automate flows to trigger refreshes and to receive success/failure status notifications.
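If you later want to script refreshes instead of (or alongside) Power Automate, the Power BI REST API exposes a refreshes endpoint per dataset. A minimal Python sketch, assuming you already have an Azure AD access token; the workspace and dataset IDs shown are placeholders you would supply:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(workspace_id: str, dataset_id: str) -> str:
    """Build the refresh endpoint for a dataset in a workspace."""
    return f"{API_ROOT}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(token: str, workspace_id: str, dataset_id: str) -> int:
    """POST an on-demand refresh; the service answers 202 Accepted."""
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode()
    req = urllib.request.Request(
        refresh_url(workspace_id, dataset_id),
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The `notifyOption` field asks the service to email the dataset owner on failure; you can poll the same endpoint with a GET to read the latest refresh status.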

 

I hope it will be helpful.

 

Thanks,

Sai Teja 

v-hashadapu
Community Support

Hi @arunbyc , thank you for reaching out to the Microsoft Fabric Community Forum.

 
@3CloudThomas and @Martin_D have explained it correctly.

  1. Creating one comprehensive semantic model (or shared dataset) that includes all the necessary tables and columns for your reports is a smart move. This reduces the number of refreshes and the load on your database. Each report can then use the subsets of the data as needed. This also enhances consistency and manageability and leads to better performance and faster query responses, especially if the queries hit the cached data. A single model uses less RAM on the backend server, avoiding potential performance bottlenecks caused by swapping models in and out of memory.
  2. With a Power BI Pro license, you don't need any extra subscriptions or setups to use the live connection feature. It's included in your Pro license. This means you can connect to your semantic model (or shared dataset) in the Power BI service without having to import data every time. This approach is particularly beneficial as it ensures all your reports are always in sync with the latest data and you only need to refresh the data once in your semantic model.

 

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.

Can you please point me to any videos or courses that explain this in detail? It gets difficult to ask every question that arises, especially since my background is not technical.

Hi @arunbyc , thank you for reaching out to the Microsoft Fabric Community Forum.

There are several courses on YouTube one can learn from. Below are links to the videos and Microsoft Learn documentation that I recommend.
You can find Microsoft documentation on almost everything Power BI here:
 1. https://learn.microsoft.com/en-us/power-bi/?wt.mc_id=powerbi_inproduct_settings
 2. https://learn.microsoft.com/en-us/training/powerplatform/power-bi

For Individual Courses try these:
1. https://community.fabric.microsoft.com/t5/Webinars-and-Video-Gallery/Learn-Power-BI-Tutorial-for-Beg...
2. https://www.youtube.com/watch?v=cyWVzAQF9YU
3. https://www.youtube.com/watch?v=xDgyDXyu6FQ
4. https://www.youtube.com/watch?v=ENZm_7jWU4Q
5. https://www.youtube.com/watch?v=MII8NIF98RM
6. https://learn.microsoft.com/en-us/power-bi/connect-data/refresh-data
7. https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-onprem
8. https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-manage
9. https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-install
10. https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-about

If this helps, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details.
Thank you.

 

3CloudThomas
Super User

Here is a good link to review for your issue. It talks about publishing a semantic model (once) and connecting reports to that single source of truth to feed data into all reports. That way, you only have to refresh the data in one model rather than in each published report.

Connect to semantic models in Power BI - Power BI | Microsoft Learn

But don't most people simply use the import method? How do they all make the refresh less burdensome? Is making one general data model, with all my reports using only subsets of its tables and columns, an option? Or does such an approach, though it seems conceptually feasible, cause any practical problems? Or is what I am thinking not feasible at all?

Hi @arunbyc ,


@3CloudThomas proposed the correct approach. Having one semantic model that you use for multiple reports is still an import mode model, if you choose so. This exactly solves your concern about needing many refreshes for many semantic models, each sending many queries to your database. With this approach you would only have nine queries per refresh (two fact tables and seven dimension tables) into one semantic model.
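To put rough numbers on the saving, here is a back-of-the-envelope sketch using the counts from this thread (twenty planned reports, nine source tables), and assuming for simplicity that each imported report would otherwise pull all nine tables:

```python
reports = 20   # separate .pbix files, each with its own import, as described above
tables = 9     # 2 fact tables + 7 dimension tables

# Separate import models: every report refreshes its own copy of the data.
queries_separate_models = reports * tables   # queries per refresh cycle

# One shared semantic model: refreshed once, thin reports connect live to it.
queries_shared_model = tables                # queries per refresh cycle

print(queries_separate_models, queries_shared_model)  # 180 vs. 9
```

The exact numbers depend on how many tables each report really imports, but the shape of the saving is the same: database load stops scaling with the number of reports.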

 

This approach has additional technical advantages that are not so obvious to the report author:

 

  1. Having one data model for all reports means that the backend keeps only one cache for everyone, which makes it more likely that a query hits the cache and gives you a faster user experience.
  2. Multiple semantic models each consume additional RAM on the backend server, which, depending on their size, makes it more likely that Power BI starts swapping semantic models in and out of memory, giving you a slower user experience.

So, my recommendation is also: load your nine tables, with all the columns you need in any report, into one semantic model and use it for all your reports. If you have security concerns, in that not everyone who gets access to some fields should get access to all fields, you can also set up object-level security to control which users can access which columns, but that's an advanced feature.

 

Kind regards,

Martin

Doesn't having one semantic model and then, say, eight or nine pages of reports from that one model in one .pbix accomplish this? A live connection seems a little advanced for me as a beginner, not knowing what that entails. If I do as above without a live connection, does it cause any problems?

If we have a Pro license, does this live connection functionality come with it, or do we have to subscribe to or set up anything extra to get this running?

Live connection, in my experience, is quickly understood by beginners and does not come with a lot of side effects. A report in a separate file is no different from a report in the same file as the semantic model. You need to be aware, if you have multiple reports connected to the semantic model, that changes in the semantic model, like deleting a measure, affect all of these reports. But don't worry: once you are hooked on this approach, there is a tool to track these dependencies, Measure Killer: Measure Killer | Brunner BI

 

Live connection is included in the Pro license.

 

When sharing reports connected to one shared semantic model, you need to make sure that you grant each report user both access to the report and access to the semantic model. If all users have a workspace role, this does not require additional effort.

 

Of course, alternatively you can create one report with multiple pages, all in the same file as the one semantic model. This will make it less convenient to target different audiences, but it also simplifies other things, like drillthrough. Your decision depends on the user experience you want to achieve.
