Don't miss out! 2025 Microsoft Fabric Community Conference, March 31 - April 2, Las Vegas, Nevada. Use code MSCUST for a $150 discount. Prices go up February 11th. Register now.

Reply
dbadmin
Helper IV

Power BI's In-memory store - for clarification purposes... how does it work exactly?

I am part of our company's R&D team, and also the report author for a different part of the company.

 

I have been researching how to create a reporting database instead of trying to run reports through our production database. Then I kind of had this "aha" moment. Power BI (from what I understand) has an in-memory store (hence why the data has to be refreshed). I also just figured out that a rather difficult task is extremely simple in Power BI. What takes me quite a bit of manipulation on the database side takes only a few steps in Power BI. It's much more powerful than I first anticipated.

 

My thought is this: instead of creating a second reporting database > connecting that to the production database > setting up a data refresh (to make sure we have current data) > THEN connecting Power BI to the reporting database... I'm thinking I can skip the middleman.

 

In order to explain this to my supervisors correctly... Do I have this concept right? How does the in-memory store work exactly, and is it as efficient and effective as I think it will be? What's holding the data in the store on the Power BI side (I'm just curious about this one)?

Any help, opinion, references, stories of experience are welcome. Thanks in advance! 🙂 

3 REPLIES
synergised
Resolver II

This works as long as a nightly snapshot of the reporting data is all you need. If you need more current data, you would still have to take the hit against the production database whenever you refresh the report during the day.

 

At my previous job, we ran the majority of the web site off of reporting servers.  The four reporting servers were using log shipping from production and had a lag of 1 to 5 minutes. 

 

Simplified for your question:

1.  Create your Power BI report connected to your production database.

2.  Upload Power BI report to the Portal

3.  Use the Enterprise gateway to refresh the reports (datasets) nightly (with the current version of the gateway, it's currently a big guess whether it will refresh on time, or at all)

4.  Use the Power BI portal during the day.
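Beyond the gateway's nightly schedule, the Power BI service also exposes a REST endpoint for triggering a dataset refresh on demand. The sketch below is a minimal illustration, not a complete solution: the dataset ID and Azure AD access token are placeholders, and you should verify the endpoint against the current Power BI REST API documentation before relying on it.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(dataset_id: str, token: str) -> urllib.request.Request:
    """Build the POST request asking the Power BI service to refresh a dataset."""
    url = f"{API_BASE}/datasets/{dataset_id}/refreshes"
    return urllib.request.Request(
        url,
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    # Placeholders: substitute a real dataset ID and a real Azure AD token.
    req = build_refresh_request("00000000-0000-0000-0000-000000000000", "<token>")
    print(req.full_url)
    # urllib.request.urlopen(req)  # uncomment to actually trigger the refresh
```

Note that on-demand refreshes are still subject to the same per-day refresh limits as scheduled ones, so this complements rather than replaces a gateway schedule.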

 

Power BI accounts have a 10 GB limit.

Power BI reports currently have a 250 MB size limit.

TealCanady
Advocate II

@dbadmin

So you do have the concept down, but there are a variety of factors that might lead you to build a data warehouse / tabular model instead of doing it directly in Power BI Desktop. Data size is going to be the big one (how much RAM is being consumed); also, letting a file out into the wild at your organization with 50 people hitting refresh against production could be a nightmare. You can mitigate some of these issues by making OneDrive your go-to and refreshing hourly to the online portal, but what I would personally be concerned with is permissions and file size versus your client computers. In addition, many users prefer Excel to Power BI Desktop (and while these features are also available in Excel, that can be difficult if your user base runs many versions of Excel).

The in-memory engine is called VertiPaq and was originally acquired by Microsoft in 2010 (I think). It has been available in Excel since that time. Follow the link above for some more information on the engine.
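To give a feel for why a columnar in-memory engine like VertiPaq can fit a lot of data in RAM, here is a toy illustration (emphatically not VertiPaq's actual implementation) of two techniques such engines are known to use: dictionary encoding, which replaces repeated values with small integer IDs, and run-length encoding, which collapses runs of identical IDs.

```python
from itertools import groupby

def dictionary_encode(column):
    """Replace each value with a small integer ID into a shared dictionary."""
    dictionary = {}
    ids = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        ids.append(dictionary[value])
    return dictionary, ids

def run_length_encode(ids):
    """Collapse runs of repeated IDs into (id, run_length) pairs."""
    return [(key, sum(1 for _ in group)) for key, group in groupby(ids)]

if __name__ == "__main__":
    # A low-cardinality column compresses extremely well this way.
    status = ["OK", "OK", "OK", "FAIL", "FAIL", "OK", "OK", "OK", "OK"]
    dictionary, ids = dictionary_encode(status)
    print(dictionary)              # {'OK': 0, 'FAIL': 1}
    print(run_length_encode(ids))  # [(0, 3), (1, 2), (0, 4)]
```

This is also why column cardinality, not just row count, drives the memory footprint of a Power BI model: a column with few distinct values needs only a tiny dictionary plus short integer runs.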

 

Cheers,

 

Teal Canady

@TealCanady and @synergised

 

The benefit of our setup is we WON'T have 50 users needing data on a daily basis. In reality only a select few will need to access the reports. We would like to display a dashboard on a large monitor on the factory floor with "live" data - in reality this will be updated about every 5 minutes.

 

I discovered a tool called Power Update that is supposed to handle refreshing Power BI files, but I haven't gotten it to work reliably.

 

In that sense, I guess our best option IS going to be a separate reporting database. I have been in the process of researching HOW to do that exactly. We are using Postgres databases, so I'm trying to figure out how to automate the refresh process between the two.
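For keeping a Postgres reporting copy in sync with production, one built-in option (in PostgreSQL 10 and later) is logical replication, which streams changes continuously instead of refreshing on a schedule. The sketch below just prints the SQL you would run on each side; the table names and connection string are placeholders, and production needs `wal_level = logical` in `postgresql.conf` before the publication will work.

```python
def publication_sql(tables):
    """SQL to run on the PRODUCTION database: publish changes to these tables."""
    return f"CREATE PUBLICATION reporting_pub FOR TABLE {', '.join(tables)};"

def subscription_sql(prod_conninfo):
    """SQL to run on the REPORTING database: subscribe to that publication."""
    return (
        "CREATE SUBSCRIPTION reporting_sub "
        f"CONNECTION '{prod_conninfo}' "
        "PUBLICATION reporting_pub;"
    )

if __name__ == "__main__":
    # Placeholder table names and connection string - adjust to your schema.
    print(publication_sql(["orders", "production_runs"]))
    print(subscription_sql("host=prod-db dbname=factory user=replicator"))
```

On older Postgres versions, a cron job doing `pg_dump`/`pg_restore` of the relevant tables is the low-tech alternative, at the cost of snapshot-style (rather than near-live) freshness.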

 

This was helpful information. Thank you so much - I just wanted to see what our options are and whether it would be beneficial in our case to eliminate some steps, but I don't think it will be. Thanks so much for the input and advice! 🙂
