Sudhakar510
Helper I

Power BI RAM Issues

Hi Experts,

We have 64 GB of RAM on the server, and we run Power BI Report Server on it.

99% of our reports use DirectQuery, and I understand this is not the most efficient approach.

 

With these DirectQuery reports, RAM consumption is around 80%, because this server is in the testing phase where people develop and test reports.

For DirectQuery, RAM is consumed while the query runs, and once the result comes back it is shown in the Power BI report, correct?

In production, consumption is about 30% for all DirectQuery reports, because only released reports are deployed there.

Now we would like to move to Import mode. Let's say the file size is 500 MB. Refreshing consumes RAM, but once the data is loaded and we are done with the visuals, does the data still sit in RAM or on disk? If it sits in RAM after the refresh, does that mean total RAM consumption will be about 1 GB? Or does it move to disk after the refresh?

Please help, as understanding this is very important for our decision making.

Is this something we should solve by increasing RAM?

 

Some are saying the data will still sit in RAM, others say it will sit on disk, so I'm not sure.

 

How should we plan for this Import approach?

What should be the standard RAM size for the server?

1 ACCEPTED SOLUTION
josef78
Memorable Member

If your model size is 500 MB (which is not the same as the file size, because the pbix file is compressed a second time), you need 500 MB of RAM for the model, plus an additional 500 MB during processing, plus additional RAM for the source data (which can be much larger, because it is uncompressed raw data, although it is limited to roughly two 8-million-row segments being processed at a time). This is not easy to calculate exactly; the best way is to check memory allocation while idle and at the processing peak.

 

It also depends on the number of reports. If you have one 500 MB model, you need 500 MB constantly, plus e.g. 2 GB during processing. But if you have 10 reports of 500 MB each, you need 5 GB constantly, plus probably still around 2 GB for processing (because you will not process everything at the same time).
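To make those rules of thumb concrete, here is a minimal estimation sketch. The function name, the two-segments-in-memory figure, and the example numbers are illustrative assumptions taken from the explanation above, not a Power BI API; measure idle vs. peak memory on your own server to confirm.

```python
# Rough RAM estimate for Import-mode models on Power BI Report Server.
# All figures are illustrative assumptions; verify against real idle/peak measurements.

def estimate_ram_gb(model_sizes_gb, raw_segment_gb, concurrent_refreshes=1):
    """Estimate steady-state and peak RAM (GB) for a set of Import models.

    model_sizes_gb       -- in-memory (compressed) size of each model, in GB
    raw_segment_gb       -- approximate uncompressed size of one segment of source rows
    concurrent_refreshes -- how many models refresh at the same time
    """
    # Every loaded model stays resident in RAM all the time.
    steady_state = sum(model_sizes_gb)

    # During a refresh: a second working copy of the model is built,
    # plus roughly two raw-data segments are held in memory at once.
    per_refresh_overhead = max(model_sizes_gb) + 2 * raw_segment_gb

    peak = steady_state + concurrent_refreshes * per_refresh_overhead
    return steady_state, peak

# Example: ten 0.5 GB models, ~0.8 GB of raw data per segment, one refresh at a time.
idle, peak = estimate_ram_gb([0.5] * 10, raw_segment_gb=0.8)
print(f"Idle: ~{idle:.1f} GB, refresh peak: ~{peak:.1f} GB")
# Idle: ~5.0 GB, refresh peak: ~7.1 GB
```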


4 REPLIES

Thanks very much, @josef78.

So this means that if we use Import mode on Report Server, this is how the RAM will be consumed.

And it all depends on the size of the Import reports on Report Server and the RAM (64 GB) in the server (and also on the Power Query steps and how the SQL queries are written), which means it probably won't be enough, because we are already at 80% consumption in the development environment. Maybe moving to Import mode is not a great idea unless we extend the RAM.

Thanks for helping me understand this.

 

And what if we run the Import process through the Power BI Service instead, is the RAM consumption on the production server the same? Maybe not for the Power Query steps, because those would use our local memory, wouldn't they? And for the SQL query in Import mode, would that use the server's memory?

 

We are just worried about data governance, making sure data does not go outside (for example, end users should not be able to export it) before we move to the Service. Also, our Report Server currently sits in Citrix, so we want to take these steps carefully before moving to the Service.

 

Any input is highly appreciated.

Yes, generally, the amount of RAM the model needs (after processing) in Power BI Desktop during development is the same amount you will need in the production environment. And the additional RAM needed during processing in Power BI Desktop will also be needed in production (if you process on the same server).

 

However, the RAM required for processing does not depend much on the Power Query steps; what matters more is the tabular (columnar) compression. There is not a big difference if you rewrite Power Query to SQL, except for filtering rows and columns in cases where query folding cannot be applied.

 

E.g. if you have a table with 100 million rows and a source size of 10 GB, and after columnar compression this table takes e.g. 500 MB, you will need:

500 MB of RAM at all times,

and additionally during processing:

+500 MB of RAM for the working copy,

+1.6 GB of RAM for raw data (because the table is processed sequentially, two 8-million-row segments at a time; segment size = 8/100 million × 10 GB = 0.8 GB in this case).

That means you will need 500 MB when idle and 2.6 GB during processing, in this case.
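The same arithmetic as a quick sanity-check script. The 8-million-row segment size and the two-segments-in-memory assumption come from the explanation above; adjust them if your data or engine behaves differently.

```python
# Quick check of the worked example: 100M-row table, 10 GB raw, 0.5 GB compressed.
rows_total      = 100_000_000
raw_size_gb     = 10.0
compressed_gb   = 0.5
segment_rows    = 8_000_000   # segment size, per the reply above (assumption)
segments_in_ram = 2           # two segments held in memory during processing (assumption)

raw_per_segment_gb = raw_size_gb * segment_rows / rows_total          # 0.8 GB
processing_peak_gb = compressed_gb + compressed_gb + segments_in_ram * raw_per_segment_gb

print(f"Idle: {compressed_gb} GB, processing peak: {processing_peak_gb:.1f} GB")
# Idle: 0.5 GB, processing peak: 2.6 GB
```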

 

Be careful about loading large amounts of data that will not be used (columns or rows filtered out during Power Query, or columns you don't need in the model). It is better to filter out unnecessary data at the source (with a custom SQL query, a database view, or in Power Query with query folding, which translates Power Query steps down to SQL).

 

Additional tips:

For large environments, a scale-out deployment is recommended, with separate front-end Power BI Report Server instances and dedicated Power BI Report Server instances for processing.

 

For large models, it is better to use a dedicated SSAS instance hosting the models, with live-connected Power BI reports.

 

 

v-luwang-msft
Community Support

Hi @Sudhakar510,

In Power BI Desktop, if you import data, the data is stored locally within the .pbix (Power BI) file. For the Power BI Service, the data from the file is stored on Azure; the Power BI Service is built on Azure, which is Microsoft's cloud computing infrastructure and platform. For more details, please review this blog.
Whether the source is on-premises or an Azure SQL database, it depends on the type of connection you use. If you publish using Import mode from Desktop, the data is still stored in the .pbix file.

In the Power BI Service, if you are referencing an Azure SQL database, you don't need a gateway; if the source is on-premises, you do. In both cases, if you use "Import" mode, the data will be stored in the Power BI Azure backend. If you get data using a live connection, no data is stored in the cloud, just some caching.

 

 

refer:

https://community.powerbi.com/t5/Service/Where-is-data-stored-on-importing-to-power-bi/m-p/288537 

 

 

Best Regards

Lucien
