RonaldMussche
Frequent Visitor

Best solution for files that are too large

Hi,

 

Currently we have a .pbix file that is exactly 1 GB. I already analysed the file with VertiPaq Analyzer and removed all the columns we don't need. The file publishes and refreshes now, but when it grows it will of course fail.
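As an aside, the effect of dropping unused columns can be sketched outside Power BI. This is a hypothetical pandas example (column names invented, nothing to do with the actual model) of the kind of saving VertiPaq Analyzer surfaces:

```python
import numpy as np
import pandas as pd

# Hypothetical fact table; "free_text_note" stands in for a wide,
# high-cardinality column the report never actually uses.
n = 100_000
df = pd.DataFrame({
    "order_id": np.arange(n),
    "amount": np.random.default_rng(0).random(n),
    "free_text_note": ["lorem ipsum dolor sit amet"] * n,
})

before = df.memory_usage(deep=True).sum()
slim = df.drop(columns=["free_text_note"])  # remove what the report never uses
after = slim.memory_usage(deep=True).sum()
# "after" is strictly smaller than "before": the unused text column
# dominated the table's memory footprint.
```

The same reasoning applies in the model: wide text columns and unused keys are usually the cheapest size wins.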

 

I have investigated several solutions:

- Premium capacity (way too expensive for only this feature: over 4,200 euros per month).

- Premium Per User (but then users without a Premium license can't see the report).

- Azure Analysis Services, connecting the Power BI report to the cube.

- Using DirectQuery (too slow for more than 180M rows).

- An aggregation table for the fact table, but I think the granularity is too low to really save data.
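On the aggregation-table bullet: whether it saves anything depends entirely on the number of distinct key combinations at the target grain. A toy pandas sketch (made-up column names and sizes, not the real 180M-row table) of the row reduction:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 1_000_000  # small stand-in for a large fact table
fact = pd.DataFrame({
    "day": rng.integers(0, 365, n),    # 365 distinct days
    "store": rng.integers(0, 50, n),   # 50 distinct stores
    "amount": rng.random(n),
})

# Aggregate to day/store grain: at most 365 * 50 = 18,250 rows survive.
agg = fact.groupby(["day", "store"], as_index=False)["amount"].sum()
# If the grouping keys were nearly unique per row instead, len(agg)
# would approach len(fact) and the aggregation would save almost
# nothing -- which is exactly the granularity concern above.
```

So the question to answer first is the distinct count of the grouping keys, not the raw row count.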

 

Currently my gut feeling is to create an SSAS cube in Azure, but I was wondering whether I am right here, or if there is an easier solution or advice based on experience.

 

Thanks in advance,

Ronald

1 ACCEPTED SOLUTION

Yes, Azure Analysis Services would be a good option, and as @SwayamSinha suggested, pause the Azure Analysis Services server when it is not in use to save money.


If this post helps, then please consider accepting it as the solution. Appreciate your kudos!
Proud to be a Super User!!


4 REPLIES
SwayamSinha
Microsoft Employee

I would agree with the first two points.

Connecting the report live to an Azure Analysis Services cube is definitely possible. Note that there is some cost and provisioning overhead involved there too, though you can always pause AAS when it is not in use 🙂

I would also recommend DirectQuery to the data source. I believe DQ is now a lot faster, given that query folding is applied, and also because of query-fusion capabilities; read Announcing “Horizontal Fusion,” a query performance optimization in Power BI and Analysis Services |....

Hi, 

 

I tried DirectQuery, but I ran into the limitation of more than 1M rows:

https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-about

The underlying data in the report indeed has more than this amount, so I guess this is not an option. Am I correct that SSAS or Premium is the only option here?

SwayamSinha
Microsoft Employee


If your data source is architected well (with indexes, materialised views, etc.), your Power BI report will be fast enough.
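The source-side architecture point can be illustrated with a toy example (SQLite is used here purely for illustration; the real source and its tuning will differ): an index on the filter column lets a folded DirectQuery predicate avoid a full table scan.

```python
import sqlite3

# In-memory toy "source" with an invented fact table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact (day INTEGER, store INTEGER, amount REAL)")
con.executemany(
    "INSERT INTO fact VALUES (?, ?, ?)",
    [(i % 365, i % 50, 1.0) for i in range(10_000)],
)

# The kind of filtered aggregate a folded DirectQuery visual would send.
query = "SELECT SUM(amount) FROM fact WHERE day = 7"
plan_before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()

con.execute("CREATE INDEX ix_fact_day ON fact(day)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
# plan_before reports a full scan of "fact"; plan_after searches
# via ix_fact_day instead of touching every row.
```

The same idea scales up: without source-side indexes (or materialised views matching the report's grain), every visual interaction pays for a scan of the full fact table.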
