
PaulBFelix
Advocate II

Publish pbix over 10 GB

Power BI Premium and PPU allow models to exceed 10 GB when large dataset storage is enabled. I incorrectly assumed that this also meant you could publish a .pbix over 10 GB, but that is not the case. It seems very strange that a dataset can grow to 100 GB, yet we can only publish 10 GB. I understand that datasets grow during refresh, but this seems like a very large discrepancy.

 

What is the recommended approach for publishing a pbix that is over 10 GB?  The method we used to work around this is:

1. Create a temp filter on the underlying data source

2. Refresh the .pbix. If the size is still over 10 GB, go back to step 1.

3. Publish to a PPU workspace.

4. Remove the filter(s) set in step 1.

5. Refresh the published dataset, allowing it to grow over 10 GB.

 

My approach is very inefficient.   I'm hoping there is a better way to publish large datasets.
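For reference, the temporary filter in step 1 is often implemented as a Power Query parameter so it can be toggled without editing the query itself. A minimal sketch in M, where `RowLimit` is a hypothetical parameter and the server, database, and table names are placeholders:

```
// Sketch only: RowLimit is an assumed Power BI parameter
// (Home > Manage Parameters). Set it to a number before publishing
// to keep the .pbix small, then change it to null on the published
// dataset and refresh to load the full table.
let
    Source   = Sql.Database("MyServer", "MyDatabase"),           // placeholder source
    FactRows = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    Filtered = if RowLimit = null then FactRows
               else Table.FirstN(FactRows, RowLimit)
in
    Filtered
```

A convenience of this pattern is that parameter values can be changed on the published dataset from its settings page in the service, so step 4 does not require republishing the .pbix.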

1 ACCEPTED SOLUTION
CharbelArmaleh
Resolver II

Hello,

I have already done this using Tabular Editor:

1- Install Tabular Editor.

2- Open your Power BI project and, from there, select External Tools and then Tabular Editor.

3- Select Deploy and provide your workspace's XMLA endpoint address.

4- You will find an empty model in your workspace that you can refresh directly from the service.

5- Connect your report to the published dataset.
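The deploy step above can also be scripted with Tabular Editor 2's command-line mode. A sketch, assuming the model has been saved as a `Model.bim` file; the workspace URL and dataset name are placeholders, and the exact flags may vary by Tabular Editor version:

```
:: Sketch only: deploy the model definition (no data) to a Premium/PPU
:: workspace over the XMLA endpoint. Requires the XMLA endpoint to be
:: set to Read/Write in the capacity settings.
TabularEditor.exe "Model.bim" ^
  -D "powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace" "MyDataset" ^
  -O -C -P -R
```

Because only metadata is deployed, the model lands in the workspace empty, and the 10 GB publish limit never comes into play; the data is loaded by the first refresh in the service.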


4 REPLIES

Very interesting. So, publish the model definition and then refresh from the service. Thank you!


JohnShepherdAPD
Helper II

You could try using a deployment pipeline. Use a parameter as a filter to limit the rows in your fact tables while working in Desktop; then, when you transition deployment stages, you can automate changing the parameter and refresh the dataset to your full row set.
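If the stage transition is automated, the full refresh can be triggered over the XMLA endpoint with a TMSL refresh command (e.g. from SSMS or a script). A minimal sketch; the dataset name is a placeholder:

```
{
  "refresh": {
    "type": "full",
    "objects": [
      { "database": "MyDataset" }
    ]
  }
}
```

Running the refresh server-side this way is what lets the dataset grow past the 10 GB publish limit, since only the large-dataset storage limit of the capacity applies.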

 

Alternatively, you could look at VertiPaq Analyzer in DAX Studio to see if you can reduce the number of columns, and thus the cardinality, in your dataset.
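Under the hood, VertiPaq Analyzer reads the Analysis Services storage DMVs, which you can also query yourself from DAX Studio connected to the model. A sketch (note that DMVs accept only a restricted subset of SQL):

```
-- Sketch only: per-column storage details, useful for spotting
-- high-cardinality columns that dominate model size.
SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
```

Dropping or reducing the precision of the worst offenders (GUIDs, timestamps with seconds, audit columns) is often the fastest way to shrink a model below the publish limit.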

 
