Power BI Premium and Premium Per User (PPU) allow models to exceed 10 GB when the large dataset storage format is enabled. I incorrectly assumed that this also meant you could publish a .pbix over 10 GB, but that is not the case. It seems very strange that a dataset can grow to 100 GB, yet we can only publish 10 GB. I understand that datasets grow during refresh, but this is a very large discrepancy.
What is the recommended approach for publishing a .pbix that is over 10 GB? The workaround we used is:
1. Create a temporary filter on the underlying data source.
2. Refresh the .pbix. If the file is still over 10 GB, go back to step 1.
3. Publish to a PPU workspace.
4. Remove the filter(s) set in step 1.
5. Refresh the published dataset, allowing it to grow past 10 GB (this last step can at least be scripted; see the sketch below).
My approach is very inefficient. I'm hoping there is a better way to publish large datasets.
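The final refresh can be queued programmatically rather than clicked in the portal. Here is a minimal sketch using the Power BI REST API refresh endpoint with MSAL for sign-in; the tenant, app registration, workspace, and dataset GUIDs are all placeholders, and the app registration is assumed to have the Dataset.ReadWrite.All delegated permission:

```python
# Minimal sketch: queue a refresh of a published dataset via the
# Power BI REST API. All GUIDs below are placeholders; the app
# registration is assumed to hold Dataset.ReadWrite.All.
import msal
import requests

TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<app-registration-guid>"
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"

# Device-code sign-in keeps the sketch short; any MSAL flow works.
app = msal.PublicClientApplication(
    CLIENT_ID, authority=f"https://login.microsoftonline.com/{TENANT_ID}"
)
flow = app.initiate_device_flow(
    scopes=["https://analysis.windows.net/powerbi/api/Dataset.ReadWrite.All"]
)
print(flow["message"])  # follow the printed prompt to sign in
token = app.acquire_token_by_device_flow(flow)

# Queue the refresh; with the large dataset storage format enabled,
# the service lets the model grow past 10 GB during this refresh.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
```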
Hello,
I have already done that using Tabular Editor:
1- Install Tabular Editor.
2- Open your Power BI project, select External Tools, and then Tabular Editor.
3- Select Deploy and provide your workspace's XMLA endpoint address.
4- You will find an empty model in your workspace that you can refresh directly from the service (a status-polling sketch follows below).
5- Connect your report to the published dataset.
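To confirm that the service-side refresh of the newly deployed (initially empty) model actually finished, one option is to poll the dataset's refresh history. This is a sketch of that idea, not part of the answer above; the IDs are placeholders and the bearer token is assumed to have been acquired as in the earlier sketch:

```python
# Sketch: poll the latest refresh entry until it leaves the
# in-progress ("Unknown") state. All values below are placeholders.
import time
import requests

WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<bearer-token>"  # e.g. acquired via MSAL as shown earlier

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=1"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

while True:
    latest = requests.get(url, headers=headers).json()["value"][0]
    if latest["status"] != "Unknown":  # "Unknown" means still running
        print(f"Refresh finished with status: {latest['status']}")
        break
    time.sleep(60)  # large models take a while; poll once a minute
```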
Very interesting. So, publish the model definition, then refresh from the service. Thank you!
You could try using a deployment pipeline: use a parameter as a filter to limit the rows in your fact tables while working in Desktop, and then, when you transition deployment stages, automate the change of the parameter and refresh the dataset to load your full rowset (see the sketch below).
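A rough sketch of that automation step, assuming the model exposes a Power Query parameter (hypothetically named RowLimit here) that the fact-table source queries filter on; the IDs, token, parameter name, and its "0 = no limit" meaning are all placeholders:

```python
# Sketch: after a pipeline stage transition, widen a hypothetical
# "RowLimit" parameter and refresh so the full rowset loads in the
# target stage. All IDs and the token below are placeholders.
import requests

WORKSPACE_ID = "<target-stage-workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<bearer-token>"  # e.g. acquired via MSAL

base = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Flip the parameter the fact-table source queries filter on;
# "0" is assumed here to mean "no row limit" in the source query.
requests.post(
    f"{base}/Default.UpdateParameters",
    headers=headers,
    json={"updateDetails": [{"name": "RowLimit", "newValue": "0"}]},
).raise_for_status()

# Then refresh to pull the full rowset into the target stage.
requests.post(
    f"{base}/refreshes",
    headers=headers,
    json={"notifyOption": "NoNotification"},
).raise_for_status()
```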
Alternatively, you could look at the VertiPaq Analyzer in DAX Studio to see whether you can reduce the number of columns, and thus the cardinality, in your dataset.