According to the Microsoft Learn article "Data reduction techniques for Import modeling - Power BI":
"Import models are loaded with data that's compressed and optimized, and then stored to disk by the VertiPaq storage engine. When source data is loaded into memory, it's possible to achieve 10x compression, and so it's reasonable to expect that 10 GB of source data can compress to about 1 GB in size. Further, when persisted to disk an extra 20% reduction can be achieved."
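The figures quoted above can be checked with simple arithmetic. This is an illustrative sketch only, using the ratios from the article; actual compression varies with the shape of the data.

```python
# Illustrative arithmetic based on the ratios quoted from Microsoft Learn
# (10x in-memory compression, extra ~20% reduction when persisted to disk).
source_gb = 10.0
in_memory_gb = source_gb / 10           # ~10x VertiPaq compression -> 1.0 GB
on_disk_gb = in_memory_gb * (1 - 0.20)  # extra ~20% reduction on disk -> 0.8 GB

print(in_memory_gb)  # 1.0
print(on_disk_gb)    # 0.8
```

So 10 GB of source data would occupy roughly 1 GB in memory and about 0.8 GB once persisted to disk, under the article's assumptions.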
Questions:
1. When an Import mode report is published to the Power BI Service, how is the data compressed and optimized?
2. After it has been compressed and optimized, does VertiPaq then achieve about 10x compression?
3. So the complete process is: publish report > compress and optimize > achieve ~10x compression by VertiPaq?
4. Once a report has already been published to the Power BI Service and a Scheduled Refresh runs, is the query result compressed and optimized by the Data Gateway and then stored in the Power BI Service?
When a report in Import mode is published to the Power BI Service, the underlying data model (powered by the VertiPaq storage engine) is highly compressed and optimized using columnar storage. The key optimizations include dictionary (hash) encoding, value encoding, and run-length encoding (RLE).
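As a rough intuition (a toy sketch, not the actual VertiPaq implementation), dictionary encoding replaces repeated column values with small integer ids, and run-length encoding then collapses consecutive repeats, which is why low-cardinality columns compress so well:

```python
from itertools import groupby

def dictionary_encode(column):
    """Map each distinct value to a small integer id (toy sketch)."""
    dictionary = {}
    ids = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        ids.append(dictionary[value])
    return dictionary, ids

def run_length_encode(ids):
    """Collapse consecutive repeated ids into (id, count) pairs."""
    return [(v, sum(1 for _ in grp)) for v, grp in groupby(ids)]

column = ["Red"] * 4 + ["Blue"] * 3 + ["Red"] * 2
dictionary, ids = dictionary_encode(column)
rle = run_length_encode(ids)
print(dictionary)  # {'Red': 0, 'Blue': 1}
print(rle)         # [(0, 4), (1, 3), (0, 2)]
```

Nine string values are reduced to a two-entry dictionary plus three (id, count) pairs; a column with millions of rows but few distinct values shrinks dramatically in the same way.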
Yes, VertiPaq can typically achieve around 10x compression, but the actual ratio depends on factors such as column cardinality (fewer distinct values compress better), data types, and how often values repeat within each column.
Not exactly. The correct sequence is: import data into Power BI Desktop > VertiPaq compresses and optimizes it (typically ~10x) > publish the already-compressed model to the Power BI Service.
When a report is already published to the Power BI Service and a Scheduled Refresh runs, the Service re-queries the data source (through the Data Gateway if the source is on-premises), and the VertiPaq engine in the Service re-compresses the refreshed data.
🔹 The Data Gateway does not perform compression—it only facilitates query execution and data transfer. The actual compression happens inside Power BI Service, just like it does in Power BI Desktop.
Hi @pennyhoho117,
As we haven’t heard back from you, we wanted to kindly follow up to check whether the solution provided by the community members worked for your issue. If our response addressed it, please mark it as "Accept as Solution" and click "Yes" if you found it helpful.
Thanks and regards
Hi @pennyhoho117 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If our response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can find it easily.
Thank you.
Hi @pennyhoho117,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
Hi @pennyhoho117 ,
Thanks for reaching out to the Microsoft fabric community forum.
Thanks @powerbidev123 for such a detailed and thorough solution. In addition to their points, I would like to mention the following.
When you publish a Power BI Desktop file to the Power BI Service, you publish the data in the model to your Power BI workspace. The same is true for any reports you created in Report view: you’ll see a new semantic model with the same name, along with the reports, in your workspace navigator. Publishing from Power BI Desktop has the same effect as using Get Data in Power BI to connect to and upload a Power BI Desktop file. This means that if we import some data into Desktop and it is about 1 GB after compression, the semantic model will weigh the same once published to the Service.
For the third question, please note that the correct order, as mentioned by @powerbidev123, is: import data > compress, optimize, and achieve ~10x compression with VertiPaq > publish.
If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS.
Thanks and Regards
So according to point 3, the VertiPaq engine is installed together with Power BI Desktop?
So the size of the pbix file is already compressed and optimized, and cannot exceed the 1 GB limit, right?
But the pbix includes the visualizations (Report view), so do visualizations also count toward the 1 GB limit?
What if the pbix exceeds the 1 GB limit?
Hi @pennyhoho117 ,
Thanks for reaching out to the Microsoft fabric community forum.
Yes, the VertiPaq engine is part of both Power BI Desktop and the Power BI Service; it works in the background to boost the performance of Power BI reports.
Visualizations are also included in the 1 GB limit, but the report visuals themselves typically take up much less space than the data model.
If the pbix file exceeds the 1 GB limit, we have to optimize the data model to reduce its size (for example, remove unused columns or reduce column cardinality) or switch to DirectQuery mode. Note that larger models are supported on Premium capacity.
If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS.
Thanks and Regards