
bdpr_95
Helper III

Power BI Modeling Challenge: Decimal Precision vs Model Size

Hi everyone,

 

I’m currently optimizing a Power BI model and ran into a challenge:
There are two columns with decimal values exceeding 10 digits of precision, and they are significantly increasing the size of the dataset.

The business requires full precision for accuracy reasons, so I can’t simply round or truncate the values without understanding the impact.

 

Has anyone faced a similar situation?
What strategies did you use to balance precision and performance?

I'll share a screenshot of the column structure shortly.

 

[Screenshot: column structure]


Thanks in advance for your insights!

1 ACCEPTED SOLUTION
v-pnaroju-msft
Community Support

Thank you, @srlabhe, for your response.

Hi bdpr_95,

We appreciate your enquiry through the Microsoft Fabric Community Forum.

Based on my understanding, the increased model size is not caused by decimal precision alone; it may also result from high cardinality and the use of the Double data type, both of which reduce VertiPaq compression efficiency. When a decimal column has many unique values or very long precision, the storage engine cannot compress it effectively, resulting in a larger model.

Please follow the steps below, which may help to resolve the issue:

  1. Use Power BI Desktop’s Model view to identify which columns consume the most memory.
  2. If four decimal places are sufficient, convert from Decimal Number (Double) to Fixed Decimal Number in Power Query; it is stored as a scaled 64-bit integer and often compresses more effectively (a sketch after this list illustrates steps 2 and 3).
  3. Alternatively, store the values as scaled integers or pre-calculate them at the source before loading into Power BI. Avoid using text unless absolutely necessary, as it further increases size.
  4. Compare the model size after each change using VertiPaq Analyzer to quantify the precision versus performance trade-off.
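
For illustration, here is a minimal Power Query (M) sketch of steps 2 and 3. The table, values, and the column name Amount are made up for this example and do not come from the original post; treat it as a starting point rather than a drop-in solution.

    let
        // Made-up sample rows standing in for the real source query
        Source = #table(
            type table [Amount = number],
            {{123.4567890123}, {98.7654321098}}
        ),

        // Step 2: convert to Fixed Decimal Number (Currency.Type); values are
        // stored as scaled 64-bit integers with exactly four decimal places
        FixedDecimal = Table.TransformColumnTypes(Source, {{"Amount", Currency.Type}}),

        // Step 3: keep more precision by scaling to a whole number instead,
        // e.g. multiply by 10^10 and store as Int64; ideally do this at the
        // source so the scaling is not limited by floating-point precision
        ScaledInteger = Table.TransformColumns(
            Source,
            {{"Amount", each Int64.From(_ * 1e10), Int64.Type}}
        )

        // Only one of the two approaches would normally be kept
    in
        ScaledInteger

If you go with the scaled-integer route, divide by the same factor in your measures (for example with DIVIDE in DAX) so reports still show the original values.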

Additionally, please refer to the link below:
Data reduction techniques for Import modeling - Power BI | Microsoft Learn

We hope the information provided helps to resolve the issue. Should you have any further queries, please feel free to contact the Microsoft Fabric community.

Thank you.





3 REPLIES
v-pnaroju-msft
Community Support

Hi bdpr_95,

We would like to follow up and see whether the details we shared have resolved your problem. If you need any more assistance, please feel free to connect with the Microsoft Fabric community.

Thank you.

srlabhe
Resolver III

I doubt precision is impacting performance. Please check the data size with and without rounding and compare the results.
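
One rough way to test this suggestion is to load a rounded copy of the column alongside the original and compare the two columns' sizes in VertiPaq Analyzer. A minimal Power Query (M) sketch, again with a made-up table and a hypothetical Amount column:

    let
        // Made-up sample rows standing in for the real source query
        Source = #table(
            type table [Amount = number],
            {{123.4567890123}, {98.7654321098}}
        ),

        // Add a rounded copy purely for the size comparison; afterwards keep
        // whichever column the comparison shows you actually need
        WithRounded = Table.AddColumn(
            Source,
            "AmountRounded",
            each Number.Round([Amount], 4),
            type number
        )
    in
        WithRounded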
