Hi everyone,
I’m currently optimizing a Power BI model and ran into a challenge:
There are two columns with decimal values exceeding 10 digits of precision, and they are significantly increasing the size of the dataset.
The business requires full precision for accuracy reasons, so I can’t simply round or truncate the values without understanding the impact.
Has anyone faced a similar situation?
What strategies did you use to balance precision and performance?
I'll share a screenshot of the column structure shortly.
Thanks in advance for your insights!
Thank you, @srlabhe, for your response.
Hi bdpr_95,
We appreciate your enquiry through the Microsoft Fabric Community Forum.
Based on my understanding, the increased model size is not caused by decimal precision alone; it may also result from high cardinality and the use of the Double data type, both of which reduce VertiPaq compression efficiency. When a decimal column holds many unique values at high precision, the storage engine cannot compress it effectively, resulting in a larger model.
Please follow the steps below, which may help to resolve the issue:
1. Use VertiPaq Analyzer (for example, via DAX Studio) to confirm which columns consume the most space and check their cardinality.
2. If the values fit within four decimal places, change the data type from Decimal Number to Fixed Decimal Number, which is stored as a 64-bit integer and compresses much better than a floating-point Double.
3. If full precision beyond four decimal places is genuinely required, split each value into an integer part and a scaled fractional part stored as whole-number columns, then recombine them in a DAX measure (see the sketch after this list).
4. Remove or summarize any high-precision columns that are not actually needed for reporting.
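As a minimal illustration of step 3, here is a pandas sketch of the split-column idea. The column name Amount, the sample values, and the 10^10 scale factor are assumptions for this example, not details from the original post, and it assumes non-negative values.

```python
import pandas as pd

# Assumed sample data: non-negative decimals with up to 10 digits after
# the point (the real values would come from the source query).
df = pd.DataFrame({"Amount": [123.4567890123, 0.0000000001, 98765.1234567891]})

SCALE = 10**10  # one integer unit per 10^-10 of the original value

# Split each value into an integer part and a scaled fractional part.
# Two whole-number columns usually compress far better in VertiPaq than
# one high-precision floating-point column.
df["AmountInt"] = df["Amount"].astype("int64")
df["AmountFrac"] = ((df["Amount"] - df["AmountInt"]) * SCALE).round().astype("int64")

# Check the original value can be rebuilt; in Power BI this recombination
# would live in a DAX measure such as AmountInt + AmountFrac / 1e10.
rebuilt = df["AmountInt"] + df["AmountFrac"] / SCALE
print((rebuilt - df["Amount"]).abs().max())  # ~0 within float tolerance
```

Note that the split itself is performed in floating point, so values needing more significant digits than a Double can hold (about 15-16) are already lossy at the source; in that case the split should happen upstream, where the exact decimal is still available.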
Additionally, please refer to the link below:
Data reduction techniques for Import modeling - Power BI | Microsoft Learn
We hope the information provided helps to resolve the issue. Should you have any further queries, please feel free to contact the Microsoft Fabric community.
Thank you.
Hi bdpr_95,
We would like to follow up and see whether the details we shared have resolved your problem. If you need any more assistance, please feel free to connect with the Microsoft Fabric community.
Thank you.
I doubt precision is impacting performance. Please check the data size with and without rounding and compare the two.
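One rough way to run that comparison outside Power BI is to write the column to a columnar format at full precision and again after rounding, then compare file sizes. This is only a proxy for VertiPaq compression; the column name, row count, and file names below are made up for the example, and it assumes pandas with a Parquet engine such as pyarrow installed.

```python
import os

import numpy as np
import pandas as pd  # assumes a Parquet engine such as pyarrow is installed

# Assumed stand-in data: one million high-precision decimals; swap in the
# real column to make the test meaningful.
rng = np.random.default_rng(0)
full = pd.DataFrame({"Amount": rng.random(1_000_000) * 1000})

full.to_parquet("full_precision.parquet")
full.round(2).to_parquet("rounded.parquet")

for name in ("full_precision.parquet", "rounded.parquet"):
    print(name, os.path.getsize(name), "bytes")

# Rounding collapses cardinality, so the rounded file is typically
# noticeably smaller; a similar gap would show up in the model size.
```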