I am running into an issue with some raw decimal values being off by a very slight margin (< 0.0000001). From what I have gathered, this happens because Power Query stores these as floating-point values, which sometimes causes rounding errors (Power Query Precision: Avoid Rounding Errors - BI Gorilla).
My problem with this theory is:
1. I am not transforming these columns at all; they are raw decimal values.
2. The affected values are random, with no pattern (i.e. not just repeating decimals like 1/3).
3. These columns are set to Fixed Decimal type in the semantic model and decimal (which is fixed decimal) in the Lakehouse.
All this can obviously be worked around with functions and such, but I feel very strongly that I should not have to create workarounds for literal raw data values.
Hi @parkergeis ,
Thanks for reaching out to the Microsoft Fabric Community forum.
You're totally right to expect raw decimal values to behave consistently, and it’s frustrating when that precision shifts slightly even though you haven't transformed anything and are using fixed decimal types.
Even though your data is typed as Decimal or Fixed Decimal, the Direct Lake engine (which reads Delta tables written by Spark and other backend engines) may still introduce tiny floating-point inaccuracies when reading or evaluating those numbers. Many exact decimal fractions have no finite binary representation (0.6, for example, is the repeating binary fraction 0.1001 1001 1001…), so the engine stores and computes with the nearest representable double. That's why you're seeing values like 96,813.60000000001 instead of a clean 96,813.60 even though nothing has been changed.
This behavior isn't unique to your setup; it's a known consequence of how some engines handle decimal math behind the scenes.
To clean this up completely without needing to touch your Lakehouse or source, just create a calculated column in Power BI that explicitly rounds the value:
Rounded Total Amount = ROUND('your_table'[total_amount], 2)
Then use Rounded Total Amount in your visuals and calculations.
This forces Power BI to treat the value exactly as you intend, with two decimal places and no extra noise from floating-point evaluations.
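If adding a column isn't an option (calculated columns aren't currently supported on Direct Lake tables), the same rounding can be done in a measure instead. A minimal sketch, reusing the placeholder table and column names from the snippet above:
Total Amount (Rounded) = ROUND(SUM('your_table'[total_amount]), 2)
Rounding once after the aggregation keeps per-row float noise from accumulating into the total; if you need each row rounded before summing, use SUMX with ROUND inside the iteration instead.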
If the response has addressed your query, please "Accept it as a solution" and give a "Kudos" so other members can easily find it.
Best Regards,
Tejaswi.
Community Support
Hi @parkergeis
The issue is caused by the numbers being stored in binary floating point: values like 1.77636E-15 can sometimes appear instead of 0. This can happen even when no data transformation has taken place, because Power Query may still apply floating-point logic during refreshes or internal evaluations.
You can avoid this by requesting decimal precision in the aggregation in your M code:
// the optional second argument switches from floating-point to decimal arithmetic
List.Sum([amount], Precision.Decimal)
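For context, List.Sum takes a list, so the precision argument is typically supplied inside an aggregation step such as Table.Group. A minimal sketch with hypothetical step, table, and column names (Source, Category, amount):
// hypothetical grouping step: per-category totals computed with decimal precision
GroupedTotals = Table.Group(
    Source,
    {"Category"},
    {{"Total Amount", each List.Sum([amount], Precision.Decimal), type number}}
)
Note that decimal precision is slower than the default floating-point path, so it is worth applying only to the aggregations where the noise actually matters.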
Hope this helps! Please give a thumbs up and mark as solved if it does. Thanks!
How would you implement this logic with a lakehouse and/or a Direct Lake semantic model off the lakehouse?