I have a simple Power Query table with one column and three rows:
Column1
10000000000000000
1000000000000000
100000000000000
The first cell has 17 digits, the second has 16 digits and the third has 15 digits.
First I change the column type to Whole number:
#"Changed column type" = Table.TransformColumnTypes(Source, {{"Column1", Int64.Type}}),
Then I call the function Table.ReplaceValue, which is supposed to add 1 to each of the values in Column1 and replace the existing values with the new ones:
#"Add number" = Table.ReplaceValue(#"Changed column type", each [Column1], each ([Column1] + 1), Replacer.ReplaceValue, {"Column1"})
The result is:
Column1
10000000000000000
1000000000000001
100000000000001
Note that 1 was NOT added to the first cell having 17 digits in it!
If I try to add the number 3 instead of 1 in Table.ReplaceValue, I get this result:
Column1
10000000000000004
1000000000000003
100000000000003
Now the first cell has got 4 added!
Is there a limit to the number of digits when adding?
However, if I add the large number 10000000000000 like this:
#"Add number" = Table.ReplaceValue(#"Changed column type", each [Column1], each ([Column1] + 10000000000000), Replacer.ReplaceValue, {"Column1"})
I get this result:
Column1
10010000000000000
1010000000000000
110000000000000
Which is as expected.
Does anyone know why adding 1 or 3 to the 17-digit number gives this strange result?
Perhaps it has to do with the maximum number of digits in an Int64 being 19.
BTW, this may be a duplicate post, as I registered my profile after posting the first one (which I then could not find)...
-----
Here is the complete M-code for the table:
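A minimal sketch of that query, assuming the three values are entered as a literal #table (step names follow the ones shown above):
let
    // Source: literal values with 17, 16 and 15 digits (assumed entry method)
    Source = #table({"Column1"}, {{10000000000000000}, {1000000000000000}, {100000000000000}}),
    // Change the column type to Whole number (Int64)
    #"Changed column type" = Table.TransformColumnTypes(Source, {{"Column1", Int64.Type}}),
    // Add 1 to each value via Table.ReplaceValue
    #"Add number" = Table.ReplaceValue(#"Changed column type", each [Column1], each ([Column1] + 1), Replacer.ReplaceValue, {"Column1"})
in
    #"Add number"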
This causes a problem in Power BI and SSAS because the numbers are "not unique" according to the engine. This is because only the first fifteen digits are significant, and my IDs are 18 digits long, with the last three making the string unique. Is there a fix for this?
Use type text or Int64.Type. Last month someone here was wondering why Power Query couldn't handle PostgreSQL Int128 types 🙂
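For an 18-digit ID column, a minimal sketch of the "keep it as text" option (the step and column names here are just placeholders):
#"IDs As Text" = Table.TransformColumnTypes(#"Previous Step", {{"ID", type text}})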
By default, arithmetic is done using double precision. You can get the desired result in this case by specifying decimal precision instead of double precision; that is, "Value.Add([Column1], 1, Precision.Decimal)" instead of "[Column1] + 1".
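Applied to the Table.ReplaceValue step from the question, that would look something like this (a sketch reusing the step names from the original query):
#"Add number" = Table.ReplaceValue(#"Changed column type", each [Column1], each Value.Add([Column1], 1, Precision.Decimal), Replacer.ReplaceValue, {"Column1"})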
Thanks. With Precision.Decimal, the addition works at least up to 10^28. But since the doc says
"The largest precision that can be represented in a Decimal Number type is 15 digits long", I would expect the limit to be much lower than 10^28. Is it safe with this many digits? How are they stored?
This is a place where unfortunately the people coming up with text for the UI aren't in sync with the people responsible for the language specification. What the UI calls "Decimal Number" is not related to "decimal precision" for numeric operations. The "15 digits long" refers to "double precision". "Decimal precision" has considerably more digits of precision -- I want to say 29? -- but has a far smaller range for the scale. Internally, they use the .NET "decimal" type which I seem to remember is also the same as the OLEDB decimal type.
Confirmed. Thank you @curth!
let
    Source = {1..19},
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Base = 10^[Column1] built as text, then typed as Int64
    #"Added Custom2" = Table.AddColumn(#"Converted to Table", "Base", each "1" & Text.Repeat("0",[Column1])),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom2",{{"Base", Int64.Type}}),
    // Default (double) addition vs. explicit decimal-precision addition
    #"Added Custom" = Table.AddColumn(#"Changed Type", "PlusOne", each [Base]+1),
    #"Added Custom3" = Table.AddColumn(#"Added Custom", "ValueAdd", each Value.Add([Base], 1, Precision.Decimal)),
    // Both differences should be 1; Result.PlusOne drifts once Base passes 2^53
    #"Added Custom1" = Table.AddColumn(#"Added Custom3", "Result.PlusOne", each [PlusOne]-[Base]),
    #"Added Custom4" = Table.AddColumn(#"Added Custom1", "Result.ValueAdd", each Value.Subtract([ValueAdd],[Base],Precision.Decimal))
in
    #"Added Custom4"
Hi @Eric7,
While we do "assign types" to numbers in Power Query, everything internally is done using float64, which means we get complete precision only up to 2^53-1, which is a bit less than 10^16. Above that, results like the one you've found tend to happen.
Check this out https://en.wikipedia.org/wiki/Double-precision_floating-point_format#Precision_limitations_on_intege...
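A minimal sketch of that boundary (an illustration, not code from the thread): at 2^53 the default double arithmetic can no longer represent the +1, while Precision.Decimal still can.
let
    Limit = Number.Power(2, 53),                               // 9,007,199,254,740,992
    PlusOneDouble = Limit + 1,                                 // double precision: rounds back to Limit
    PlusOneDecimal = Value.Add(Limit, 1, Precision.Decimal)    // decimal precision: Limit + 1
in
    [Double = PlusOneDouble, Decimal = PlusOneDecimal, LostTheOne = (PlusOneDouble = Limit)]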
That may be why the bug is happening, but it is still a bug, or bad marketing, or bad documentation.
The Float64 reason should be included in this documentation:
https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-data-types
Currently, the doc only says:
"Note: The Power BI Desktop data model supports 64 bit integer values, but the largest number the visuals can safely express is 9,007,199,254,740,991 (2^53-1) due to JavaScript limitations. If you work with numbers in your data model above this, you can reduce the size through calculations before adding them to a visual."
One could argue that as long as Power Query M does offer the datatype Int64.Type, it should really be Int64 and not Float64.
Hi @Eric7,
I have reproduced it on my side and submitted it internally; if there is any progress, I will update here as soon as possible. ICM: 226860443
Best Regards,
Community Support Team _ Yingjie Li
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Yay, bug!
let
    Source = {1..20},
    #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Added Custom2" = Table.AddColumn(#"Converted to Table", "Custom0", each "1" & Text.Repeat("0",[Column1])),
    #"Changed Type" = Table.TransformColumnTypes(#"Added Custom2",{{"Custom0", Int64.Type}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each [Custom0]+1),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom.1", each [Custom]-[Custom0])
in
    #"Added Custom1"