Hi,
I'm loading a .json file into Power Query, but for some reason it seems to ignore the decimal points. So 984.25 is being turned into 98425.
The file (in Notepad++): [screenshot]
Power Query: [screenshot]
I'm dealing with 700+ .json files in total, for a combined 900.000 part numbers. So editing the files manually isn't an option.
I tried looking for additional settings in Power Query, but couldn't find anything related. Any help would be greatly appreciated!
EDIT: So I figured out that the issue occurs when importing multiple files via a folder, as opposed to a single .json file. Doesn't solve the problem, though.
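(Editor's note: the symptom above is consistent with a culture-sensitive text-to-number conversion. A minimal sketch in M, assuming a Dutch regional setting; `Number.FromText` takes an optional culture argument, and in nl-NL the dot is a grouping separator rather than the decimal separator:)

```m
// Sketch: how a dot-decimal string can lose its decimal point under nl-NL.
Number.FromText("984.25", "en-US")  // dot is the decimal separator here
Number.FromText("984.25", "nl-NL")  // dot is likely treated as a thousands separator
```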
Solution: Someone else suggested changing the locale of the entire pbix file, which did the trick.
Thanks @ronrsnfld for nudging me in the right direction.
Thanks again. Using Edge as well. Not sure what is happening there. All sorts of weirdness on this end, I guess.
As for the decimals, I will just keep tinkering with it. At least I have a workaround. 🙂
Thanks. Appreciate your taking a look. It is definitely perplexing.
Adding to the fun, it seems my misunderstanding comes from the screenshot. Everything is so tiny, and zooming in blurs it all.
That is odd. When I select the screenshot, the image is larger and perfectly clear. In any event, the Shift Length shows up as decimal numbers, with no trailing zeros. Maybe it is something about how your browser is rendering it. FWIW, I am using Microsoft Edge.
Thanks for the reply and my apologies. I did not think it necessary to restate the problem identified in the original topic, but I did include additional information.
Power Query is ignoring decimals in JSON.
As mentioned, after trying everything noted in the original thread, I am curious to know if there is another solution to this.
Can anyone tell me what the problem is here and how to fix this, aside from my current workaround?
Thanks in advance ... Sample JSON follows.
[
{"Name":null,"Location":"ABC","Team":"Direct","Date":"December 17, 2022","Start":"10:00 AM","End":"06:00 PM","Shift Length":8.0000000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"BCA","Team":"Broad","Date":"June 27, 2023","Start":"09:00 AM","End":"04:15 PM","Shift Length":7.2500000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"BCA","Team":"Broad","Date":"September 29, 2023","Start":"09:00 AM","End":"04:15 PM","Shift Length":7.2500000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"CBA","Team":"Direct","Date":"April 13, 2023","Start":"03:30 PM","End":"10:30 PM","Shift Length":7.0000000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"CBA","Team":"Direct","Date":"April 14, 2023","Start":"07:30 AM","End":"10:30 PM","Shift Length":15.0000000000000000,"Schedule Type":"Vacant"},
]
Currently, my workaround is to simply recalculate [Shift Length].
As you can tell from my screenshot, I cannot reproduce your problem when opening the JSON from a text file. I also cannot reproduce it no matter which Windows regional settings I use. But since the Record you examined at your Source step already shows the problem, there is possibly an issue with how the connection is interpreting that value. I don't know how to get into that.
After trying everything noted above, I am curious to know if there is another solution to this.
Can anyone tell me what the problem is here and how to fix this, aside from my current workaround?
Thanks in advance ... Sample JSON follows.
[
{"Name":null,"Location":"ABC","Team":"Direct","Date":"December 17, 2022","Start":"10:00 AM","End":"06:00 PM","Shift Length":8.0000000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"BCA","Team":"Broad","Date":"June 27, 2023","Start":"09:00 AM","End":"04:15 PM","Shift Length":7.2500000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"BCA","Team":"Broad","Date":"September 29, 2023","Start":"09:00 AM","End":"04:15 PM","Shift Length":7.2500000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"CBA","Team":"Direct","Date":"April 13, 2023","Start":"03:30 PM","End":"10:30 PM","Shift Length":7.0000000000000000,"Schedule Type":"Vacant"},
{"Name":null,"Location":"CBA","Team":"Direct","Date":"April 14, 2023","Start":"07:30 AM","End":"10:30 PM","Shift Length":15.0000000000000000,"Schedule Type":"Vacant"},
]
Currently, my workaround is to simply recalculate [Shift Length].
You don't provide any information to reproduce whatever problem it is you are having. You don't even state the problem!
When I process that JSON in PBI desktop, it outputs: [screenshot]
Kindly point out what the problem is.
Glad you found a solution. BTW, in the future, a text version of your code would have been useful. There is no way to paste a partial screenshot into the Advanced Editor for testing and to see what might be happening.
Yeah, I realised that just now. One of the lines is extremely long, and for some reason my first thought was to take a screenshot instead of just breaking it up over multiple lines... I swear I'm not an idiot. Thanks for the tip!
After you expand that column, and before the #"Changed Type" step, does the value look OK? If so, you need to edit the #"Changed Type" step to add the locale specifier "en-US" (or, from the UI, right-click the column header and select Change Type => Using Locale => Decimal Number => English (United States), or possibly a different locale that uses a dot as the decimal separator).
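(In M, that locale ends up as the optional culture argument of `Table.TransformColumnTypes`. A minimal sketch; the step name `#"Expanded Column1"` is an assumption, and "Shift Length" is the column from the sample above:)

```m
// Force dot-decimal interpretation regardless of the machine's regional settings.
#"Changed Type" = Table.TransformColumnTypes(
    #"Expanded Column1",
    {{"Shift Length", type number}},
    "en-US"  // culture: parse "7.25" as seven and a quarter
)
```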
So I tried your suggestion on a whole new file, and I can tell where it goes wrong.
When I import a single .json file, it does indeed show the value properly before the #"Changed Type" step, which allows me to add the locale specifier.
However, it doesn't work when importing them as a folder instead of as a single file. Unfortunately, importing, collapsing, and combining them all manually isn't an option for me.
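(One possible workaround for the folder case, sketched under assumptions: skip the generated "Combine Files" helper queries and call `Json.Document` on each file's contents yourself, so the numbers never pass through a culture-sensitive text conversion. The folder path and the expanded field names are placeholders:)

```m
let
    Source = Folder.Files("C:\Data\Json"),  // hypothetical folder path
    // Json.Document parses each file's raw bytes; JSON numbers always use
    // a dot as the decimal separator, independent of regional settings.
    Parsed = Table.AddColumn(Source, "Data", each Json.Document([Content])),
    // Each file is a top-level JSON array, so "Data" holds a list of records.
    Rows = Table.ExpandListColumn(Table.SelectColumns(Parsed, {"Data"}), "Data"),
    Expanded = Table.ExpandRecordColumn(Rows, "Data",
        {"Name", "Location", "Team", "Shift Length"}),
    Typed = Table.TransformColumnTypes(Expanded,
        {{"Shift Length", type number}}, "en-US")
in
    Typed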
Copy and paste your code from the advanced editor. I'm not at my computer now but I will take a look later today.
Sure. It's in Dutch, though; hopefully it makes sense. I can translate, of course. Thanks.
Hi,
It doesn't. It's already missing the decimal point after expanding. Tried it anyway, but didn't do anything. Thanks for the suggestion, though.