I created a nifty little big data extraction from multiple endpoints to ADLS Gen2, and I have full control of the encoding of the blobs in the data lake endpoint.
Currently I'm using ISO-8859-1 (UTF-8), since I have both Danish and Eastern European characters in my extractions.
All files are in JSON format, since the schema can differ across many of the sources.
Unfortunately, Power BI does not recognize the encoding of the JSON files in ADLS. This leads to Danish and Eastern European characters being displayed as ? in the text columns when imported into Power BI.
When the files are opened in Notepad++, the encoding is clearly UTF-8 (as expected), but how do I get Power BI to read this encoding correctly?
Alternatively, how should I encode the files to make Power BI play nice?
Any help would be greatly appreciated!
Data extracted from Dynamics 365 CRM directly to the same ADLS behaves exactly the same way. It is also UTF-8 in Notepad++, but displays weird characters instead of the correct Danish and Eastern European characters.
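For anyone debugging the same thing, a quick way to check what the bytes actually decode to is to pull a single blob into Power Query and compare the two decodings side by side. This is only a sketch; the URL is the placeholder used later in this thread, and Content is the binary column that AzureStorage.DataLake returns.
let
Source = AzureStorage.DataLake("https://MyDL.dfs.core.windows.net/Endpoint/clients"),
// raw bytes of the first blob in the folder
FirstBlob = Source{0}[Content],
// decode the same bytes two ways and compare the Danish characters
AsUtf8 = Text.FromBinary(FirstBlob, TextEncoding.Utf8),
AsAnsi = Text.FromBinary(FirstBlob, TextEncoding.Windows)
in
[Utf8 = AsUtf8, Ansi = AsAnsi]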
Hey @Anonymous ,
If you are using the function Json.Document( ... , ... ),
try TextEncoding.Utf8 as the 2nd parameter.
You may have to use the "Advanced Editor" to see what's really going on inside your query.
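As a rough sketch, the call inside the Advanced Editor would then look something like this (the step and column names are just placeholders):
// the second argument forces Json.Document to decode the binary content as UTF-8
Parsed = Json.Document([Content], TextEncoding.Utf8)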
Hopefully, this provides some ideas to tackle your challenge.
Regards,
Tom
Stay safe, stay healthy, and happy holidays
Hey Tom,
Thank you for your speedy reply!
Where exactly is that going to go?
let
Source = AzureStorage.DataLake("https://MyDL.dfs.core.windows.net/Endpoint/clients"),
#"Removed Other Columns" = Table.SelectColumns(Source,{"Content"}),
#"Parsed JSON" = Table.TransformColumns(#"Removed Other Columns",{},Json.Document),
#"Expanded Content" = Table.ExpandListColumn(#"Parsed JSON", "Content"),
#"Expanded Content1" = Table.ExpandRecordColumn(#"Expanded Content", "Content", {"id", "name", "is_active", "address", "statement_key", "created_at", "updated_at", "currency", "self"}, {"id", "name", "is_active", "address", "statement_key", "created_at", "updated_at", "currency", "self"})
in
#"Expanded Content1"
Hi darlove,
Thank you for your reply. That has certainly broadened my understanding of Power BI.
I guess my code should end up looking something like this when implementing the adjustments you describe:
let
Source = AzureStorage.DataLake("https://MyDL.dfs.core.windows.net/Endpoint/clients"),
#"Content" = Table.SelectColumns(Source,{"Content"}),
#"Added Custom" = Table.AddColumn(#"Content", "JSON", each Json.Document([#"Content"], TextEncoding.Utf8)),
#"Expanded JSON" = Table.ExpandListColumn(#"Added Custom", "JSON"),
#"Expanded JSON1" = Table.ExpandRecordColumn(#"Expanded JSON", "JSON", {"id", "name", "is_active", "address", "statement_key", "created_at", "updated_at", "currency", "self"}, {"id", "name", "is_active", "address", "statement_key", "created_at", "updated_at", "currency", "self"})
in
#"Expanded JSON1"
Unfortunately, the result is the same; the special characters still come out garbled.
Best
Axel
It actually worked when I changed the line above from:
#"Added Custom" = Table.AddColumn(#"Content", "JSON", each Json.Document([#"Content"], TextEncoding.Utf8)),
to:
#"Added Custom" = Table.AddColumn(#"Content", "JSON", each Json.Document([#"Content"], TextEncoding.Windows)),
Thanks, darlove!
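For context, TextEncoding.Windows tells Json.Document to decode the blobs with the Windows ANSI code page instead of UTF-8, which evidently matches how these particular blobs were written (the question above mentions ISO-8859-1). Below is a small sketch of pulling the encoding out into a single variable so it can be switched in one place if the extraction side ever moves to real UTF-8; the names otherwise follow the query earlier in the thread.
let
// assumption: keep the encoding in one variable so it is easy to change later
BlobEncoding = TextEncoding.Windows,
Source = AzureStorage.DataLake("https://MyDL.dfs.core.windows.net/Endpoint/clients"),
#"Content" = Table.SelectColumns(Source, {"Content"}),
// parse each blob with the chosen encoding
#"Added Custom" = Table.AddColumn(#"Content", "JSON", each Json.Document([Content], BlobEncoding))
in
#"Added Custom"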