Iwanicki
Frequent Visitor

Locale settings should not affect CSV import

My team and I spent a lot of time figuring out why the decimal numbers imported from a CSV file were incorrect: the decimal separator had disappeared and the values appeared to be "multiplied" by 100. At first, we thought something was wrong with the arguments passed to the Source step.

We only discovered the source of the issue when we copy-pasted the M code from Power BI Desktop's Power Query (Dutch locale) into a dataflow (English locale) and saw that the identical code suddenly returned different (this time, expected) results.

This is not intuitive, and it contradicts the fundamental idea of Power Query as something you can easily reuse across Microsoft tools. The output of M code should not be affected by the application's settings; it should be controlled within the code itself, which is what most IT specialists expect.
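The effect described here can be reproduced directly in M. The sketch below is a minimal, hypothetical example (not the original query) showing how the same text parses to different numbers depending on the culture applied:

```m
let
    SampleText = "47.6420144067831",

    // With an English (US) culture the dot is the decimal separator.
    ParsedEnUS = Number.FromText(SampleText, "en-US"),

    // With a Dutch culture the dot is a grouping (thousands) separator,
    // so the decimal point effectively disappears and the value comes
    // out orders of magnitude too large -- the "multiplied" effect.
    ParsedNlNL = Number.FromText(SampleText, "nl-NL"),

    Result = [EnUS = ParsedEnUS, NlNL = ParsedNlNL]
in
    Result
```

When a type-change or parse step carries no explicit culture, Power Query falls back to the application and locale settings, which is why identical code can behave differently in Desktop and in a dataflow.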

1 ACCEPTED SOLUTION
AlexisOlson
Super User

The output of the M code should not be affected by application's settings.

I disagree. The application settings govern how the input is interpreted. Under different interpretations, identical data means different things and the output should reflect proper meaning. Since a CSV file doesn't contain reliable locale metadata, there isn't a straightforward way for Power Query to magically know what interpretation you expect it to use unless you specify it in the code or in the application settings.
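The two places where the interpretation can be set are visible in the code itself. A minimal sketch (the `Source` table and `Value` column are assumptions for illustration):

```m
// Locale-dependent: no culture given, so the parse falls back to the
// application / system locale settings.
Typed1 = Table.TransformColumnTypes(Source, {{"Value", type number}}),

// Locale-independent: the culture is pinned in the code, so the same
// query produces the same result on any machine.
Typed2 = Table.TransformColumnTypes(Source, {{"Value", type number}}, "en-US")
```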


8 REPLIES 8
Anonymous
Not applicable

On German Windows 10, Power BI still performs a "wrong" conversion from a dot to a decimal comma even when I specifically instruct M to treat my data as en-US. It stubbornly continues doing this if I change all Power BI Desktop options to en-US, and also if I change the Windows regional settings before loading the data into the model.

I expect the dot to become the decimal point, not a decimal comma.

My input lines come in correctly, comma-separated into three columns, first as strings (General data type). I then force a conversion to Decimal Number with a locale:

#"Changed Type with Locale" = Table.TransformColumnTypes(#"Changed Type-", {{"SHAPE_LAT", type number}, {"SHAPE_LNG", type number}}, "en-US")

 

I receive decimal numbers, but formatted with a comma, which can only come from the underlying German Windows 10, on which Power BI Desktop (x64) is freshly installed. The data comes from a Unicode (code page 65001, UTF-8) CSV file.

The problem can only be in my handling or in Power BI Desktop, but I believe this is a software bug, since I do everything I can to have my data interpreted as en-US. The conversion to a comma doesn't look like a feature; at the very least it is not transparent why it happens or what I could do differently 🙂

Try it yourself: create a map out of this:

SHAPE_LNG,SHAPE_LAT,ID
-122.368931097715,47.6420144067831,1
-122.381958845658,47.5356980313462,1
-122.311580474748,47.5983275587497,2
-122.31988902204,47.6242673355149,3
-122.32235390814,47.6268446727558,4
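For reference, the sample above imports cleanly once the culture is fixed at the type-change step. A sketch assuming the five lines are saved as points.csv (the file path is an assumption):

```m
let
    Source = Csv.Document(File.Contents("C:\data\points.csv"),
                          [Delimiter = ",", Columns = 3, Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),

    // Parse the coordinates as en-US regardless of the machine's locale;
    // the ID column is a plain whole number.
    Typed = Table.TransformColumnTypes(
        Promoted,
        {{"SHAPE_LNG", type number},
         {"SHAPE_LAT", type number},
         {"ID", Int64.Type}},
        "en-US")
in
    Typed
```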

I'm not sure I understand your process, but, if I do, things may be working as they should.

Please correct any process steps I have assumed below.

 

  • Your Windows Regional Settings are German, with comma as the decimal separator
  • The CSV file is comma-separated data, with dot as the decimal separator.
  • You import the CSV file, specifying en-US settings for the type conversion.
  • Your imported file now shows comma as the decimal separator.

 

If the above is the case, then it is as expected.

Your system will process the data as decimal numbers.

 

If you want your system to use the dot as the decimal separator, then you will need to change your Windows Regional Settings.
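One subtlety worth separating out here: the locale used to parse the incoming text is independent of the locale used to render the number on screen. The stored value is the same either way; only the formatting differs. A small sketch:

```m
let
    Value = 47.642,  // the number actually stored in the model

    // Rendering the same number is a separate, culture-specific step:
    EnUS = Number.ToText(Value, "F3", "en-US"),   // "47.642"
    DeDE = Number.ToText(Value, "F3", "de-DE")    // "47,642"
in
    [EnUS = EnUS, DeDE = DeDE]
```

So seeing commas in the preview on a German system does not by itself mean the parse went wrong.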

Anonymous
Not applicable

Thanks for looking into this so quickly. Actually, I was able to resolve the issue myself using the article "Set a locale or region for data (Power Query)" (particularly the paragraph "Use a non-default locale setting on a Change Type operation") and this video: https://youtu.be/QKgS3hrrmvw?t=155. The explanations work well for Power Query in Power BI too.

 

My most important learning was that the automatic type conversion was the culprit: it turned the numbers into Int64.Type. This step needs to be deleted and replaced by a "Change Type > Using Locale..." step, which generates a conversion to number with a culture parameter at the end:

= Table.TransformColumnTypes(#"Replaced Value", {{"SHAPE_LNG", type number}, {"SHAPE_LAT", type number}}, "en-US")

With the culture parameter, I get the decimals right in the model, even though they are rendered with commas (which is what had confused me).
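The fix, i.e. deleting the automatically generated type step and adding a locale-aware one, can be sketched like this (step names are illustrative, not the original query):

```m
// Automatically generated step -- guesses types using the current
// locale and, in this case, produced Int64 values. Delete it:
// #"Changed Type" = Table.TransformColumnTypes(Source,
//     {{"SHAPE_LNG", Int64.Type}, {"SHAPE_LAT", Int64.Type}}),

// Replacement generated by "Change Type > Using Locale...",
// which carries the culture as the third argument:
#"Changed Type with Locale" = Table.TransformColumnTypes(
    #"Replaced Value",
    {{"SHAPE_LNG", type number}, {"SHAPE_LAT", type number}},
    "en-US")
```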

 

For completeness, my apologies for misfiling my question as a Power Query CSV issue. The actual task of rendering the map visual was resolved in Power BI Desktop through the correct use of "Don't summarize".

 

In summary, with culture-specific type conversion in Power Query / M, the coordinates from the CSV worked as expected, despite being rendered with commas. They even work without any type conversion, going into the map visual as strings.

 

The rendering of decimals with commas in Power Query can be changed via the Windows Regional Settings, but I would caution that this may not be desirable because of possible side effects on other processes and legacy applications, so it should only be a last resort.

AlexisOlson
Super User

The output of the M code should not be affected by application's settings.

I disagree. The application settings govern how the input is interpreted. Under different interpretations, identical data means different things and the output should reflect proper meaning. Since a CSV file doesn't contain reliable locale metadata, there isn't a straightforward way for Power Query to magically know what interpretation you expect it to use unless you specify it in the code or in the application settings.

I disagree.

What if you want to import several CSV files, each with its own locale? Or one whose format is not standard?

The obvious way would be an option for the user to define the decimal-point character for each CSV import, the same way the delimiter is defined.
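For a non-standard separator that no built-in culture covers, one workable pattern today is to keep the column as text and normalize it before parsing. A hypothetical helper (the function name and example separator are assumptions):

```m
let
    // Parse text that uses an arbitrary decimal separator by first
    // normalizing it to a dot, then parsing with a fixed culture.
    ParseNumber = (raw as text, decimalSep as text) as number =>
        Number.FromText(Text.Replace(raw, decimalSep, "."), "en-US"),

    Example = ParseNumber("47|642", "|")   // 47.642
in
    Example
```

Applied per column with Table.TransformColumns, this gives each import its own separator without touching any application setting.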

Anonymous
Not applicable

It works as intended. If your values are comma-separated and you're also using commas as decimal separators, how would a program know the difference between the two? They are the same character. So you told Power Query "this is Dutch", and then took the same code and said "this is English".

 

My gas tank doesn't show me liters instead of gallons when I drive the same car in a different country--unless I change a setting 🙂

Just kidding. I have no setting like that. 

--Nate

My gas tank doesn't show me liters instead of gallons when I drive the same car in a different country--unless I change a setting 


 

Just kidding. I have no setting like that. 

 

I have cars with those settings. More handy for switching the speed from mph (US) to km/h (Canada), though.

ronrsnfld
Super User

As far as I know, the CSV file does not include locale information, and I'm not sure how Power Query would be able to infer it from the CSV contents. However, you can tell Power Query the locale of the incoming CSV file in the third argument of the Table.TransformColumnTypes function. With that information, it can do the proper transformations.
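This also covers the several-files-with-different-locales case raised earlier in the thread: each import carries its own culture at its own type-change step. A sketch, assuming GermanCsv and AmericanCsv are hypothetical queries that load the raw text columns:

```m
let
    // Each source is parsed with the culture that matches its file.
    German = Table.TransformColumnTypes(GermanCsv,
        {{"Value", type number}}, "de-DE"),
    American = Table.TransformColumnTypes(AmericanCsv,
        {{"Value", type number}}, "en-US"),

    // After parsing, both tables hold identical, locale-free numbers
    // and can be combined safely.
    Combined = Table.Combine({German, American})
in
    Combined
```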
