Why is decimal precision culture dependent?

While writing a test that asserts on a formatted string, I noticed that the number of decimals in a string formatted with the percent format specifier (P) differs between cultures.

In the example below, en-US uses two decimal digits, while the other sampled cultures use three.

What is the heuristic behind this? Is there a way, other than rounding, of normalizing the different cultures to the same precision?


Console.WriteLine(0.19999d.ToString("P", new CultureInfo("en-US"))); // "20.00%"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("en-GB"))); // "19.999%"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("en"))); // "19.999%"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("nb-NO"))); // "19,999 %"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("nl-NL"))); // "19,999%"

Runtime: 6.0.9
Platform: Windows

> Solution:

They are different because that’s what those cultures have determined to be appropriate. If you want a specific number of decimal places rather than accepting the culture-specific default then include that in your format specifier, e.g. "P2".

Alternatively, you can create your own CultureInfo or NumberFormatInfo that sets the relevant properties as you want.
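A minimal sketch of both approaches, based on the cultures sampled in the question (the exact output strings depend on the culture data shipped with your runtime, so the comments show expected rather than guaranteed results):

```csharp
using System;
using System.Globalization;

class Program
{
    static void Main()
    {
        // Option 1: "P2" forces two decimal places regardless of culture.
        Console.WriteLine(0.19999d.ToString("P2", new CultureInfo("en-GB"))); // "20.00%"
        Console.WriteLine(0.19999d.ToString("P2", new CultureInfo("nb-NO"))); // "20,00 %"

        // Option 2: clone a culture's NumberFormatInfo and override the
        // culture-specific default via PercentDecimalDigits, keeping all
        // other conventions (separators, percent-sign placement) intact.
        var nfi = (NumberFormatInfo)new CultureInfo("en-GB").NumberFormat.Clone();
        nfi.PercentDecimalDigits = 2;
        Console.WriteLine(0.19999d.ToString("P", nfi)); // "20.00%"
    }
}
```

The second option is useful when many format calls share the same culture object and you want the precision fixed in one place rather than repeated in every format string.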
