I’m experimenting with fractions.Fraction and I’m wondering how to convert them into decimal strings, possibly specifying the precision. Say we have:
from fractions import Fraction
a = Fraction.from_float(1e10)
b = Fraction.from_float(1e-10)
c = a + b
str(c) returns a string based on the numerator/denominator of the fraction, i.e. 773712524553362671819689765245533627/77371252455336267181195264.
f"{c:.20f}" raises TypeError: unsupported format string passed to Fraction.__format__.
Converting back to float via float(c) lets me obtain a string, but of course loses the precision again.
What I’m interested in is representing c as a string like "10000000000.0000000001[...]". Is there a way to accomplish that?
> Solution:
I think the best you can do is just piggy-back on top of decimal.Decimal, so:
from fractions import Fraction
from decimal import Decimal
a = Fraction.from_float(1e10)
b = Fraction.from_float(1e-10)
c = a + b
print(f"{Decimal(c.numerator) / Decimal(c.denominator):10.10f}")
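As an aside: to my knowledge, Python 3.12 added float-style format-spec support to Fraction itself, so the f"{c:.20f}" from the question works there directly. A version-guarded sketch (worth verifying on your interpreter):

```python
import sys
from decimal import Decimal
from fractions import Fraction

c = Fraction(1e10) + Fraction(1e-10)  # same value as a + b above

if sys.version_info >= (3, 12):
    # Fraction supports "f"-style format specs from Python 3.12 onward.
    print(f"{c:.20f}")
else:
    # Older versions: fall back to the Decimal detour.
    print(f"{Decimal(c.numerator) / Decimal(c.denominator):.20f}")
```

Both branches print the same 20-decimal rendering of the exact value.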
Actually, I would use:
import decimal

def fraction_to_decimal_string(fraction, precision=16):
    # Do the division inside a local context so the precision change stays scoped.
    with decimal.localcontext() as context:
        context.prec = precision
        decimal_string = str(decimal.Decimal(fraction.numerator) / decimal.Decimal(fraction.denominator))
    return decimal_string
Then you can check:
print(fraction_to_decimal_string(c, 21))
print(fraction_to_decimal_string(c, 30))
print(fraction_to_decimal_string(c, 50))
For example, the above gives me:
10000000000.0000000001
10000000000.0000000001000000000
10000000000.000000000100000000000000003643219731550
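One caveat: the trailing ...3643219731550 digits are not an artifact of Decimal; they are the exact decimal expansion of the binary float 1e-10, which Fraction.from_float preserves faithfully. If you want the mathematically exact 10^10 + 10^-10, build the fractions from integers or strings instead of floats (a sketch):

```python
from decimal import Decimal, localcontext
from fractions import Fraction

a = Fraction(10**10)
b = Fraction(1, 10**10)  # or Fraction("1e-10"): the string form is parsed exactly
c = a + b

print(c)  # 100000000000000000001/10000000000

with localcontext() as ctx:
    ctx.prec = 50
    # The division is now exact, so no spurious digits appear:
    print(Decimal(c.numerator) / Decimal(c.denominator))  # 10000000000.0000000001
```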