I am trying to figure out why the same quick calculation appears to give different results (there shouldn’t be any difference in the math) in Swift and Objective-C:
Swift
let z1 = 1.0
let width = 0.02197265625
let extent = 4096.0
let dx = z1 * width / extent
print(dx)
// 5.364418029785156e-06
Objective-C
double z1 = 1.0;
double width = 0.02197265625;
double extent = 4096.0;
double dx = z1 * width / extent;
NSLog(@"%f", dx);
// 0.000005
Inputting the numbers into a calculator returns what Objective-C provides.
What is Swift doing differently here?
>Solution :
Both languages compute the same number; it is just formatted differently. I assume you understand that the scientific notation 5.36e-06 is the same as 0.00000536.
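A quick way to convince yourself of this is to write both printed forms as Swift literals and compare them (a minimal check; the digits are taken from the two outputs above):

let a = 5.364418029785156e-06        // Swift's printed form
let b = 0.000005364418029785156      // same digits in plain decimal notation
print(a == b)                        // true: both parse to the same Double
print(a.bitPattern == b.bitPattern)  // true: bit-for-bit identical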
The format specifiers of NSLog follow the IEEE printf specification. When describing the specifier f, it says:
If the precision is missing, it shall be taken as 6;
So the value is rounded to 6 digits after the decimal point, without scientific notation (no e-06). For a number this small, that leaves only a single significant digit: 0.000005.
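You can reproduce NSLog’s output from Swift itself with the same printf-style formatting, using Foundation’s String(format:) and the dx from your question (a small sketch):

import Foundation

let dx = 1.0 * 0.02197265625 / 4096.0
// %f with no precision defaults to 6 digits after the decimal point,
// exactly like NSLog(@"%f", ...), so only one significant digit survives.
print(String(format: "%f", dx))      // 0.000005
// Asking for more digits reveals what the default precision rounded away.
print(String(format: "%.18f", dx))   // 0.000005364418029785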
Swift’s print function doesn’t use format specifiers. Instead, the documentation states:
The textual representation for each item is the same as that obtained by calling
String(item).
It evidently chooses scientific notation here and prints all the significant digits needed to identify the value, which for a Double is the shortest form that round-trips back to the same number.
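You can verify that round-trip behavior directly: parsing the string that String(_:) produces gives back exactly the same Double (a minimal check, again using the dx from the question):

let dx = 1.0 * 0.02197265625 / 4096.0
print(String(dx))                // 5.364418029785156e-06, same as print(dx)
// The printed digits are enough to recover the exact Double value:
print(Double(String(dx)) == dx)  // true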