Hello,
as the other answers say, it's a matter of how floating-point values are stored.
atof() is the better choice. It is declared as

    double atof( const char *string );

Use a double variable instead of a float and you will have enough precision for your range of 99 down to 0.000013. Then, when you use the correct format string for printf(), the values are printed correctly, rounded to the number of digits you specified in the format string.
Take care with the format string: the first number is the minimum field width of the whole output, so use "%9.6f" to get the same width on all outputs (at least 9 characters, 6 of them after the decimal point).
Another point: take care when comparing floating-point values. The following comparison may fail, because the result of atof() may be something like 0.1500001, which is not exactly the same as .15:
    double dValue = atof( strInput );
    if( .15 == dValue )   /* exact comparison of floating-point values is unreliable */
    {
    }
Instead, use something like this:
    #include <math.h>   /* for fabs() */

    double dValue = atof( strInput );
    if( fabs( .15 - dValue ) < 0.01 )   /* true if dValue is within 0.01 of .15 */
    {
    }
The comparison tolerance (0.01 here) depends on how much deviation you can accept.
Best regards