
For example, the value 1.7 is read as 1.7000000000000002.

I guess the JSON utility is using a float data type that cannot store exactly 1.7, so it stores the closest possible value: 1.7000000000000002.
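In Python, for instance, you can see the exact binary value a float actually holds by converting it to Decimal (a quick illustration of the general problem, not anything FME-specific):

```python
from decimal import Decimal

# Converting a float to Decimal exposes the exact value the double stores.
# 1.7 has no finite base-2 representation, so the nearest double is used,
# which is slightly below 1.7: 1.6999999999999999...
print(Decimal(1.7))
```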

I need to round the value to get rid of the extra digits.

But I don't know the string representation of the value in the JSON file, so I don't know the scale of the value. The only possibility I have is to set a fixed number in the AttributeRounder's decimal places.

This is highly irritating!

It would be better if the JSON reader tool would also return the scale of the float so I can round the value properly.

Or it would be nice if I could set the tool to return the float value as a string, so I could do the conversion into a number myself.
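For comparison, Python's standard json module already supports exactly this: the caller can intercept numeric literals before they become floats. A minimal sketch (FME's JSON reader may not expose an equivalent option):

```python
import json
from decimal import Decimal

# parse_float receives the raw string from the file, so converting it
# to Decimal preserves the textual value "1.7" exactly.
data = json.loads('{"x": 1.7}', parse_float=Decimal)
print(data["x"])                      # the exact value 1.7
print(data["x"].as_tuple().exponent)  # -1, i.e. one decimal place
```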

Do you know a workaround for this problem?

 

This page is specifically for Python, but the general concept applies to all computer software, and explains what you're observing:

https://docs.python.org/3/tutorial/floatingpoint.html

There is no notion of precision or scale for floating point numbers in JSON. If that is needed, you may want to consider transmitting the floats as strings instead.
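A minimal sketch of the strings approach in Python, assuming you control both the producer and the consumer of the file:

```python
import json
from decimal import Decimal

# Producer: write the number as a string so its textual form survives.
payload = json.dumps({"price": str(Decimal("1.7"))})

# Consumer: convert back explicitly, keeping full control of precision.
price = Decimal(json.loads(payload)["price"])
print(price)  # 1.7
```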

I disagree: every programming language has float data types, but also a BigDecimal-style data type (BigDecimal in Java, decimal.Decimal in Python) where the value is stored internally like this: for example, 1.7 is stored as value: 17, scale: 1.
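Python's decimal.Decimal does indeed work that way, and its internal representation can be inspected with as_tuple() (a quick illustration of the value/scale idea):

```python
from decimal import Decimal

# A Decimal created from the string "1.7" stores the digits (1, 7) with
# exponent -1, i.e. 17 * 10**-1 -- exactly the value/scale scheme above.
t = Decimal("1.7").as_tuple()
print(t.digits)    # (1, 7)
print(t.exponent)  # -1
```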

There is no need to use a floating-point type for reading string-based JSON files, since no heavy calculations are done there that would need the CPU's floating-point capabilities. You aren't doing 3D graphics calculations.

It is also strange that most FME functions have no problem with a value like "1.7"; only with the JSON reader do you face this false-precision problem. Perhaps the programmers of the JSON module should think about switching their data type.

 



 

It's possible to make the argument that FME could use fixed-point data types internally when parsing JSON, but they can be an order of magnitude slower than floating-point data types, which can make a noticeable difference when processing large datasets.

Another argument is that FME could then return unexpected (i.e. non-standard) results in certain edge cases (see e.g. https://randomascii.wordpress.com/2020/09/27/floating-point-in-the-browser-part-1-impossible-expectations/ and the following posts), which I think would be a mistake from an interoperability perspective. At least that's my personal opinion, but then again, I'm not a developer at Safe ;-)

You can always use an AttributeRounder after reading your JSON dataset if being 0.0000000000000002 off causes problems.
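In Python terms, that AttributeRounder step amounts to an ordinary round to a fixed number of decimal places (a sketch, assuming one decimal place is enough for the data in question):

```python
# Rounding the slightly-off float back to one decimal place
# recovers the intended value.
x = 1.7000000000000002
print(round(x, 1))  # 1.7
```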



 

Sorry, I'm not trying to be difficult, but maybe somebody on the FME development team will see the problem and address it in a future release.

You are right, I do use the AttributeRounder. But you need two things for the AttributeRounder: the number and the "decimal places".

So the question now is: where do you get the decimal places from?

To get back the value 1.7, you would need to round the value 1.7000000000000002 with 1 decimal place.

But for a value like 1.7443342, you would need to round the value 1.7443342000000002 with 7 decimal places. (Just to give an example; it is probably not the exact floating-point representation, since I typed the digits by hand.)

Of course, you may be lucky and able to set a fixed value for the "decimal places", because your calculations don't need more than a certain precision.
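If you do have access to the original textual value, the scale can be read straight off the string. A hypothetical helper in Python (decimal_places is my own name, not an FME or stdlib function, and it assumes plain decimal notation without an exponent like 1e-7):

```python
def decimal_places(text: str) -> int:
    """Infer the scale from the textual form of a number.

    Assumes plain decimal notation, e.g. "1.7" or "17" (no "1e-7").
    """
    return len(text.split(".")[1]) if "." in text else 0

print(decimal_places("1.7"))        # 1
print(decimal_places("1.7443342"))  # 7
```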

 

On the speed issue you are right: for geographic calculations on large datasets you need float values. But if you internally remember the scale of the float value, you can always return the right value to the user.

So you store the value as the float 1.7000000000000002 (scale 1) and return 1.7 to the user when they want to put it into an XML output file.
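That scheme is easy to sketch in Python: keep the float for computation and the scale for output formatting (the value/scale names are just for illustration):

```python
# Store the (possibly inexact) float together with the original scale,
# then format with that scale when writing the value back out.
value, scale = 1.7000000000000002, 1
print(f"{value:.{scale}f}")  # 1.7
```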



 

For what it's worth, I think it's a very interesting discussion and much can be said for both approaches :-)

