On Fri, Sep 25, 2020 at 7:07 AM Andrzej Krzemienski via Boost wrote:

> Are JSON numbers only good for storing int-based identifiers?
The JSON specification is silent on the range and precision of numbers. All we know is that JSON is a "light-weight data interchange format." However, we can gather quite a bit of anecdotal evidence simply by looking at the various languages that have built-in support for JSON.
From RFC7159 (https://tools.ietf.org/html/rfc7159)
This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754-2008 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.

Note the phrase "widely available."
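
As an aside, here is what the RFC's two examples actually do to a binary64-based reader. A minimal sketch in plain C++ (no JSON library involved), using strtod the way most double-only parsers do:

    #include <cerrno>
    #include <cstdio>
    #include <cstdlib>

    int main()
    {
        // 1E400 overflows binary64: strtod returns HUGE_VAL (inf)
        // and sets errno to ERANGE.
        errno = 0;
        double big = std::strtod("1E400", nullptr);
        std::printf("1E400 -> %g (ERANGE: %d)\n", big, errno == ERANGE);

        // binary64 holds only ~17 significant decimal digits;
        // the remaining digits of pi are silently discarded.
        double pi = std::strtod(
            "3.141592653589793238462643383279", nullptr);
        std::printf("pi -> %.17g\n", pi); // 3.1415926535897931
    }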
From https://stackoverflow.com/questions/13502398/json-integers-limit-on-size
As a practical matter, Javascript integers are limited to about 2^53 (there are no integers; just IEEE floats).
...a 64-bit integer cannot be represented in JSON (since JavaScript and JSON support integers up to 2^53).
When to use? Only in some special cases. For example when you have to create some sort of data processing middleware which has to process arbitrary JSON without risk of screwing up. JSON objects containing big numbers are rare in the wild.
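
To make the 2^53 limit concrete, here is a small sketch (C++ standing in for JavaScript) of a 64-bit identifier passing through a parser that, like JSON.parse, stores every number as a double:

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        // 2^53 + 1 is the first integer binary64 cannot represent.
        std::uint64_t id = (1ull << 53) + 1;       // 9007199254740993
        double d = static_cast<double>(id);        // what the parser stores
        auto back = static_cast<std::uint64_t>(d); // what the app reads

        std::printf("sent:     %llu\n", (unsigned long long)id);
        std::printf("received: %llu\n", (unsigned long long)back);
        // received: 9007199254740992 -- silently off by one
    }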
From https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Obj...
The JavaScript Number type is a double-precision 64-bit binary format IEEE 754 value, like double in Java or C#....When parsing data that has been serialized to JSON, integer values falling outside of this range can be expected to become corrupted when JSON parser coerces them to Number type. A possible workaround is to use String instead.
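
For completeness, here is what that String workaround looks like with Boost.JSON. A sketch only; the "id" key and its value are made up for illustration:

    #include <boost/json.hpp>
    #include <cstdint>
    #include <string>

    namespace json = boost::json;

    int main()
    {
        std::uint64_t id = 9223372036854775807ULL; // will not fit in a double

        // Producer: emit the identifier as a string so that a
        // double-only consumer (e.g. JavaScript) can round-trip it.
        json::object obj;
        obj["id"] = std::to_string(id);
        std::string wire = json::serialize(obj); // {"id":"9223372036854775807"}

        // Consumer: parse the string back explicitly.
        json::value jv = json::parse(wire);
        std::uint64_t back =
            std::stoull(jv.at("id").as_string().c_str());
        // back == id, with no precision lost in transit
    }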
From https://docs.python.org/3/library/json.html#implementation-limitations
When serializing to JSON, beware any such limitations in applications that may consume your JSON. In particular, it is common for JSON numbers to be deserialized into IEEE 754 double precision numbers and thus subject to that representation's range and precision limitations.

I am actually now starting to wonder if even 64-bit integer support was a good idea, as it can produce numbers which most implementations cannot read with perfect fidelity.

It is true that there are some JSON implementations which support arbitrary-precision numbers, but these are rare, and all come with the caveat that their output will likely be incorrectly parsed or rejected by the majority of implementations. This is quite an undesirable feature for an "interoperable, data-exchange format" or a vocabulary type.

Support for arbitrary-precision numbers would also not come without cost. The library would be bigger, in a way that the linker can't strip (because of switch statements on the variant's kind). Everyone would pay for this feature (e.g. embedded users), but only a handful would use it.

There is overwhelming evidence that the following statement is false: "json::value *needs* to support arbitrary numbers. It's incomplete without it."

Thanks
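
P.S. For anyone wondering what "switch statements on the variant's kind" means in practice, here is a sketch. weight() is a made-up visitor, but every accessor, serializer, and comparison in the library has this shape; a hypothetical bignum kind would add a case (and the code behind it) to each such switch, and the linker cannot strip any of it:

    #include <boost/json.hpp>
    #include <cstddef>

    namespace json = boost::json;

    // A made-up visitor, for illustration only.
    std::size_t weight(json::value const& jv)
    {
        switch (jv.kind())
        {
        case json::kind::null:    return 0;
        case json::kind::bool_:   return 1;
        case json::kind::int64:
        case json::kind::uint64:
        case json::kind::double_: return 8;
        case json::kind::string:  return jv.as_string().size();
        case json::kind::array:   return jv.as_array().size();
        case json::kind::object:  return jv.as_object().size();
        // case json::kind::bignum: // hypothetical: this case, and the code
        //     ...                  // it calls, lands in every such switch
        }
        return 0;
    }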