I do understand that this issue has already been debated to the point of a new implementation. I just want to ask about a side point I didn't quite understand earlier. François Mauger wrote:
>> In fact, since we are only rendering the characters "[-+e.0-9]", we could use a modified BCD or other compressed format to provide the compression that people typically assume binary formats give.
> ok, this is only a set of 14 glyphs, so it could be hosted in short ints (with 2 bits unused)
Short ints? I think each of the 14 glyphs could be represented in 4 bits, with 2 of the 16 bit patterns left over.
> consider a typical float (relative precision ~1e-7). If one needs to store pi as +0.3141592e+01 (ASCII), that is 14 characters (only 11 if one saves the leading '+' and the exponent's '+0' chars for a positive mantissa and exponent), which could be serialized using 14/11 shorts, i.e. 28/22 bytes. This has to be compared with 4 bytes for a float!
It seems to me that 14 characters drawn from this constrained glyph set could be represented with 14 four-bit "nybbles," i.e. 7 bytes. That is still worse than 4 bytes, but by a factor of less than 2 rather than ~6. Please forgive me if I've misunderstood you.
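
For concreteness, here is a minimal sketch of the nybble packing I have in mind. The glyph-to-code table and the padding convention are arbitrary choices of mine for illustration, not anything from the earlier discussion or an existing implementation:

#include <cstdint>
#include <stdexcept>
#include <string>
#include <vector>

// Map each of the 14 glyphs "[-+e.0-9]" onto a 4-bit code.
// Digits keep their own value; 10-13 cover the four symbols,
// leaving patterns 14 and 15 spare (15 is used below as padding).
static std::uint8_t glyph_code(char c)
{
  if (c >= '0' && c <= '9') return static_cast<std::uint8_t>(c - '0');
  switch (c) {
    case '+': return 10;
    case '-': return 11;
    case 'e': return 12;
    case '.': return 13;
    default: throw std::invalid_argument("glyph outside [-+e.0-9]");
  }
}

// Pack two 4-bit codes per byte, high nybble first:
// "+0.3141592e+01" (14 glyphs) -> 7 bytes.
std::vector<std::uint8_t> pack(const std::string & s)
{
  std::vector<std::uint8_t> out((s.size() + 1) / 2, 0);
  for (std::size_t i = 0; i < s.size(); ++i) {
    const std::uint8_t code = glyph_code(s[i]);
    if (i % 2 == 0) out[i / 2]  = static_cast<std::uint8_t>(code << 4); // high nybble
    else            out[i / 2] |= code;                                 // low nybble
  }
  if (s.size() % 2 != 0) out.back() |= 0x0F; // pad an odd tail with spare pattern 15
  return out;
}

With this, pack("+0.3141592e+01") occupies 7 bytes, against 14 bytes as plain ASCII and 28 bytes if each character were widened to a short.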