
Robert Ramey wrote:
there is still a possibility of a silent loss of data.
This would only be an issue in following cases:
a) A class has more than 255 backward-compatible versions.
b) An archive has more than 64K different classes in it.
That would be 32K different classes, since the data type is int_least16_t, which is signed (maximum 32767).
It is currently inconceivable to me that such a program would ever be written. Setting the sizes of these types to larger values would make every single binary archive bigger and slower.
I also remember paying $800 for 4MB of ram. ;-)
The only thing I might suggest is adding an assertion so that if either of the above were to occur, the program would trap, at least in debug mode. So maybe one might use
assert(t.t < max_value<unsigned char>);
I couldn't find a max_value<> template; I assume you mean assert(t.t <= boost::integer_traits<unsigned char>::const_max);
const unsigned char x = static_cast<unsigned char>(t.t);
assert(t.t < max_value<int_least16_t>);
and assert(t.t <= boost::integer_traits<int_least16_t>::const_max);

Thanks,
Jeff