
there is still a possibility of a silent loss of data.
This would only be an issue in the following cases:

a) a class has more than 255 backward-compatible versions, or
b) an archive contains more than 64K different classes.

It is currently inconceivable to me that such a program would ever be written. Setting the sizes of these types to larger values would make every single binary archive bigger and slower. The only thing I might suggest is adding an assertion so that if either of the above were to occur, the program would trap, at least in debug mode. So one might use something like:

    assert(t.t <= std::numeric_limits<unsigned char>::max());
    const unsigned char x = static_cast<unsigned char>(t.t);

    assert(t.t <= std::numeric_limits<int_least16_t>::max());
    const int_least16_t x = static_cast<int_least16_t>(t.t);
Robert Ramey

Jeff Flinn wrote:
I've posted the following ticket in light of the current bug sprint.

The following save_override overloads cause possible loss-of-data warnings on MSVC8 and XCode 3.1.2/gcc 4.0.1:
    void save_override(const version_type & t, int)
    void save_override(const class_id_type & t, int)
    void save_override(const class_id_reference_type & t, int)
with their respective assignments:
    const unsigned char x = t.t;
    const int_least16_t x = t.t;
While a possible fix would be:
    const unsigned char x = static_cast<unsigned char>(t.t);
    const int_least16_t x = static_cast<int_least16_t>(t.t);
there is still a possibility of a silent loss of data.
We could be safer and use numeric_cast, but that would possibly impact code size and performance, and would introduce a library dependency.
Why do the xxx_type strong typedefs use int rather than the smaller types that are actually serialized?
Any thoughts on the best solution?
Thanks, Jeff