
On Feb 10, 2006, at 8:39 AM, Robert Ramey wrote:
David Abrahams wrote:
Matthias Troyer <troyer@itp.phys.ethz.ch> writes:
It's still not clear what this should be changed to - if in fact it should be changed at all. std::size_t is a candidate - but I was under the impression that there might be interest in defining a special type for this - like collection_size_t or ?
Indeed that's what's needed, and I have all the patches ready that would need to be applied to do it.
Please, guys, get this into 1.34. It's embarrassing and a little frustrating that this problem has persisted so long.
It turns out that the internal library version number is going to be bumped from 3 to 4 in the next release. This has been necessary to implement a correction to the versioning of items of collections. So it's not a bad time to make such a change, if that is indeed what is necessary.
My question is - what is the urgency? The current system would inhibit the serialization of collections of greater than 2G objects. But as far as I know no one has yet run into that problem. So I'm curious - has the usage of a 32-bit count of objects created some problem somewhere else?
We have some vectors that are larger and which we cannot serialize using Boost.Serialization at the moment.

Matthias