
When I had just 420 data samples it all worked fine: I created a vector&lt;Sample&gt; in memory, serialized it, and loaded it on the other side. When I moved to 7000 data samples, the first program (the creator) used up all the memory on the machine and then some.
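
For concreteness, the whole-vector approach presumably looked something like this; the Sample type here is a stand-in with a single field, since the real class wasn't posted:

    #include <fstream>
    #include <vector>
    #include <boost/archive/text_iarchive.hpp>
    #include <boost/archive/text_oarchive.hpp>
    #include <boost/serialization/vector.hpp>

    struct Sample {
        double value;  // stand-in payload; the real class wasn't posted
        template<class Archive>
        void serialize(Archive& ar, const unsigned int /*version*/) {
            ar & value;
        }
    };

    int main() {
        {   // Creator: all 7000 samples are resident before anything is written.
            const std::vector<Sample> samples(7000);
            std::ofstream ofs("samples.txt");
            boost::archive::text_oarchive oa(ofs);
            oa << samples;
        }
        {   // Reader: the whole vector is rebuilt in memory in one go.
            std::vector<Sample> loaded;
            std::ifstream ifs("samples.txt");
            boost::archive::text_iarchive ia(ifs);
            ia >> loaded;
        }
        return 0;
    }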
7000 samples? That doesn't seem very large in today's environment. I can't see why creating an archive should consume any significant memory at all.
Sorry if it sounded like the archive was the problem. The issue was holding 7000 samples in memory in a std::vector, on top of everything else the creator program holds in memory. So I needed to move to holding only one sample in memory at a time.
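
A streaming version of the writer might look roughly like this (same stand-in Sample as above; make_sample() is a hypothetical producer standing in for however the creator generates each sample):

    #include <fstream>
    #include <boost/archive/text_oarchive.hpp>

    struct Sample {  // same stand-in as above
        double value;
        template<class Archive>
        void serialize(Archive& ar, const unsigned int /*version*/) { ar & value; }
    };

    static Sample make_sample(int i) {  // hypothetical producer
        const Sample s = { i * 0.5 };
        return s;
    }

    int main() {
        std::ofstream ofs("samples.txt");
        boost::archive::text_oarchive oa(ofs);
        for (int i = 0; i < 7000; ++i) {
            const Sample s = make_sample(i);  // only one sample resident at a time
            oa << s;                          // written out, then discarded
        }
        return 0;
    }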
For the next version, I believe I changed the assert/exception to check before loading rather than after. That would permit your code above to work as you expect.
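
With that change, the reading side of an open-ended stream could simply load until the archive is exhausted and catch the resulting exception. A sketch, assuming end-of-input throws boost::archive::archive_exception rather than asserting (which is exactly what the version change above addresses):

    #include <fstream>
    #include <boost/archive/archive_exception.hpp>
    #include <boost/archive/text_iarchive.hpp>

    struct Sample {  // same stand-in as above
        double value;
        template<class Archive>
        void serialize(Archive& ar, const unsigned int /*version*/) { ar & value; }
    };

    int main() {
        std::ifstream ifs("samples.txt");
        boost::archive::text_iarchive ia(ifs);
        try {
            for (;;) {
                Sample s;
                ia >> s;   // throws once the input is exhausted
                // ... process s, one sample in memory at a time ...
            }
        } catch (const boost::archive::archive_exception&) {
            // end of archive reached
        }
        return 0;
    }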
OK.
What is the best way? Is adding a terminator byte to Boost.Serialization an option (i.e. so that I don't have to make a terminator version of each data structure I want to serialize)?
Actually, adding a terminator value to the Sample class took 15 minutes and worked the first time. So perhaps it is less fuss than I originally thought to add one to each class I want to archive in an open-ended list like this.

Darren
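
For reference, a sketch of that sentinel-record idea; the is_terminator field and the payload are assumptions, since the actual Sample class wasn't posted:

    #include <fstream>
    #include <boost/archive/text_iarchive.hpp>
    #include <boost/archive/text_oarchive.hpp>

    struct Sample {
        bool is_terminator;  // true only for the sentinel record (illustrative name)
        double value;        // stand-in payload
        template<class Archive>
        void serialize(Archive& ar, const unsigned int /*version*/) {
            ar & is_terminator;
            ar & value;
        }
    };

    int main() {
        {   // Writer: emit the real samples, then one sentinel record.
            std::ofstream ofs("samples.txt");
            boost::archive::text_oarchive oa(ofs);
            for (int i = 0; i < 7000; ++i) {
                const Sample s = { false, i * 0.5 };  // made-up payload
                oa << s;
            }
            const Sample end = { true, 0.0 };
            oa << end;
        }
        {   // Reader: stop at the sentinel, never loading past the end.
            std::ifstream ifs("samples.txt");
            boost::archive::text_iarchive ia(ifs);
            for (;;) {
                Sample s;
                ia >> s;
                if (s.is_terminator)
                    break;
                // ... process s ...
            }
        }
        return 0;
    }

An alternative that avoids touching each class is to write a plain bool "another record follows" flag before every sample and a final false after the last one; the reader loads the flag first and stops when it is false.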