I'm not aware of serialization causing such a problem. You might investigate std::vector::resize(), etc., to see whether the vectors really contain a lot of null data. Robert Ramey
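A minimal sketch of the kind of thing suggested here (sizes and file name are hypothetical): a vector that has been resize()d well beyond the number of elements actually filled in still serializes every element, so the archive ends up carrying long runs of zero bytes.

#include <fstream>
#include <vector>
#include <boost/archive/binary_oarchive.hpp>
#include <boost/serialization/vector.hpp>

int main() {
    std::vector<double> v(1000000);   // one million default-initialized doubles
    v[0] = 3.14;                      // only a few elements are ever assigned

    std::ofstream ofs("data.bin", std::ios::binary);
    boost::archive::binary_oarchive oa(ofs);
    const std::vector<double>& cv = v;
    oa << cv;                         // all 1,000,000 entries are written, mostly zeros
    return 0;
}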
Sascha Ochsenknecht wrote:

Hello,
I'm using the Boost Serialization library to store my data structure. I want to use the binary archive types by default:

boost::archive::binary_oarchive(ostream &s)  // saving
boost::archive::binary_iarchive(istream &s)  // loading
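A minimal usage sketch of these two archives, with a hypothetical Data struct standing in for the real data structure:

#include <fstream>
#include <boost/archive/binary_oarchive.hpp>
#include <boost/archive/binary_iarchive.hpp>

struct Data {
    int id;
    template <class Archive>
    void serialize(Archive& ar, const unsigned int /*version*/) {
        ar & id;
    }
};

int main() {
    {   // saving
        std::ofstream ofs("archive.bin", std::ios::binary);
        boost::archive::binary_oarchive oa(ofs);
        const Data d = {42};
        oa << d;
    }
    {   // loading
        std::ifstream ifs("archive.bin", std::ios::binary);
        boost::archive::binary_iarchive ia(ifs);
        Data d;
        ia >> d;
    }
    return 0;
}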
But I noticed that these files can be very big compared to the stored data. I got a binary archive of around 1.5 GByte. That might be plausible, but when I compress it only ~200 MByte are left (!). It seems there is a lot of 'overhead' or 'redundant' data (I see a lot of '0' bytes when I look at the file in a hex editor).
I tried the gzip (...) filter of the Iostreams library, but I want to avoid this in production because of the increased runtime.
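A rough sketch of that gzip-filter approach, for reference (file name and payload are placeholders): the binary archive simply writes through a Boost.Iostreams filtering stream that compresses on the fly.

#include <fstream>
#include <vector>
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>
#include <boost/archive/binary_oarchive.hpp>
#include <boost/serialization/vector.hpp>

int main() {
    const std::vector<int> data(1000, 7);        // placeholder payload

    std::ofstream file("archive.bin.gz", std::ios::binary);
    boost::iostreams::filtering_ostream out;
    out.push(boost::iostreams::gzip_compressor());
    out.push(file);

    boost::archive::binary_oarchive oa(out);
    oa << data;
    // oa and out are destroyed (and flushed) before 'file' is closed,
    // because they were declared after it
    return 0;
}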
Some information about my data structure (maybe helpful):
- using a lot of pointers
- using a lot of std::vector
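Purely to illustrate the kind of structure described (names are hypothetical): a node holding a raw pointer and a vector of children might serialize like this; note that every object reached through a tracked pointer adds some bookkeeping data (class/object ids) to the archive.

#include <vector>
#include <boost/serialization/vector.hpp>

struct Payload {
    double value;
    template <class Archive>
    void serialize(Archive& ar, const unsigned int /*version*/) {
        ar & value;
    }
};

struct Node {
    Payload* payload;                // serialized through the pointer
    std::vector<Node> children;      // each element is serialized in turn

    template <class Archive>
    void serialize(Archive& ar, const unsigned int /*version*/) {
        ar & payload;                // pointer serialization adds per-object tracking data
        ar & children;
    }
};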
Has anybody run into the same problem? Is there a way to decrease the archive size while storing the same amount of data? What could be a solution? Writing my own archive class, optimized for my data structure?
Thanks in advance,
Sascha