See I foolishly assumed that all your code was bug free :-) lol...sorry
for the nebulous report - here is something a bit more concrete.
The scenario, as stated, is loading a large number of objects that have
been previously serialized, one per file. After around 150 files have
been processed I hit a boost::archive::archive_exception with a stream error.
This occurs only when using the XML archive...if I use the binary
archive I can process as many files as I'd like (and in fact I've tested
many more than I actually need).
The approach is "textbook" in that I've seen this same pattern many
times in sample code, tutorials, etc. The only difference in the code
between binary and xml serialization is my load function:
XML:
#include
Date: Mon, 13 Mar 2006 18:40:32 -0800
From: "Robert Ramey"
Subject: Re: [Boost-users] Known issues loading lots of objects with xml_iarchive?
To: boost-users@lists.boost.org
Message-ID:

There are no "known" issues. In fact the system has no known bugs either. LOL - of course the function of this list is to address the problem. I haven't heard of anyone having this problem. You might investigate a little by seeing if it occurs just in XML or in other archives as well.
Robert Ramey
Christopher Gillett wrote:
I am serializing a LOT of objects, one per file. I can create XML for many thousands of objects with no issue. However, when trying to load a few hundred objects (using a separate program, btw) I am hitting a boost::archive::archive_exception with a stream error while loading around the 161st file. Based on stack traces, I believe it is coming from within boost::archive::basic_xml_grammar&lt;char&gt;::parse_end_tag().
The approach I'm taking is fairly textbook, so I'm curious if there are known issues with loading a lot of files, etc.
Any advice appreciated.
Thanks, Chris Gillett