Hi,

I agree with Robert, except for the number-of-records part. I can confirm the power of the multi-archive stream strategy. To address this problem, I have a Boost-based home-made library that stores several archives within one single file (text, XML or portable binary; gzipped, bzip2ed or uncompressed) [see a thread on this list between Robert and me a few months ago concerning a `multi-archive' mode strategy]. But I don't store the number of archives. It is possible to do, but rather tricky, mainly because one has to deal with EOF detection after the last loaded archive in the stream (catching an exception on the fly, using a kind of high-level `peek'). In the end, it was possible to design a generic `reader' class that loops on the records (archives) in the file without knowing the number of records (see the sketch in the P.S. below).

IMHO, storing an updated number of records induces some annoying bookkeeping code which can be avoided: a companion file is difficult to manage and keep synchronized, and a counter at the start of the file forbids some nice operations (see below). Moreover, I also have tracking activated within each archive (which is very useful for my pointers and saves room!). All the standard `boost::serialization' headers are also preserved. So, at least with uncompressed files, it is possible to `cat' several archive files from a (unix) shell into a single one without any problem, or even to extract one XML archive from the multi-archive file with external tools (Python scripts or whatever...). Of course, this takes some work. If you are interested, I can send you some sample code (without guarantee, just to help).

Good luck.

frc
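P.S. A minimal sketch of this kind of peek-based reader loop (this is not the actual library code mentioned above; the file name and payload type are hypothetical):

// loop over consecutive text archives stored in one file, without
// knowing in advance how many there are
#include <fstream>
#include <istream>
#include <string>
#include <vector>
#include <boost/archive/text_iarchive.hpp>

int main() {
    std::ifstream is("records.txt");              // hypothetical multi-archive file
    std::vector<std::string> records;
    while (true) {
        is >> std::ws;                            // skip whitespace between archives
        if (!is.good() || is.peek() == std::ifstream::traits_type::eof())
            break;                                // end of stream: no more archives
        boost::archive::text_iarchive ia(is);     // each archive keeps its own header
        std::string record;                       // hypothetical payload
        ia >> record;
        records.push_back(record);
    }
    return 0;
}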
Agustín K-ballo Bergé wrote:
Is it possible to append serialized elements to an existing archive?
the serialization library doesn't currently address this situation.
I need to keep a set of records on a file, and periodically add new entries to it (kind of like a database table). Reading the whole set, adding the new records, and serializing the set back is not an option since I expect the set to grow pretty big.
I understand and sympathise.
I have tried to keep several archives at the same stream (with no_header and no_tracking flags),
I believe this might be made to work.
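For reference, archive flags are passed as the second constructor argument; a minimal sketch (the file name and payload are hypothetical):

// append one header-less archive using the no_header / no_tracking flags
#include <fstream>
#include <boost/archive/text_oarchive.hpp>

int main() {
    std::ofstream os("records.txt", std::ios::app);   // open for append
    boost::archive::text_oarchive oa(
        os, boost::archive::no_header | boost::archive::no_tracking);
    int record = 42;                                   // hypothetical payload
    oa << record;
    return 0;
}

Note that reading such a file back requires constructing each text_iarchive with the same no_header flag.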
but then I don't know how to read them. Reading them as if they were a single archive does not work, should it?
no.
Here's an idea.
// append a bunch of archives to a file
main(..){
    ofstream os(..);                // open for append
    {
        oarchive oa(os);
        oa << a bunch of stuff;
    }                               // close archive leaving stream open
    {
        oarchive oa(os);
        oa << a bunch of other stuff;
    }                               // close second archive
}
// now we have a closed stream with several self-contained archives in it
// read a bunch of archives from one file
main(..){
    ifstream is(...);               // open the normal way
    {
        iarchive ia(is);            // open first archive
        ia >> first bunch of stuff;
    }                               // close first archive leaving stream open
    {
        iarchive ia(is);            // open second archive
        ia >> second bunch of stuff;
    }                               // close second archive leaving stream open
}                                   // close input stream
Exercise for the reader: figure out where to save the number of archives in the file. A couple of ideas: 1) a separate companion file, or 2) save a few bytes at the beginning of the file - call this the archive count. Update it after every "append" and read it during load.
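For idea 2), a minimal sketch (the helper name is made up, and it assumes the file was initialized once with a zero-padded counter such as "00000000\n" before the first append):

// keep a fixed-width decimal archive count at the start of the file and
// rewrite it in place after every append
#include <fstream>
#include <iomanip>
#include <string>
#include <boost/archive/text_oarchive.hpp>

const int counter_width = 8;                      // fixed width so it can be overwritten in place

void append_record(const std::string & path, const std::string & record) {
    std::fstream fs(path.c_str(), std::ios::in | std::ios::out);
    unsigned long count = 0;
    fs >> count;                                  // read the current archive count
    fs.seekp(0, std::ios::end);                   // jump to the end to append
    {
        boost::archive::text_oarchive oa(fs);     // write one self-contained archive
        oa << record;
    }                                             // close the archive, stream stays open
    fs.seekp(0);                                  // go back and bump the counter
    fs << std::setw(counter_width) << std::setfill('0') << (count + 1);
}

On load, read the counter first and then construct that many iarchives on the same stream.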
Good luck with this.
Robert Ramey
Agustín K-ballo Bergé.-
--
Francois Mauger
Laboratoire de Physique Corpusculaire de Caen et Universite de Caen
ENSICAEN - 6, Boulevard du Marechal Juin, 14050 CAEN Cedex, FRANCE
e-mail: mauger@lpccaen.in2p3.fr
tel.: (0/+33) 2 31 45 25 12    fax: (0/+33) 2 31 45 25 49