If you have plenty of memory (especially virtual memory), then just make it huge, and don't worry about using it all.
Actually, my data will take about 26 GB... That's why I want to control the memory.
there might be a bit of extra bookkeeping in the vector "header" itself.
That's one of the problems.
a. vectors tend to allocate extra space (think "reserve()" and "capacity()" vs. "resize()" and "size()").
For instance, reserve(1) and reserve(5) take the same amount of memory.
b. chunks of memory are often larger than actually requested (to improve performance, or to stash housekeeping information with each chunk).
Yes. I'd like to control this behaviour.
So whatever value you get for "total_memory", remember that you need to increase it by some amount.
As we are supposed to provide an exact value for the shared memory, there should be a way to know exactly how much memory the data structures consume.
If you have the memory to spare, go for double, and you should be fine; if you're tight, try maybe 20% more, and pay extra attention to your exception paths.
I can't... Thanks a lot for your contribution. I'll switch back to an implementation based on pointers instead of vectors.