Alright, let's assume you are right.
Can you explain the following behaviour: I create a vector<string> and fill
it with a huge number of strings, so that it takes up roughly 50% of my RSS.
After the vector is filled, I observe the RSS and, as expected, it's quite
big. Then I clear the vector and let it run out of scope, and something
special happens: the RSS decreases by a significant amount (close to what it
was before I created the vector)! This should not happen in your scenario,
right? The OS does not need the memory, yet it still gets reclaimed.
So I'm really sorry, but this behaviour, where memory is sometimes held on
to and sometimes not, makes no sense to me. I do understand that the OS
memory management tries to optimize access to memory, and hence not all
memory might be reclaimed right away, but not on this scale. I've never seen
a program hold on to memory forever until the OS reclaims it. And I'm not
talking about a few bytes here and there; I'm talking about hundreds of
megabytes.
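For completeness, here is a minimal sketch of the "sometimes yes, sometimes no" split as I understand the usual explanation (this assumes glibc on Linux; malloc_trim() and M_MMAP_THRESHOLD are glibc-specific, and the ~128 kB threshold is the documented default, not something I measured):

```cpp
#include <cstdlib>
#include <malloc.h> // glibc-specific: malloc_trim()

int main() {
    // Allocations larger than M_MMAP_THRESHOLD (glibc default ~128 kB) are
    // served by mmap() and unmapped again on free(), so RSS drops at once.
    void* big = std::malloc(100 * 1024 * 1024);
    std::free(big); // returned to the OS immediately

    // Many small allocations come from the program-break/arena heap. Freeing
    // them only shrinks RSS if the freed space sits at the top of the heap;
    // malloc_trim(0) explicitly asks glibc to return such pages to the OS.
    void* small[1000];
    for (void*& p : small) p = std::malloc(1024);
    for (void*& p : small) std::free(p);
    malloc_trim(0); // may give freed heap pages back to the OS
}
```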
By the way, the behaviour described for a vector<string> also occurs with a
vector