
On 11/12/2007, "JOAQUIN LOPEZ MUÑOZ" <joaquin@tid.es> wrote:
> I don't think so: even if you manage to implement an iterator-conscious, order-preserving, sensible rehashing mechanism, getting new memory can throw std::bad_alloc, which is not allowed (only exceptions thrown from Hash or Pred are permitted)--unless you catch such exceptions internally... but that is rather artificial. My opinion is that the writers of this piece of the C++0x standard didn't consider rehashing a permissible operation when erasing.
If they don't, they should say so explicitly, because it's easy to implement a hash table that meets the requirements. For example, the buckets could be an array of pointers into a single list of nodes kept ordered by their hash values (a sketch is below). Rehashing then has strong exception safety and doesn't disturb the order of the elements. Memory use is the same, and performance will be fine, perhaps a little slower since the hash function has to be of good enough quality. The gains are notable: elements are never lost during rehashing, iterators are never invalidated by inserts, memory use can be reduced when a large number of elements are erased, and iteration is fast. And if that doesn't work, who knows what might be devised in the next ten years?

Daniel
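
P.S. For concreteness, here is a minimal sketch of the layout I mean. The names are mine, and so is one particular choice: a power-of-two bucket count indexed by the top bits of the hash, so that each bucket is a contiguous run of the hash-ordered list. That choice is what makes the "hash quality" caveat above bite, since it relies on the hash spreading its high bits well. It's a sketch, not a real implementation:

#include <cstddef>
#include <functional>
#include <iostream>
#include <limits>
#include <list>
#include <string>
#include <utility>
#include <vector>

template <class T, class Hash = std::hash<T>>
class hash_ordered_table {                        // illustrative name
    using node_list = std::list<std::pair<std::size_t, T>>; // (hash, value)
    using node_iter = typename node_list::iterator;

    node_list nodes_;                 // sorted ascending by cached hash
    std::vector<node_iter> buckets_;  // head of each bucket; end() if empty
    unsigned shift_ = 0;              // bucket index = hash >> shift_

    std::size_t bucket_of(std::size_t h) const { return h >> shift_; }

public:
    hash_ordered_table() { rebuild_buckets(8); }

    // The entire rehash: rebuild only the vector of iterators. The node
    // list is untouched, so element order and iterators survive, and the
    // only operation that can throw is the vector allocation, which
    // happens before any state changes: strong exception safety.
    void rebuild_buckets(std::size_t count) {     // count: power of two >= 2
        unsigned k = 0;
        while ((std::size_t(1) << k) < count) ++k;
        std::vector<node_iter> fresh(std::size_t(1) << k, nodes_.end());
        unsigned shift = std::numeric_limits<std::size_t>::digits - k;
        for (auto it = nodes_.begin(); it != nodes_.end(); ++it)
            if (fresh[it->first >> shift] == nodes_.end())
                fresh[it->first >> shift] = it;   // first node of its run
        buckets_.swap(fresh);                     // commit; cannot throw
        shift_ = shift;
    }

    void insert(const T& v) {
        std::size_t h = Hash{}(v);
        // Insert before the first node whose cached hash is >= h, which
        // keeps the list sorted. (A real implementation would start the
        // scan at a bucket head rather than the front of the list.)
        node_iter pos = nodes_.begin();
        while (pos != nodes_.end() && pos->first < h) ++pos;
        node_iter it = nodes_.insert(pos, {h, v});
        std::size_t b = bucket_of(h);
        if (buckets_[b] == nodes_.end() || buckets_[b] == pos)
            buckets_[b] = it;                     // new head of bucket b
        if (nodes_.size() > 2 * buckets_.size())
            rebuild_buckets(2 * buckets_.size()); // grow; nodes never move
    }

    bool contains(const T& v) const {
        std::size_t h = Hash{}(v), b = bucket_of(h);
        for (auto it = buckets_[b];
             it != nodes_.end() && bucket_of(it->first) == b; ++it)
            if (it->first == h && it->second == v) return true;
        return false;
    }
};

int main() {
    hash_ordered_table<std::string> t;
    t.insert("rehash");
    t.insert("erase");
    std::cout << t.contains("rehash") << ' ' << t.contains("insert") << '\n';
}

Shrinking after heavy erasure is just a call to rebuild_buckets with a smaller power of two, and it is safe in exactly the sense the clauses want: if the allocation fails, no element and no iterator has been touched.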