
Dave Harris wrote:
Fourthly, there seems to be a design philosophy behind the current proposal which is not stated explicitly in the documentation. It apparently favours some notion of safety over speed and hash quality, where "safety" means that different objects should have the same hash values if they look at all alike to someone.
"Safety" is not the correct way to put it. "Predictability" or maybe "stability" is the proper word. This is also the reason to fix the implementation, not usually the case when aiming for standardization. The problem is that hash functions vary wildly in quality. Leaving the default function unspecified (or implementation defined) is good if you are lucky and the implementation you happen to use is good. It's a tradeoff. A fixed and predictable default means that you don't have to measure the quality of the hash function for every implementation to which you port your program. Your original tests are still valid. The other side of predictability - that the hash function is not affected by types or the particular representation of a sequence of values - means that you don't necessarily have to re-run the performance benchmark of your unordered_maps when you refactor the key to use int[3] instead of struct { short, short, short }, or a wstring instead of a string.