Re: [boost] [serialization] a proposal for an alternative to the new const-saving rule

----- Original Message ----- From: David Abrahams <dave@boost-consulting.com> Date: Friday, June 24, 2005 3:49 pm Subject: Re: [boost] [serialization] a proposal for an alternative to the new const-saving rule
Joaquín Mª López Muñoz <joaquin@tid.es> writes:
The new rule introduced in Boost.Serialization that forbids saving of non-const objects has proved a little controversial. My point of view is that the rule, albeit far from perfect, provides some level of safety against hard-to-track errors. Others' opinions differ.

The current rule is a rough approximation to what IMHO would constitute the right enforcement: every time a trackable object is saved, check whether an object with the same address has been previously saved and, if so, make sure that the object didn't change.

I don't understand what bugs this is going to catch. Certainly in this case,
for ( ... ) {
    X x( ... );
    ar << x;
}
the low-level problem isn't that x is changing, but that it's a different object each time.
Of course.
Two temporally different x's might well be identical.
Yes, but saving the same unchanged object twice, which is perfectly legal, passes the hash test, so the rule admits all valid cases and detects at least some of the invalid ones --in practice, most of them, actually.
But all of this misses the high-level problem: the author of the code doesn't know what he's doing. You simply can't serialize objects from distinct scopes with tracking into the same archive, because there may be aliasing.
Totally agreed, this is what we are trying to detect in order to protect the author of the code.
And there's nothing we can reasonably do to detect that problem when the aliased objects have the same type
Nothing? I'm afraid I don't get you. A perfect aliasing detection mechanism is probably impossible to implement, but the hash test at least approximates it. This is better than providing no safety mechanism, as I understand you advocate.

Or put another way: if the hash test fires the alarm, we are *sure* the user's code was incorrect. And if the user's code is correct, the hash test will pass.

If you allow me to draw an analogy, this is similar to primality testing in maths: a 100% accurate primality test is complexity-wise unfeasible, but many statistical tests are fast, do not yield false negatives and get most of true negatives right.

Am I being too cryptic? Sometimes my English skills play tricks on me.

Joaquín M López Muñoz
Telefónica, Investigación y Desarrollo

"JOAQUIN LOPEZ MUÑOZ" <joaquin@tid.es> writes:
From: David Abrahams <dave@boost-consulting.com>
But all of this misses the high-level problem: the author of the code doesn't know what he's doing. You simply can't serialize objects from distinct scopes with tracking into the same archive, because there may be aliasing.
Totally agreed, this is what we are trying to detect in order to protect the author of the code.
And there's nothing we can reasonably do to detect that problem when the aliased objects have the same type
Nothing? I'm afraid I don't get you. A perfect aliasing detection mechanism is probably impossible to implement, but the hash test at least approximates it. This is better than providing no safety mechanism, as I understand you advocate.
I'm not sure it is. There's an imposition on users: all the types they want to serialize have to support hashing. It is nice that the serialization library automatically takes care of hashing aggregated types and leaving out the unserialized data... uh, wait: this will never work unless you plan only to do shallow hashing. Otherwise you will get an exponential explosion for some object graphs. Is that your intention?
Or put another way: if the hash test fires the alarm, we are *sure* the user's code was incorrect. And if the user's code is correct, the hash test will pass.
If you allow me to draw an analogy, this is similar to primality testing in maths: a 100% accurate primality test is complexity-wise unfeasible, but many statistical tests are fast, do not yield false negatives and get most of true negatives right.
Am I being too cryptic? Sometimes my English skills play tricks on me.
No, the principle is familiar to me -- you don't need to try to explain it. I'm getting more comfortable with the idea; it was hard to accept at first because it looked too much like using const, but in retrospect it is starting to look like a good idea if the hashing is shallow and you can turn the checking off.

--
Dave Abrahams
Boost Consulting
www.boost-consulting.com
participants (2)
-
David Abrahams
-
JOAQUIN LOPEZ MUÑOZ