
On 10/11/07, Simonson, Lucanus J <lucanus.j.simonson@intel.com> wrote:
> gpd wrote:
>> [...] That might have been true some years ago. Current compilers are pretty aggressive when optimizing and will exploit every dark corner of the spec that they can. And they will do so even more in the future.
>
> Well, let's think about this critically. If the compiler started doing optimizations that caused code that casts between base class and derived class and back again to function incorrectly, because the compiler decided the spec told it that those were different types and couldn't be the same pointer address, what percentage of real C++ applications would be broken?
A lot. That's why many projects are stuck with obsolete compilers or with suboptimal optimization options.
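For concreteness, the base/derived round trip being discussed looks roughly like the following minimal sketch (the names point_data and point_view are invented for illustration, not taken from any Boost library):

struct point_data { int x, y; };

struct point_view : point_data {
    int get_x() const { return x; }   // adds behavior only, no data members
};

int read_x(point_data &p)
{
    // Compiles and works on today's compilers, but p's dynamic type is
    // point_data, so using it through a point_view reference is not
    // something the standard actually guarantees.
    return static_cast<point_view &>(p).get_x();
}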
> Would it be worse than when the compiler started assuming a pointer to a float couldn't be the same address as a pointer to an int?
No, it is pretty much the same thing, and it is equally bad.
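To make the comparison concrete, here is a minimal sketch (not code from the thread) of the int/float case. Under strict aliasing rules the compiler may assume *ip and *fp never refer to the same object:

#include <cstdio>

int alias_int_float(int *ip, float *fp)
{
    *ip = 1;
    *fp = 2.0f;   // assumed not to modify *ip
    return *ip;   // may be folded to the constant 1 even when ip and fp alias
}

int main()
{
    int i = 0;
    // Passing the same storage through both pointers is undefined behaviour;
    // this is exactly the freedom optimizers exploit.
    int r = alias_int_float(&i, reinterpret_cast<float *>(&i));
    std::printf("%d\n", r);   // commonly prints 1 at -O2
    return 0;
}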
> I pointed out that the banana has no wires coming out of it, yet I am being told that we might one day invent wireless transmission of electricity. Even if we did, I'm not convinced it would be used to electrify this particular banana. We can always tell the compiler guys "hey, I'm eating this banana, please don't electrify it."
They'll say "Too bad, we told you we could have electrified it sooner or later. You should have listened to us." It has already happened and will happen again. A couple of months ago there was an "interesting" thread on the gcc-devel mailing list about signed int overflow behavior: some application developers argued that optimizing on the assumption that signed overflow never happens would break lots of applications. The answer they got was pretty much: "then stop relying on undefined behavior!". Sure, they might add an option to disable such an optimization for a while if you ask nicely enough, but:
- the option won't be there forever.
- would you use a library that forces you to a lesser optimization level?
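For reference, the kind of check that thread was about looks roughly like this (a hypothetical sketch, not code from the gcc-devel discussion). Because signed overflow is undefined, the compiler is allowed to assume a + 1 > a always holds and delete the test:

#include <climits>
#include <cstdio>

bool will_overflow(int a)
{
    // Intended as an overflow check, but a + 1 is undefined when a == INT_MAX,
    // so an optimizing compiler may fold the whole function to "return false".
    return a + 1 < a;
}

int main()
{
    std::printf("%d\n", will_overflow(INT_MAX));   // may print 0 when optimized
    return 0;
}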
> Like I said before, if compilers did what you suggest, they would just end up rolling it back because it would break too many things.
Type-based alias analysis is here to stay. I wouldn't be surprised if such code could already break on current compilers given the right circumstances.
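Here is a minimal sketch of what "the right circumstances" could look like (the types LegacyPoint and LibPoint are invented; this is not code from any Boost library). Type-based alias analysis lets the compiler assume a LibPoint* and a LegacyPoint* never refer to the same object, even though the layouts are identical:

#include <cstdio>

struct LegacyPoint { int x, y; };   // a user's pre-existing type
struct LibPoint    { int x, y; };   // an unrelated type with the same layout

int get_x(LibPoint *p, LegacyPoint *q)
{
    p->x = 1;
    q->x = 2;      // assumed not to alias p->x
    return p->x;   // may be folded to 1
}

int main()
{
    LegacyPoint pt = { 0, 0 };
    // Casting between the unrelated types and using both pointers is
    // undefined behaviour, despite the identical layout.
    int r = get_x(reinterpret_cast<LibPoint *>(&pt), &pt);
    std::printf("%d\n", r);
    return 0;
}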
> People are doing egregious things with casting completely unrelated types back and forth all the time.
It is their application and they can do whatever they like, and they will regret it when their application breaks. I do not think we want to consciously introduce this kind of bug in a Boost library.

gpd