-----Original Message----- From: boost-users-bounces@lists.boost.org [mailto:boost-users-bounces@lists.boost.org] On Behalf Of Max Motovilov
I've argued both sides of this issue to death in the past, so I'm certainly not going to do it again. Suffice it to say that if you cannot fix the defective code (typically because it is part of a codebase you are not expected to modify -- for example, part of the library headers shipped with your compiler, or a third-party library you have to use), and you cannot direct the tool to work around it, then you have to throw away one or the other. The real-world choice is often in favor of the bad code, not the good tool. I am not saying that it's right, just that it's the reality.
I'm well aware of the status quo. I'm also aware that when users can easily take backdoor outs that "solve the problem", the problem never actually gets solved. As you say, it is a matter of balance, but in this case, we are talking about a preprocessor whose primary goal is to be conformant to the standard.
Note that Boost libraries (and their developers, which, I understand, means you too :) ) go to great lengths to preserve compatibility with as many compilers as possible, despite sometimes having to work around glaring bugs and/or incompatibilities with the standard. Otherwise the number of Boost users would have been a lot smaller, and irrelevancy is a bigger risk than indirect support of bad practices. This is obviously a matter of balance, but I imagine that the heaviest weight on the opposite side of the scales comes from the expense of supporting the workarounds, not from the fact that those workarounds somehow encourage people to use broken compilers.
It isn't just users. Workarounds in library code encourage implementors not to fix their compilers, and so the situation propagates. As always, other things become priorities--like changing the language however they see fit. Many compilers like to be able to say they support the Boost libraries, but, in reality, it is the Boost libraries supporting them. This limits what Boost can do significantly. I'm fully aware that it takes a broad user base to gain the clout that Boost currently enjoys. However, it would be nice to have a clean Boost library in parallel to the hacked-to-death Boost library (which is what we have now), and challenge implementors to support that.

Back to the concrete case of a preprocessor--more specifically, a preprocessing library. There are many preprocessors out there--full of bugs and permissiveness. The fundamental purpose of Wave is to be a strictly conformant preprocessor. What you ask for violates a fundamental principle of Wave's existence. If you want a preprocessor that emulates some other preprocessor, just use that other preprocessor. Having to retokenize the output is insignificant compared to intentionally breaking another tool. As a library author--or a tool maker, for that matter--I'd rather have fewer, but higher-caliber, clients than many clients that don't care about ideals like Standard C++.
Programming, after all, is not an art unto itself; it is a process of building software for a specific purpose, and with a whole lot of constraints (budget, schedule, learning curve, irrational preferences of management -- you name it...).
It may not be an art unto itself, but it is an art nevertheless. I don't care to support any programmer or organization whose goal is only getting the job done instead of getting the job done right. It reminds me of industries that intentionally make inferior products -- which cost the same to make as their better products -- just so they can sell them for less and justify the prices of the superior products. I don't care to support that mentality.
The fewer constraints are violated by a specific tool, the more likely it is to be used. That's all I'm saying...
I understand your point of view; I just don't agree that this is a case where it is absolutely necessary. I also don't think that anything will ever change unless people draw a line in the sand. Over the years, implementors should have broken code incrementally rather than intentionally preserving broken code. There should be no option in VC++ to change the 'for' loop variable scope, and they've known about the correct 'for' loop scope for many years. It isn't even a transition path; it will probably be around forever. How much more broken code has been written in that time because VC++ has allowed it?

Say, for example, that I'm implementing some library, and this library works only because of a compiler's (or a few compilers') permissiveness. (Normally, I'd just run the code through some strict tool to make sure it is right, but such tools don't exist because they all emulate the permissiveness of said compilers.) So the code doesn't get fixed--maybe I don't even know it is wrong. Users don't have a serious problem because they just switch on (assuming it isn't the default) the allowance in their tools, so I never hear about it as a significant problem. Now some other library author produces a different library, but my library is popular, so this library has to be compatible with the set of allowances that my library requires. It propagates endlessly. (Raise your hand if you like Microsoft's min/max macros or the utter pile of crap that constitutes the Windows headers.) Add another library and another library, and eventually they all become crap because they are so burdened with legacy compatibility. They continue to function, but they don't improve.

There should only be C++ (not Microsoft C++, not Borland C++, not GNU C++). Any code that requires allowances by a compiler is not C++, period. Of course, compilers aren't going to be perfect, but there is a difference between intentional permissiveness and bugs.
For the sake of portability, we will always have to work around bugs, but we shouldn't have to work around permissiveness, nor propagate it.

Regards,
Paul Mensonides