-----Original Message----- From: boost-users-bounces@lists.boost.org [mailto:boost-users-bounces@lists.boost.org] On Behalf Of Max Motovilov
the long-term cost. Maybe we just disagree about how significant that long-term cost actually is.
We probably disagree about the short-term costs, not the long-term ones. The long-term costs are an incremental and mounting burden; the short-term costs are a barrier to entry.
Yes, but not an unbreakable barrier.
E.g. if the new compiler (VC7.1 for the sake of the argument) does not compile the existing code base, the management keeps everybody stuck with the old one (VC6 in this example)
Which is a situation that can only exist for so long before management's hand is forced--the hardware industry isn't going to stop making new hardware because of this, and nobody is going to update VC6. Also, in most cases the existing code base can be tweaked in minor ways so that it still compiles on the old compiler while also compiling on a new one. In the case of the scope of for-loop variables, it doesn't take that much effort to update even massive codebases. Similarly, it takes even less time to fix erroneous macro redefinitions. If some library code that you aren't legally allowed to modify requires the permissiveness, complain loudly to the authors of that library. If VC7.1, for example, didn't allow the old for-loop behavior, things would change faster than you might think. It wouldn't be just one client complaining to the authors of that third-party library. Even if the library authors refused, the market would produce alternatives. I realize that it isn't as easy as I'm making it sound, but sometimes sacrifices are necessary. Sometimes more effort must be expended in the short term to reduce effort in the long term. The willingness to do so, coupled with idealism and curiosity, is what ultimately drives technology.
and bye-bye Boost, or most of it anyway. Yes, in principle it would have been better to resolve a few minor things such as incorrect scope of 'for' variables, or whatnot, but it's anyone's guess how it will turn out in every particular case.
Any given particular case is unimportant when compared to the general case and the effect on the whole, especially looking into the future. It is unfortunate that a great deal of management is so short-sighted, and it is unfortunate that some programmers will be "left out in the cold", as you put it. But that is unlikely to happen in most situations--not because management isn't clueless, but because management would eventually be forced to move. Of course, this is an entirely hypothetical scenario in which (e.g.) Microsoft is willing to break client code to do what is right--which isn't going to happen; they have a long and continuing track record of doing what is wrong. In this particular case, Wave, we are in a unique position in having control over an implementation, and we are at a point where we (ultimately, meaning Hartmut) can choose to stand up for what is right or choose to propagate the status quo.
VB-like, it may have come out higher in number of users than it has now.
Well, as we both know, the beauty of C++ is that it can be as VB-like as you want.
Right, but it does take some background work and some self-enforced limitations.
Didn't take me all that long to get all of VB's syntactic simplicity of access to COM and dispatch interfaces (special thanks for PP and enable_if go here...) -- too bad I can't put that library into open source right now...
Interesting. This has nothing to do with anything, but way back I used to use VB, and I was continually knocking my head against the wall trying to do some things. So, I moved to C++, but it wasn't a direct move. Instead, my first projects in C++ were COM components (the manual way, not with ATL) so that I could easily use them from VB. Of course, I eventually recovered from the VB disease. :) Nowadays, I think there are better ways than COM or CORBA for writing reusable components. (And, incidentally, those better ways don't involve making all languages a thin veneer over one runtime.)
The biggest problem is that users often just don't know what functionality and convenience are available in C++ for the price of asking. C++ is still thought of as a strange monstrosity fit for a select few and, if it cannot be avoided altogether, best "managed" [pun intended] by using design and coding practices dating back ten years or worse.
Yes.
I wonder if the best thing that could possibly happen to C++ at the moment would be a 4th edition of Bjarne's book, based on Boost to the same degree that the 3rd was based on the standard library. I know there are good recent books that explore certain corners of modern C++ thinking, but they somehow don't seem to resonate. What little I've seen of college-level C++ curriculum looked quite atrocious.
I doubt this will ever change regardless of what books exist. I don't believe this is always true (and it is nothing but my own conjecture, at that), but the reason good mathematics and science professors teach is often so that they can work on their own projects or studies with the facilities provided by a university. The same kind of environment typically doesn't exist in computer science. Nowadays you don't need a school's facilities to work on your own projects. You don't need particle accelerators or electron microscopes. Supercomputers are usually necessary only when your projects aren't about computer science, but about something else that happens to use computer science.
vendors making C++ more VB-like anyway.) The point is that popularity is not the only measure of success. In fact, popularity is often an indication of lower quality (e.g. MFC and pop music).
Unfortunately, while popularity may be a poor indicator of quality, it is a very strong indicator of relevance. Almost to the point of being one and the same thing ;)
Relevance to what, exactly? I don't mean to be snide; I just want a clarification. As I've said before (not just in this conversation), I don't care to support abject stupidity. If that means that fewer people use what I write, so be it. (I'm certainly not trying to imply that I am without flaws, BTW.) I will not sacrifice my principles for the sake of popularity or wealth--even if the alternative is marginalization. I am aware that I'm taking the hard line here. We'd be in a much better position if compiler vendors had done so as well, and the situation will only get worse as time goes on--if anything will ultimately kill C++, it is this. Again, I'm not advocating the removal of workarounds in source code--that isn't practical. I am against adding permissiveness to implementations, and in favor of removing existing permissiveness (i.e. workarounds *for* source code) from them. Once the permissiveness is there, it becomes a normal feature to users (rather than a compatibility hack). Furthermore, the permissiveness required by a library forces all users to give up the checking that a less permissive tool would give them.
I've done it for the part of Boost that I consider my responsibility. (As usual, my problem is documentation--very important--which I hate writing and apparently suck at anyway.)
I don't think Boost-PP in particular suffers from deficient documentation. Again, what it needs is a massive how-to, which is best delivered in the form of a [chapter of a] book.
Which is where it is deficient, and that is the part that I find the most difficult. Specifically because I find it difficult to put myself in the shoes of those without my expertise. When I have a conversation with a particular person, I can gauge their understanding of what I'm saying by their responses. With documentation, the audience is abstract and the conversation is far less tailored to need.
On the other hand, Boost-PP is more of a toolmaker's tool than an end-user's tool anyway...
Yes, it is.
My first observation based on that experience is that the pp-lib is stagnant. There isn't much I can do to improve it (even with massive workarounds) without much better implementations as the common starting point.
I think for starters it would be nice to get it better adopted by other parts of Boost (not a dig at you by any means, but at those who can't be bothered to use it).
That's fine, but the pp-lib remains stagnant even with greater use. As far as preprocessors are concerned, things are getting better one by one. Metrowerks used to be horrible (if not the worst), but now it can handle most of Chaos. EDG's new front ends (not yet in an official release of Comeau, BTW) can handle it too and should be significantly faster. GCC can handle it as of the last several releases. Walter just rewrote the Digital Mars preprocessor, and it is a 100% turnaround. Unfortunately, there are still some preprocessors that must be supported by Boost, like VC's, that aren't even in the ballpark. If the pp-lib were to become incompatible with the needs of the rest of Boost, the rest of Boost couldn't use it, and, as you mentioned, it is a toolmaker's tool.
There's just no excuse for a library to introduce an arbitrary, unconfigurable limit on the number of supported arguments in a user's function, on the length of a [variable by design] list of template parameters, etc.
I agree. Regards, Paul Mensonides