
On Mon, 23 Apr 2012 14:08:02 -0700, Paul Fultz wrote:
> Of course, a more efficient approach is taken in Chaos, but perhaps the
> recursion backend could be implemented on MSVC, even if it will be
> slower--which is OK, since I don't develop on MSVC; I just need it to
> build on MSVC. I will look into this further.
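For those unfamiliar with what that entails: on a conforming preprocessor, recursion can be emulated by deferring a macro's reference to itself and then driving re-scans with an explicit EVAL macro. The following is a minimal sketch of that technique--the macro names are illustrative, and Chaos's actual machinery is considerably more elaborate--but it shows the shape of a "recursion backend":

    // Detection idiom: PROBE() plants a marker that CHECK() can see.
    #define PRIMITIVE_CAT(a, ...) a ## __VA_ARGS__
    #define CHECK_N(x, n, ...) n
    #define CHECK(...) CHECK_N(__VA_ARGS__, 0,)
    #define PROBE(x) x, 1,

    #define NOT(x) CHECK(PRIMITIVE_CAT(NOT_, x))
    #define NOT_0 PROBE(~)
    #define COMPL(b) PRIMITIVE_CAT(COMPL_, b)
    #define COMPL_0 1
    #define COMPL_1 0
    #define BOOL(x) COMPL(NOT(x))

    // Basic branching.
    #define IIF(c) PRIMITIVE_CAT(IIF_, c)
    #define IIF_0(t, ...) __VA_ARGS__
    #define IIF_1(t, ...) t
    #define IF(c) IIF(BOOL(c))
    #define EAT(...)
    #define EXPAND(...) __VA_ARGS__
    #define WHEN(c) IF(c)(EXPAND, EAT)

    // Deferral: EMPTY() delays expansion by one scan, OBSTRUCT by two.
    #define EMPTY()
    #define DEFER(id) id EMPTY()
    #define OBSTRUCT(...) __VA_ARGS__ DEFER(EMPTY)()

    // EVAL supplies the scans that unfold the deferred "recursion".
    #define EVAL(...)  EVAL1(EVAL1(EVAL1(__VA_ARGS__)))
    #define EVAL1(...) EVAL2(EVAL2(EVAL2(__VA_ARGS__)))
    #define EVAL2(...) EVAL3(EVAL3(EVAL3(__VA_ARGS__)))
    #define EVAL3(...) __VA_ARGS__

    // Saturating decrement, table-driven.
    #define DEC(x) PRIMITIVE_CAT(DEC_, x)
    #define DEC_0 0
    #define DEC_1 0
    #define DEC_2 1
    #define DEC_3 2
    #define DEC_4 3

    // Each EVAL pass peels off one deferred self-reference.
    #define REPEAT(count, macro, ...) \
        WHEN(count) \
        ( \
            OBSTRUCT(REPEAT_INDIRECT)()(DEC(count), macro, __VA_ARGS__) \
            OBSTRUCT(macro)(DEC(count), __VA_ARGS__) \
        )
    #define REPEAT_INDIRECT() REPEAT

    #define M(i, _) i
    EVAL(REPEAT(4, M, ~)) // 0 1 2 3

Every piece of this--the token pasting, the variadic forwarding, the rescanning--has to behave exactly as the standard specifies. That is why "implement the recursion backend on MSVC" is not a matter of accepting slower macros; it means restructuring the technique around a non-conforming expansion algorithm.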
Both Boost.Preprocessor and Chaos are open source, distributed under the highly non-restrictive Boost license, so you are free to fork off of either as you please. You can do that now, regardless of whether Chaos is part of Boost or not. However, the root problem here is VC++ itself. VC++ needs to stop creating the problem rather than have every client of VC++ work around the problem. In the time it would take to implement Chaos-level functionality (or at least approximate it) on VC++, I could literally write a preprocessor. Politics aside, it is far more work to do the workarounds in the clients--in this case, even with just one client (VC++ -> pp-lib)--than it would be to fix VC++... even if that fix involves rewriting the macro expansion algorithm entirely. If VC++ won't do it, one is better off fixing the root by using a better toolchain or by integrating a better preprocessor into the toolchain (such as Wave).

Only a couple of days into discussing adding a no-workaround library like Chaos to Boost, and we're already branching off to apply workarounds... These are my main issues with moving Chaos to Boost: no policy to govern (i.e. disallow) the application of workarounds, and no means of stopping the fragmentation implied by branching the code. The current situation is actually better than that. You have a portable subset of functionality (Boost.Preprocessor) and a cutting-edge, but non-portable, superset of functionality (chaos-pp). If (unaltered-toolchain) portability is required, then use the portable subset. If not (i.e. you are willing and able to change toolchains or to integrate another tool into the toolchain), then use the superior superset.

Despite my ranting, VC++ as a whole is worlds away from where it was circa VC6. However, that progress appears (to me, at least) to be driven entirely by the bottom line--which prioritizes only corporate image and user-base size (and the user base for pp metaprogramming will remain insignificantly small as long as Boost continues to provide workarounds for everything). The bottom line is important, but it is not the only thing that is important. I'd rather have a complete and correct language/library implementation than "auto-vectorization" or AMP (I'm not saying either of those is good or bad, though I'm very much *not* a fan of language extensions and language subsetting). The priorities should be:

1A. Correctly interpret correct input or fail. (REQ)
1B. Generate correct output. (REQ)
2.  Correctly interpret all correct input. (REQ)
3A. Improve performance of output. (QOI)
4A. Improve diagnostic quality. (QOI)
4B. Improve performance of toolchain. (QOI)
5.  Create various tools such as IDEs and libraries. (AUX)

The problem with VC++ is that it doesn't appear to prioritize (1A) enough, and (2) appears way down the priority list--if it appears at all. (5) should not really involve the toolchain as far as (1) and (2) are concerned. Instead, what appears to be the case is that (5) is hammered on (usually in a marketing way) in lieu of (2). Tied into that is the long release cycle that incorporates everything under the sun (.NET, VS, C#--and, no, I don't care about C++/CLI or whatever it is currently called). Herb has made indications that this may not continue as it has, but I'll believe that when I see it. Currently, there are other toolchains available on Windows that are more successful at (1A) and (2), though possibly not always at (1B) (I haven't used VC++ significantly for some time, so I am not aware of the number of output bugs).
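To make (1A) and (2) concrete, consider the best-known example: the VC++ preprocessor expands __VA_ARGS__ passed on to another macro as if it were a single argument. The names below are illustrative, but the workaround is the pattern portable code actually has to carry:

    #define ADD(x, y) ((x) + (y))
    #define APPLY(macro, ...) macro(__VA_ARGS__)

    APPLY(ADD, 1, 2)
    // conforming preprocessor: ((1) + (2))
    // VC++: ADD receives "1, 2" as x and nothing as y --
    //       correct input, incorrectly interpreted

    // The usual client-side workaround: one more level of indirection
    // forces an extra scan that re-splits the expanded arguments.
    #define VC_EXPAND(x) x
    #define APPLY_FIXED(macro, ...) VC_EXPAND(macro(__VA_ARGS__))

    APPLY_FIXED(ADD, 1, 2) // ((1) + (2)) everywhere

One such shim is trivial; a library's worth of them, threaded through every construct, is the maintenance burden at issue.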
Regardless of degree of success, there are other toolchains available on Windows whose *order* of priorities more closely reflects the above. For me to even consider providing VC++ workarounds in any code that I write, not only does VC++ have to achieve a reasonable level of success at (1) and (2), I also have to believe that MS's order of priorities going forward is sound--which I don't. MS has dug itself into that hole, and to me the presence of stuff like AMP--which may well be a good research direction--speaks volumes about the prioritization within MS. Now, it may be that my perception is wrong or that things have changed, but my perception and principles are ultimately what I use to make decisions.

If more software development operated according to similar principles, the initial cost of software would have been higher, but the cost of software (relative, of course, to what the software does) would be lower in the long run (which, by now, would be now). VC++'s preprocessor would already have been fixed, autotools would not exist, etc. Related to the above: if backward compatibility must be broken, break it as soon as possible.

Regards,
Paul Mensonides