On 22 May 2014 at 9:23, Tom Kent wrote:
Long term over short term. Short term is to work around every compiler bug under the sun--which is what Boost and many other libraries have done. The result is a mess. It provides little incentive for compilers to be fixed, and it causes every "portable" library to be written either in LCD (lowest-common-denominator) C++ or be hacked up with alternate implementations for the ten different languages (i.e. compiler dialects) it is targeting.
As I see it, this is the crux of the conflict here.
Eh, not really actually. The dispute is whether a monolithic collection of libraries encompassing everything is best, or instead clearly separated groups of libraries based on a more modern methodology, rather than constant LCD targeting of decade-old practices for a nimble Boost which hasn't existed for a decade. Compiler workarounds are just as valid under C++ 11 as under any preceding version; there will just hopefully be fewer of them, as fewer compilers are in use nowadays and clang intends to substitute completely for all the major compilers.
Users: Want a Boost that *is* supported on (nearly) the LCD C++.
That would depend on the user; in fact I would say that in terms of numbers, users and developers are close to even in their preference for C++ 11. For example, at my current day job the code base is pure C++ 11, and the minimum compilers are GCC 4.8 and VS2013. Boost was used mostly as a shim for bad C++ 11 STL implementations in the past; it is now redundant and is being phased out, so we'll eventually use ASIO and no Boost. I'm hoping later this year to write a clang rewriter for that codebase which does a Boost to C++ 11 STL conversion, like Python's 2to3 tool. With that tool a Boost fork becomes possible.
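A production-quality version of such a rewriter would be built on clang's LibTooling so it can rewrite at the AST level. Purely as an illustration of the kind of substitution table such a tool would drive, here is a toy textual sketch in Python; the mapping below is a small invented sample, not the actual tool:

```python
import re

# A small sample of Boost -> C++11 STL renames such a tool might apply.
# A real clang-based rewriter would match AST nodes, not raw text, so it
# could correctly skip strings, comments and partially-qualified names.
BOOST_TO_STD = {
    "boost::shared_ptr": "std::shared_ptr",
    "boost::make_shared": "std::make_shared",
    "boost::function": "std::function",
    "boost::bind": "std::bind",
    "boost::thread": "std::thread",
    "boost::unordered_map": "std::unordered_map",
}

def modernize(source: str) -> str:
    """Apply the rename table with word-boundary matching on both sides."""
    for boost_name, std_name in BOOST_TO_STD.items():
        source = re.sub(r"\b" + re.escape(boost_name) + r"\b", std_name, source)
    return source

print(modernize("boost::shared_ptr<int> p = boost::make_shared<int>(42);"))
# -> std::shared_ptr<int> p = std::make_shared<int>(42);
```

The word boundaries keep the textual hack from mangling identifiers that merely contain a Boost name, which is exactly the class of problem an AST-based rewriter avoids by construction.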
traits), but mostly want bug fixes for issues they find.
No, a majority wants bug fixes without paying for their cost. A minority would like speedy processing of bug fixes they've submitted.
Each library would publish a list of what other libraries it depended on and what compilers (and platforms?) it supported. This would go in a file at the root of its git tree.
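For concreteness, such a per-library metadata file might look like the following. The file name, format and field names here are entirely hypothetical, invented for illustration:

```yaml
# boost-meta.yaml (hypothetical) -- at the root of the library's git tree
name: filesystem
depends:
  - system
  - config
compilers:
  - gcc >= 4.8
  - msvc >= 12.0
  - clang >= 3.4
platforms:
  - linux
  - windows
  - osx
```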
And here we flog that old and very dead horse of C++ package management yet again. Dave even went off and wrote code to implement a C++ package manager; it has since been abandoned. Your comments are well intentioned, but C++ package management is very hard, much harder than it looks. A reasonable compromise is probably C++ Modularisation, and even that, applied to Boost, would be a ton of monotonous work people won't be willing to do for free.
We would then provide a tool to the users (like 'b2 headers'/BCP on steroids) that would take a list of libraries that the user wants and the compilers that they want to use. It would then download these libraries and their dependencies (or fail the dependency check), and build them (or download binaries). This would also work very well with the various Linux package managers, as they could follow the same dependency pattern with their own tools.
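The resolution step of such a tool can be sketched in a few lines. This is a minimal illustration only -- the manifest structure and library names are invented, real Boost dependency graphs are far messier, and nothing here handles versioning:

```python
def resolve(manifests, wanted, compiler):
    """Return a dependency-first build order for the requested libraries,
    failing if any library (or transitive dependency) lacks a manifest or
    does not list the requested compiler as supported."""
    ordered, seen = [], set()

    def visit(lib):
        if lib in seen:
            return
        seen.add(lib)
        if lib not in manifests:
            raise RuntimeError(f"no manifest for {lib!r}: dependency check failed")
        meta = manifests[lib]
        if compiler not in meta["compilers"]:
            raise RuntimeError(f"{lib!r} does not support {compiler!r}")
        for dep in meta["depends"]:
            visit(dep)          # dependencies come first in the build order
        ordered.append(lib)

    for lib in wanted:
        visit(lib)
    return ordered

# Invented example manifests, mirroring the per-library metadata files:
manifests = {
    "system":     {"depends": [],         "compilers": ["gcc-4.8", "msvc-12"]},
    "filesystem": {"depends": ["system"], "compilers": ["gcc-4.8"]},
    "asio":       {"depends": ["system"], "compilers": ["gcc-4.8", "msvc-12"]},
}

print(resolve(manifests, ["asio", "filesystem"], "gcc-4.8"))
# -> ['system', 'asio', 'filesystem']
```

A Linux distribution's packaging tool could consume the same manifests to generate its own dependency metadata, which is the interoperability claimed above.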
This would allow developers to introduce new libraries without having to worry about supporting older compilers. Users of old compilers would know, up front, that this library isn't available for them. If the library caught on and there was user demand, support for older versions could be added later, if possible. The bar would be substantially lower for getting a new library into Boost, but I wouldn't go as far as the original proposal went with only requiring automated tests to pass. Libraries would still need to go through a review process so that users can be assured that libraries in Boost are of high quality.
We would still require coordinated releases of all the libraries for a specific version number; however, the release would simply provide a file that lists the git tag for the version of each library that makes up the release. The toughest part of this would be making sure that two libraries which both depend on a third library are dependent on the same version of that library. Breaking changes in libraries that have dependents would have to be communicated to the dependent libraries well in advance of a release.
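That version-lock problem -- two libraries pinned against different versions of a shared dependency -- is at least mechanically checkable at release time. A tiny sketch, with invented library names, tags and pin format:

```python
def find_version_conflicts(release, pins):
    """release: library -> git tag chosen for the coordinated release.
    pins: library -> {dependency: git tag it was developed/tested against}.
    Returns (library, dependency, pinned_tag, release_tag) per mismatch."""
    conflicts = []
    for lib, deps in pins.items():
        for dep, pinned_tag in deps.items():
            release_tag = release.get(dep)
            if release_tag != pinned_tag:
                conflicts.append((lib, dep, pinned_tag, release_tag))
    return conflicts

# Invented example: asio was last tested against an older system tag.
release = {"system": "boost-1.56.0", "filesystem": "boost-1.56.0",
           "asio": "boost-1.56.0"}
pins = {
    "filesystem": {"system": "boost-1.56.0"},
    "asio":       {"system": "boost-1.55.0"},
}

print(find_version_conflicts(release, pins))
# -> [('asio', 'system', 'boost-1.55.0', 'boost-1.56.0')]
```

Running a check like this before tagging a release would surface exactly the coordination failures the paragraph above worries about.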
Overall, the biggest problem I see with this proposal is testing. By being modular, individual test times would be greatly reduced. This would hopefully enable CI-like testing at each check-in, which isn't currently possible when it takes 8+ hours to run the tests on some platforms. However, as each library would be specifying exactly which compilers it supports, we would need a way to make sure that those are all getting hit. We currently don't have enough tester diversity to completely accomplish this.
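The tester-coverage gap is also easy to state mechanically: take the union of the compilers the volunteer test runners exercise and diff it against each library's declared compiler list. A sketch with invented runner and library names:

```python
def untested_compilers(manifests, testers):
    """manifests: library -> list of compilers it declares support for.
    testers: test runner name -> set of compilers that runner exercises.
    Returns, per library, the declared compilers no runner covers."""
    covered = set()
    for compilers in testers.values():
        covered |= set(compilers)
    gaps = {}
    for lib, declared in manifests.items():
        missing = sorted(set(declared) - covered)
        if missing:
            gaps[lib] = missing
    return gaps

manifests = {
    "asio":       ["gcc-4.8", "msvc-12", "clang-3.4"],
    "filesystem": ["gcc-4.8"],
}
testers = {
    "runner-linux":   {"gcc-4.8"},
    "runner-windows": {"msvc-12"},
}

print(untested_compilers(manifests, testers))
# -> {'asio': ['clang-3.4']}
```

A report like this, regenerated after each test cycle, would tell a library maintainer which of their declared compilers are claims nobody is actually verifying.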
------
If this could all be made to work, it would let advanced users have the advanced features that they want, while keeping the features that the normal users use a lot fully supported on the platforms they need. But it is just one idea...I'm sure there are lots of holes in it.
I think you'd find that the work to implement all of the above exceeds the work of a C++ 11-only fork, because you're constantly wrestling with the monolithic legacy, e.g. dependency breakage, unit tests which don't CI well, etc. I also proposed in my OP that we skip package management, as it's too costly to implement correctly, and go for per-library source distros instead, which pushes the version lock problem onto users to solve. We can all wish for dream solutions, but in the end what we must adopt is what is reasonable given that people work on this for free during family time.

Niall

--
ned Productions Limited Consulting
http://www.nedproductions.biz/
http://ie.linkedin.com/in/nialldouglas/