
"Johannes Brunen" <JBrunen@DataSolid.de> wrote in message news:hsdp2o$n5m$1@dough.gmane.org...
Hello Robert,
"Robert Ramey" <ramey@rrsd.com> schrieb im Newsbeitrag news:hsbnvs$ui7$1@dough.gmane.org...
I just read the slides and I very much like your proposals, especially the two-phase review proposal. It would IMHO really be a great advance.
Did I get it right that the testing should only cover the certified libraries?
My proposal is in large part driven by these ideas:

a) C++ needs more libraries which meet a high standard of quality.
b) Boost is currently the largest source of such libraries.
c) The above suggest that Boost needs to prepare for growth.
d) The unique success of Boost rests on its quality. This in turn rests on:
   i) the review process
   ii) testing
e) Current Boost practices don't scale - Boost is harder to work with as it gets bigger.

My proposal attempts to permit the review and testing process to grow along with an anticipated growth in libraries. (BTW - Boost isn't really broken - it's a victim of its own success. I just wanted to increase attendance at my talk.)

The "two phase" acceptance is designed to expedite and improve the review process.

I want to see more testing - but it's already a huge job. It can't continue indefinitely in this manner, nor should it. So I want to see testing distributed to library users. Thus, the testing will be spread across the universe of users interested in each library. This will result in the following benefits:

a) Testing will scale. The number of "testers" will be proportional to the number of library downloads. Only test results will be gathered in a central place.
b) Testing will occur on all the combinations of platforms actually used.
c) It's very difficult to help users unless we know that everything is installed and working. Many users ask for help when it turns out that they don't have things installed correctly.
d) When a user comes to the list to ask for help - for example, he wants to serialize his data structure through a virtual base class using the new Clang C++ compiler configured as a cross compiler modified to generate code for the AMD Blackfin processor inside a digital camera that has its own micro Linux OS - how can I help him now? I can't (for free, anyway). But in the future, I can say: oh, what are the results of testx, testy, testz on your system? Test x fails? Oh, check out code scrapping on your compiler. All tests pass? Check your own code against these tests and look at the documentation page... Oh, the documentation is hard to read - feel free to submit an improvement. Support is easy for me.
e) It will give users a stronger sense of participation in, and commitment to, Boost.
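The user-side testing described above could work roughly like this: each user runs the library's tests locally and a small tool collapses the pass/fail results, plus platform details, into a record for upload to a central collection point. The following is a minimal sketch only; the report format, field names, and the `summarize_results` helper are assumptions for illustration, not part of any existing Boost infrastructure.

```python
# Hypothetical sketch of user-side test reporting: summarize local
# per-test pass/fail results into a record that could be uploaded to
# a central results page. The report schema is an assumption.
import json
import platform

def summarize_results(results):
    """Collapse {test_name: passed?} results into one report record."""
    return {
        "platform": platform.platform(),  # where the tests actually ran
        "passed": sum(1 for ok in results.values() if ok),
        "failed": sorted(name for name, ok in results.items() if not ok),
    }

# Example from the scenario above: testx fails, testy and testz pass.
report = summarize_results({"testx": False, "testy": True, "testz": True})
print(json.dumps({"passed": report["passed"], "failed": report["failed"]}))
# prints {"passed": 2, "failed": ["testx"]}
```

The point of gathering only such summaries centrally is that the heavy work (building and running the tests) stays distributed across users' own platforms.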
What happens if one central library loses its certified status and other libraries are depending on this one?
This is an unresolved issue. We already have some problems in this area: a couple of libraries depend on the codecvt UTF-8 stream facet, and this code has never been subjected to review. In my view this has never been satisfactorily addressed.
Should it also be possible to deploy all the accepted but not certified libraries at once?
I don't think it should be necessary. I'm hoping that the result of a proposal such as this would be that there are too many accepted libraries to make this practical. But the decision to include such a facility will be in the hands of whoever packages the libraries. In the future, anyone who wants to make a package of Boost libraries will be able to do so, and he'll be able to decide what he wants to include and/or exclude. Here are some examples of packages that people might want to create:

a) All Boost "certified" libraries.
b) All "certified and accepted" libraries.
c) One library - and all its prerequisites.
d) All Boost tools.
e) All libraries which have median ratings of "good" or better in all dimensions.
f) Create your own package here...

Of course, for the foreseeable future, the main package built by Boost will be a) above.
I would like to see the additional requirement that, to gain accepted status, a library should not only compile on two compiler platforms but on the current major platforms (e.g. GCC and MSVC [open for discussion...]).
This is a level of detail that hasn't been arrived at yet.
Might it be necessary to install a lifetime policy for the libraries classified as accepted?
I see the problem that libraries are written against very old platforms and are practically unusable on current platforms.
There has been discussion regarding libraries which aren't well maintained. Various policies have been proposed and would be possible. I've stayed away from getting very involved in these discussions. My view is that a lot of the demand for "a better policy" would diminish with better information and feedback.

In my world, when a library starts to suffer the symptoms of neglect, it will show up as a lot of negative comments and ratings on the acceptance page. This will diminish demand (and downloads) for the library, and it will eventually be a no-brainer for a packager to drop the library from his package. And suppose it hasn't been downloaded in a year? It's not being updated, and everyone who depends upon it already has it. No one would complain when it's dropped. So we wouldn't need to come to agreement on a policy for this and lots of other things.

To summarize, I believe a comprehensive system for getting information back from users will go a long way toward addressing a lot of issues.

Robert Ramey