
As a general comment: I think having a pool of bug-fixers is a wonderful idea, and is, frankly, one of the requirements of Boost staying afloat.

On 10/30/2010 3:05 PM, David Abrahams wrote:
At Sun, 31 Oct 2010 00:22:25 +0530, Arindam Mukherjee wrote:
I am one of those hopefuls who responded on the thread that proposed the idea for volunteers. I have always wanted to understand and contribute to the Boost libraries because I felt it would give me an insight into the design and implementation of Boost (and perhaps the C++ standard libraries themselves) to an extent that I lack today. And I am certain greater participation can only mean good things, provided we have answers to the following questions (or at least know where to start in trying to answer them):
a. What are the concrete criteria for admitting a volunteer - where do you set the bar? These must be verifiable, objective criteria.
I don't think we can really come up with objective criteria. Each library maintainer has his own set of values and his own style, and—at least if the maintainers are going to be involved in the decision—contributions mustn't clash too badly with that style and set of values. Therefore, criteria for accepting contributions, if not contributors, will be, to some extent, subjective.
I agree, mostly ;-) As with regular reviews for library submissions, and also for GSoC students, I think we can use similar criteria for what makes a good volunteer. I.e. we can have a vetting process for volunteers and their contributions. IIRC, for libraries we only really require that people test their code on two toolsets locally, and of course we review the library as a whole before initial inclusion. The GSoC guidelines are a bit more fluid: I expect some demonstrable knowledge of the problem domain and of C++. I have seen other participating organizations require some form of contribution to the project before considering a student, and I would think something like that is possible in this case.

So here's an initial set of criteria / process for this. For volunteers to get SVN write access:

1. They must submit some minimum number of patches to Trac.
2. Some minimum number of patches to existing tickets must be accepted, reviewed, applied, and tested. I.e. a new volunteer would turn bug tickets into patch tickets to get this started.
3. A single patch must be reviewed by some minimum number of existing contributors, and either blessed or not for application.
4. Patches must be locally tested on some minimal number of toolsets: either multiple toolsets on one operating system, or preferably multiple toolsets on multiple OSs. It would be up to the reviewers to decide whether the tested toolsets are sufficient in the context of the particular patch.

Any regular maintainer, including existing volunteers, can help with the above. The hope is that we can start small and have this grow itself without increasing the burden on current contributors too much. Some possible numbers for the above: (1) five submitted patches, (2) three applied patches, (3) two reviewers for a patch, with a strong preference for the library maintainer to be one of the reviewers but not a requirement, and (4) two toolsets. (See the sketch below for how the numbers fit together.)

After volunteers have write access we would still want to monitor their patches, so we would keep reviewing them for a while. Perhaps after some number of closed tickets we would drop the mandatory review, with the expectation that by then they would be responsible enough to seek out reviews for non-trivial or controversial patches.
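Purely as an illustration of how those thresholds would fit together, here's a small sketch in Python (the names, the record layout, and the check itself are made up for this post; it isn't an existing Boost or Trac tool):

    # Hypothetical encoding of the proposed thresholds above.
    PROPOSED_CRITERIA = {
        "min_submitted_patches": 5,    # (1) patches attached to Trac tickets
        "min_applied_patches": 3,      # (2) patches reviewed, applied, and tested
        "min_reviewers_per_patch": 2,  # (3) existing contributors reviewing each patch
        "min_toolsets_tested": 2,      # (4) toolsets a patch was tested on locally
    }

    def eligible_for_write_access(volunteer):
        """Return True when a volunteer record meets every proposed threshold."""
        return (
            volunteer["submitted_patches"] >= PROPOSED_CRITERIA["min_submitted_patches"]
            and volunteer["applied_patches"] >= PROPOSED_CRITERIA["min_applied_patches"]
            and volunteer["reviewers_per_patch"] >= PROPOSED_CRITERIA["min_reviewers_per_patch"]
            and volunteer["toolsets_tested"] >= PROPOSED_CRITERIA["min_toolsets_tested"]
        )

    # For example, someone with 6 patches submitted, 3 applied, 2 reviewers per
    # patch, and local testing on gcc and msvc would qualify.
    print(eligible_for_write_access({
        "submitted_patches": 6,
        "applied_patches": 3,
        "reviewers_per_patch": 2,
        "toolsets_tested": 2,
    }))  # prints: True

The exact numbers are of course up for discussion; the point is just that the criteria are simple counts that anyone can check against Trac.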
b. Do we have a process in place which makes the induction of volunteers easy - how easily can a new recruit get down to the business of fixing bugs? Part of it depends on the bar you set in (a) and part of it depends on the process you set. For example, the volunteers at least need to know the bug-fixing process that is in place today, including tools, reviews, etc. How quickly can this knowledge be imparted?
Well, as you might have guessed, there really isn't a process for bug fixing other than what is required for release management. Essentially it's mostly up to the individual maintainers to deal with it as they like. Hence my suggestion above for the process :-) As far as tools go, though, we do have a rather fixed set of requirements for the testing part of this, and it's fairly easy to explain.
c. As somebody already mentioned, to what extent can you provide mentoring, and who does it?
I think we can at minimum have the same set of contributors that we have for GSoC help out with the mentoring here, as they (a) tend to have the desire to help out, and (b) tend to be the most broadly knowledgeable about the Boost libraries. That roughly means 15 or so people available to mentor at the start.
d. Finally, would someone assign tickets to volunteers? I feel this would be a better idea than letting people pick and choose when volunteers are starting out. The process could be eased off as a volunteer spends more time with the code base and therefore gets more familiar with it.
Assigning tickets might be a hard task for contributors initially, as it might take a considerable amount of time to actively find suitable tickets, and it would also make things harder on volunteers, since the particular domain might be outside their realm. Perhaps it would be better to have contributors mark tickets as candidates for volunteers to take on. What immediately comes to mind as good candidates for this ticket pool are tickets for platforms that maintainers don't usually have access to.
I am sure the questions are easy to ask, and that there are logistical hurdles to take into account in trying to answer any of them.
Can you suggest some answers, even as straw men? We need a place to start.
Hopefully the above is a good place to start. Note that I wrote the above with the background of having spent a few release cycles, years ago, doing nothing but fixing test failures.

--
-- Grafik - Don't Assume Anything
-- Redshift Software, Inc. - http://redshift-software.com
-- rrivera/acm.org (msn) - grafik/redshift-software.com
-- 102708583/icq - grafikrobot/aim,yahoo,skype,efnet,gmail