tweaking the review process (was: signals2 review results)

On Thu, Nov 20, 2008 at 11:40 PM, vicente.botet wrote:
I would like to make some suggestions to improve the review management:
Thank you for starting this discussion.
* I had noticed only at the end of the review that the review took place on two mailing lists (devel and user). Maybe this is usual for Boost reviews, but I was not aware. It would be more transparent if all the reviews were posted to the same mailing list; maybe a specific one should be created.
Yes, this is a bit of an issue. I mentioned that both lists are used in my post that opened the review (in my notes to first-time reviewers) to try to give people a heads-up (but... I often get long-winded and I'm sure it's easy to miss parts of my posts :-)). Having a dedicated, or at least recommended mailing list (either dev or user) might be a good thing.

The only problem I see regarding a completely separate mailing list is that it requires active effort to sign up for and follow. I think that for many people (even those that are genuinely interested in the library), the fact that the review is happening at time X-Y very easily slips off the radar. Having the e-mails constantly appear in the common lists also serves to remind everyone of the ongoing review, and I suspect it leads to many valuable impromptu comments and even full reviews from people that perhaps weren't originally planning on participating in the review, but end up following it because it happens on the list (I recall being in this situation myself several times).
* Of the 5 committed reviewers making this review possible, only 3 submitted a review, and 2 of those were late. I'm wondering if this new rule should be preserved, as the review can be accepted without the committed reviewers' reviews.
Trying to use committed reviewers was a new thing for this review (it was proposed by others in http://tinyurl.com/48tdjs). Now that it has been tried out, it would be good to reflect on how things went.

For the reviews that were late, the reviewers were very diligently contacting me about their changing time constraints and offered times that would work for them, and I found those times acceptable. Of the two committed reviewers that did not submit, one also diligently worked with me on setting up an alternative timeline, but in the end contacted me saying that a full review would not be possible after all, and offered a brief report of looking into the library and a brief rationale for acceptance (as the points matched those of other reviewers, I didn't forward it to the list). I was not able to contact the final reviewer after a certain point.

I think the major benefit of the committed reviewers is the high probability that they actually will submit a review. Things always happen, and boost contributions are all volunteer time, but I found that overall the committed reviewers *very responsibly* succeeded in keeping their commitment as well as they could. Were they not committed reviewers, I suspect many of them would have considered giving up. In addition, the extensions that the committed reviewers negotiated gave other reviewers a chance to contribute.

So, I feel that committed reviews are a good way of reasonably making sure that a certain number of reviews will be submitted. I'm not sure that the library acceptance / rejection should be influenced by the number of committed reviews (reviews from others are a perfectly good substitute).

While we're on this topic... The one thing that the number of committed reviewers should definitely influence is whether the review happens in the first place. We were fortunate to get 5 people to sign up, which was the only threshold proposed, so we followed through with the review.
I was wondering what to do if the threshold wasn't crossed, and I think what I would have proposed is some sort of a holding pattern, where the review schedule page is updated to say something like "signals2 - reviewers needed, contact review manager to sign up as a reviewer". Once the threshold is crossed, the reviewers, manager and author can schedule a time.

Another thing... many of the committed reviewers reported that they were first-time reviewers (and the reviews they submitted were impressively detailed and very valuable). Personally, I was thrilled by this, since getting new participation in boost is critical to sustaining its quality. Perhaps some combination of allowing reviewers to commit beforehand, and explicitly encouraging reviews focused on the user perspective (another suggestion from the http://tinyurl.com/48tdjs thread), helped in getting first-time reviewers (perhaps the reviewers can comment on this?).
* There were some reviews that came to this list through the intermediation of the review manager, with a delay between the posting by the reviewer and the forward from the RM. One negative review was posted on the 4th and received on this list on the 11th; another, positive, was posted on the 2nd and received on this list on the 3rd. I think that the review manager should not encourage reviewers to send their reviews to himself. This would avoid this kind of delay. So I propose that only the reviews sent to this single mailing list be taken into account.
Yes, this particular delay is entirely my fault. I can't explain why I didn't notice this email until so late (which is the point where I contacted the poster notifying him I would like to forward the mail, and forwarded it the following morning).

Allowing reviews to be sent to the review manager is straight from http://www.boost.org/community/reviews.html (Introduction paragraph). There are two valid reasons for this, IMO:

* the reviewer is only subscribed to one of the {boost, boost-user, boost-announce} lists, and would like the review to be forwarded to both the boost and boost-user lists (if there was a dedicated review list, like you suggest, that was also open to all posters, then this reason would go away)

* the reviewer would like to remain anonymous to the list (granted, this could also be accomplished by sending from an anonymous e-mail).

If this stays as it is, the RM should be more diligent about monitoring her personal mail, which I apparently wasn't.
* Even though the review was over on the 10th, there were 2 accepting reviews coming from the committed reviewers just some hours before the review result announcement on the 19th. I think that the review manager must state clearly when the review is over and not accept any review after that. This does not mean that the RM cannot change this date, but he should announce it clearly.
I tried to keep my timeline as transparent as possible. In my review-closing email, I asked people to let me know if they were considering writing a review. Those that contacted me were informed of the timeline of pending reviews. When the last promised review was submitted, I set a hard deadline. Given the uncertainty in people's schedules, I decided to approach this in a flexible way. That was just my personal preference - I'm sure setting a hard deadline earlier would have been a fine choice as well.
I hope this will help to improve future reviews,
Thanks again for starting this discussion and for your suggestions - I also hope good improvements will come out of it.

Stjepan

There are a number of things in here I want to reply to. However, I want to preface my reply by saying these are my personal opinions, they are not statements of policy from a Review Wizard. John

Stjepan Rajko wrote:
On Thu, Nov 20, 2008 at 11:40 PM, vicente.botet
wrote: I would like to make some suggestions to improve the review management:
Thank you for starting this discussion.
Thanks to both of you for starting it and bringing it out where more people will notice.
* I had noticed only at the end of the review that the review took place on two mailing lists (devel and user). Maybe this is usual for Boost reviews, but I was not aware. It would be more transparent if all the reviews were posted to the same mailing list; maybe a specific one should be created.
[snip]
Yes, it does. The problems Stjepan points out with a single location are largely the motivation for the multi-list discussion. Realistically, there are people who only have time or interest for one of the two lists, and will not sign up for both. Even worse would be a list that only exists for reviews; that would quickly become a ghost town.

There are multiple populations of interest in Boost. There are those who are interested in hashing out all of the development details, and those who are more interested in using the libraries without needing to discuss the details. Both have important perspectives to offer to a review, and both should be encouraged to participate. One way to do this is to keep the barrier to such participation as low as is reasonably possible.

It means that the library author and the review manager are committing to watching both lists (as is anyone who wants to see all parts of the conversation), but that puts the extra effort on the people most dedicated to doing it, and is the better choice.
* Of the 5 committed reviewers making this review possible, only 3 submitted a review, and 2 of those were late. I'm wondering if this new rule should be preserved, as the review can be accepted without the committed reviewers' reviews.
[snip]
So, I feel that committed reviews are a good way of reasonably making sure that a certain number of reviews will be submitted. I'm not sure that the library acceptance / rejection should be influenced by the number of committed reviews (reviews from others are a perfectly good substitute).
[snip]
I feel pretty strongly that the committed reviewers not getting a chance to submit should not disqualify the library. Nor should the reviews coming in late.
Another thing... many of the committed reviewers reported that they were first-time reviewers (and the reviews they submitted were impressively detailed and very valuable). Personally, I was thrilled by this, since getting new participation in boost is critical to sustaining its quality. Perhaps some combination of allowing reviewers to commit beforehand, and explicitly encouraging reviews focused on the user perspective (another suggestion from the http://tinyurl.com/48tdjs thread), helped in getting first-time reviewers (perhaps the reviewers can comment on this?).
I'm happy about it, as well.
* There were some reviews that came to this list through the intermediation of the review manager, with a delay between the posting by the reviewer and the forward from the RM. One negative review was posted on the 4th and received on this list on the 11th; another, positive, was posted on the 2nd and received on this list on the 3rd. I think that the review manager should not encourage reviewers to send their reviews to himself. This would avoid this kind of delay. So I propose that only the reviews sent to this single mailing list be taken into account.
Yes, this particular delay is entirely my fault. I can't explain why I didn't notice this email until so late (which is the point where I contacted the poster notifying him I would like to forward the mail, and forwarded it the following morning).
Allowing reviews to be sent to the review manager is straight from http://www.boost.org/community/reviews.html (Introduction paragraph). There are two valid reasons for this, IMO:
* the reviewer is only subscribed to one of the {boost, boost-user, boost-announce} lists, and would like the review to be forwarded to both the boost and boost-user lists (if there was a dedicated review list, like you suggest, that was also open to all posters, then this reason would go away)
Sometimes reviews even come from people who aren't subscribed to any of the lists. (This is not common, but it has happened.) If they have a thoughtful and productive review to offer, they should be encouraged, not excluded.
* the reviewer would like to remain anonymous to the list (granted, this could also be accomplished by sending from an anonymous e-mail).
If this stays as it is, the RM should be more diligent about monitoring her personal mail, which I apparently wasn't.
* Even though the review was over on the 10th, there were 2 accepting reviews coming from the committed reviewers just some hours before the review result announcement on the 19th. I think that the review manager must state clearly when the review is over and not accept any review after that. This does not mean that the RM cannot change this date, but he should announce it clearly.
I tried to keep my timeline as transparent as possible. In my review-closing email, I asked people to let me know if they were considering writing a review. Those that contacted me were informed of the timeline of pending reviews. When the last promised review was submitted, I set a hard deadline. Given the uncertainty in people's schedules, I decided to approach this in a flexible way. That was just my personal preference - I'm sure setting a hard deadline earlier would have been a fine choice as well.
Let's imagine for a moment what a hard cutoff with everything after it ignored would mean. What if, the day after the review period closed, someone submitted a late review that showed conclusively that the library was not usable? (Maybe it has unacceptable side effects that no one else noticed, or maybe it infringes on a patent that none of the rest of us knew about. Whatever the reason, it can't be allowed to pass.) Even if every review during the review period was positive and suggested acceptance, this should fail. Drawing a line and saying "nothing after this date" would make this scenario possible.

In fact, asking for such a hard cutoff shows one misunderstanding about the current process. As the process currently exists, it is not democratic. It is a benign and hopefully well-informed autocracy. The Review Manager has the authority to overrule the plurality of reviewers. The goal of the reviews is not to collect votes, but instead to provide the best possible information for the manager. We select managers with the intention that they will make good decisions given this information.

(I know of no circumstance where a manager has been overruled by the Wizards or the Moderators, but I do follow the discussion on every review to make sure that I think the decision reached by the manager is reasonable. If I found something that led me to believe the decision was unreasonable, I would start a discussion with the other Wizard and the Moderators about what to do. On the one occasion where a manager did not fulfill the obligation to decide, I had to step in and finish the review.)

Since the goal of the review process is to inform the manager as well as possible, cutting off the discussion arbitrarily is counterproductive. In fact, in the reviews I have managed, I have intentionally looked at posts from both before and after the official review period for extra information.
The only cutoff that can't be avoided is that I try to submit the review results within about a week or so of the end of the period. That obviously cuts off what I can consider.
I hope this will help to improve future reviews,
Thanks again for starting this discussion and for your suggestions - I also hope good improvements will come out of it.
Stjepan
Thanks to both of you. Considering how to do the reviews better is always important.

John

on Fri Nov 21 2008, "Stjepan Rajko" wrote:
On Thu, Nov 20, 2008 at 11:40 PM, vicente.botet
wrote: I would like to make some suggestions to improve the review management:
Thank you for starting this discussion.
* I had noticed only at the end of the review that the review took place on two mailing lists (devel and user). Maybe this is usual for Boost reviews, but I was not aware. It would be more transparent if all the reviews were posted to the same mailing list; maybe a specific one should be created.
Yes, this is a bit of an issue. I mentioned that both lists are used in my post that opened the review (in my notes to first-time reviewers) to try to give people a heads-up (but... I often get long-winded and I'm sure it's easy to miss parts of my posts :-)).
Having a dedicated, or at least recommended mailing list (either dev or user) might be a good thing.
It has always been the -devel list, at least up until recently. How reviews began being posted to -users, I do not know.

--
Dave Abrahams
BoostPro Computing
http://www.boostpro.com
participants (3)
- David Abrahams
- John Phillips
- Stjepan Rajko