
If we're going to get that, mightn't we also get links for the things that unexpectedly pass (dark green)? With so many people coding by experiment, it's sometimes far too easy to accidentally put code in a program that compiles on "my compiler" but not on others. Seeing the tests that _should_ fail would help, IMO.

At Saturday 2004-07-31 06:09, you wrote:
On Sat 31/07/2004 at 14:05, Aleksey Gurtovoy wrote:
Jeff Garland writes:
Looking at the boost-wide reports today, testmicrosec_time_clock is marked as failing (red) on a whole series of compilers. However, those cells should be 'green' (and have been in past reports), as the test has been explicitly marked as failing:
http://www.meta-comm.com/engineering/boost-regression/developer/date_time.ht...
Seems like there is something amiss in the results reporting...
Fixed now, thanks for pointing this out!
If I understand correctly, failures explicitly marked will be reported in green cells. I'm not sure I like this solution. I would rather the color be gray. The gray color means "The library author marked it as unusable on a particular platform/toolset". It is currently used only for mark-unusable tests, but I think this description equally applies to a mark-failure test, doesn't it? Or another color could be chosen. But having a test suddenly turn green when a library author becomes aware of the issue is a bit excessive, IMHO.
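For readers not familiar with how these markings are made: Boost's regression system reads them from an XML markup file maintained alongside the test results. The sketch below is illustrative only — the element names follow the "mark-unusable" and expected-failure terms used in this thread, but the exact attributes and nesting are assumptions and should be checked against the real markup file and its DTD:

```xml
<!-- Hypothetical sketch of failure-markup entries; element names follow
     the terms used in the thread, attributes are illustrative only. -->
<library name="date_time">
  <!-- Renders gray: the author declares the toolset unusable outright. -->
  <mark-unusable>
    <toolset name="some-toolset"/>
    <note>Compiler lacks features this library requires.</note>
  </mark-unusable>
  <!-- The case under discussion: a specific test marked as a known
       failure on a specific toolset, currently rendered green. -->
  <mark-expected-failures>
    <test name="testmicrosec_time_clock"/>
    <toolset name="some-toolset"/>
    <note>Clock resolution not available on this platform.</note>
  </mark-expected-failures>
</library>
```

The distinction between the two entry kinds is exactly what is at issue above: both express "the author knows this doesn't work here", yet only one of them is colored gray in the reports.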
Regards,
Guillaume
Victor A. Wagner Jr. http://rudbek.com The five most dangerous words in the English language: "There oughta be a law"