
David Abrahams wrote:
* See all tests, as part of the test database structure (i.e. their organization into test suites)
* See meta data associated with tests, such as
  - what kind of test
  - expected outcome, per platform
  - dependencies, prerequisites, etc.
How would that help with robustness?
There have been a number of cases where the regression harness picked up (and reported) stale results simply because old executables and result files were lying around in the build tree. It didn't 'know' that the test had actually been removed from the test suite.

But even in a somewhat broader sense, being able to inspect the test database, with all its metadata, IMO contributes to robustness. Robustness through transparency...

Regards,
Stefan

--
...ich hab' noch einen Koffer in Berlin...
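
To illustrate the stale-results point: below is a minimal sketch (in Python, with a made-up test-database layout and result-file naming; not the actual Boost regression tools) of the kind of cross-check that an inspectable test database would make possible.

import os

# Hypothetical shape of the test database: suite -> list of test records
# carrying the metadata mentioned above (kind, expected outcome, ...).
test_db = {
    'python': [
        {'name': 'exec_test', 'kind': 'run', 'expected': {'linux': 'pass'}},
        {'name': 'import_fail', 'kind': 'compile-fail', 'expected': {}},
    ],
}

# Every test the database currently knows about, across all suites.
known_tests = {t['name'] for tests in test_db.values() for t in tests}

def find_stale_results(results_dir):
    """Flag result files whose test no longer exists in the database.

    Assumes (hypothetically) one '<test name>.result' file per test in
    the build tree.
    """
    for entry in os.listdir(results_dir):
        name, ext = os.path.splitext(entry)
        if ext == '.result' and name not in known_tests:
            print('stale result (test removed from the suite?):', entry)

With something like this in place, the harness could refuse to report any result that isn't backed by an entry in the database, rather than trusting whatever happens to be lying around in the build tree.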