
Hi,
According to the regression summary, the new numeric conversion library is in pretty good shape (thanks to Joaquín Muñoz for his latest fixes!).
I want to test how far the new "numeric_cast<>" can replace the old one. For that, I want to put the exact same test that's currently used for the old numeric_cast into the new library. Right now, cast_test.cpp contains the numeric_cast<> test along with other conversions. I will factor it out, but I have a doubt: it is not using the Test framework at all; it's just printing the test results to stdout.
Do the regression scripts parse the output, or are these tests just wrongly integrated?
Should I update those tests to use the Unit Test Framework?
Fernando Cacciola SciSoft

"Fernando Cacciola" <fernando_cacciola@hotmail.com> wrote in message news:cjcfto$56n$1@sea.gmane.org...
Hi,
According to the regression summary, the new numeric conversion library is in pretty good shape (thanks to Joaquín Muñoz for his latest fixes!).
I want to test how far the new "numeric_cast<>" can replace the old one. For that, I want to put the exact same test that's currently used for the old numeric_cast into the new library. Right now, cast_test.cpp contains the numeric_cast<> test along with other conversions. I will factor it out, but I have a doubt: it is not using the Test framework at all; it's just printing the test results to stdout.
Do the regression scripts parse the output, or are these tests just wrongly integrated?
Should I update those tests to use the Unit Test Framework?
Fernando Cacciola SciSoft
It's basically your choice, but why wouldn't you? I keep repeating during formal reviews that test programs could be enhanced here and there if you used Boost.Test (and all its tools). If you need any help, feel free to ask. Gennadiy.

"Fernando Cacciola" <fernando_cacciola@hotmail.com> wrote in message news:cjcfto$56n$1@sea.gmane.org...
Hi,
According to the regression summary, the new numeric conversion library is in pretty good shape(thanks to Joaquín Muñoz for his latest fixes!)
I want to test how far can the new "numeric_cast<>" replace the old one. For that, I want to put the exact same test that's currently used for
old numeric_cast in the new library. Right now, cast_test.cpp contains numeric_cast<> test with other conversions. I will factor it out, but I have a doubt: it is not using
"Gennadiy Rozental" <gennadiy.rozental@thomson.com> escribió en el mensaje news:cjcj36$e9g$1@sea.gmane.org... the the
Test framework at all, it's just printing the test result to std out.
Do the regression scripts parse the output, or are these tests just wrongly integrated?
Should I update those tests to use the Unit Test Framework?
Fernando Cacciola SciSoft
It's basically your choice, but why wouldn't you? I keep repeating during formal reviews that test programs could be enhanced here and there if you used Boost.Test (and all its tools).
I'd rather change it to use at least the minimal test framework; otherwise I don't see how these results are interpreted. But then I don't know if there is someone expecting the particular format of this result and parsing it. If that's the case, my "fix" will be interpreted as a failure. Fernando Cacciola SciSoft

It's basically your choice, but why wouldn't you? I keep repeating during formal reviews that test programs could be enhanced here and there if you used Boost.Test (and all its tools).
I'd rather change it to use at least the minimal test framework; otherwise I don't see how these results are interpreted. But then I don't know if there is someone expecting the particular format of this result and parsing it. If that's the case, my "fix" will be interpreted as a failure.
Fernando Cacciola SciSoft
Boost.Test is a safe bet in this regard. The test program output will definitely be accepted and processed properly by Boost.Build. Gennadiy.

"Gennadiy Rozental" <gennadiy.rozental@thomson.com> wrote in message news:cjckub$eel$1@sea.gmane.org...
It's basically your choice, but why wouldn't you? I keep repeating during formal reviews that test programs could be enhanced here and there if you used Boost.Test (and all its tools).
I'd rather change it to use at least the minimal test framework; otherwise I don't see how these results are interpreted. But then I don't know if there is someone expecting the particular format of this result and parsing it. If that's the case, my "fix" will be interpreted as a failure.
Fernando Cacciola SciSoft
Boost.Test is a safe bet in this regard. The test program output will definitely be accepted and processed properly by Boost.Build.
Gennadiy.
OK... I'll change it then.
Thanks,
Fernando Cacciola SciSoft
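For reference, a minimal sketch of what such a port might look like, using the minimal test framework from <boost/test/minimal.hpp>. The specific checks below are hypothetical illustrations, not the actual contents of cast_test.cpp; they just show the pattern of letting the framework record failures instead of printing results to stdout:

    // Hypothetical example; not the real cast_test.cpp checks.
    #include <boost/cast.hpp>          // boost::numeric_cast, boost::bad_numeric_cast
    #include <boost/test/minimal.hpp>  // test_main, BOOST_CHECK, BOOST_ERROR

    int test_main(int, char*[])
    {
        // The framework records failures and sets the exit status, so
        // Boost.Build can interpret the run without parsing any output.
        BOOST_CHECK(boost::numeric_cast<short>(123) == 123);

        try
        {
            boost::numeric_cast<unsigned char>(-1);   // out of range
            BOOST_ERROR("expected bad_numeric_cast was not thrown");
        }
        catch (boost::bad_numeric_cast const&)
        {
            // expected
        }

        return 0;
    }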

Fernando Cacciola writes:
According to the regression summary, the new numeric conversion library is in pretty good shape (thanks to Joaquín Muñoz for his latest fixes!).
I want to test how far the new "numeric_cast<>" can replace the old one.
Fernando,
The newly checked-in tests significantly disturbed the regression picture -- http://www.meta-comm.com/engineering/boost-regression/developer/summary_rele....
A number of people are working hard clearing up all these red and yellow cells so that we can branch for release, and any new failures, even easy-to-fix ones, are not helping.
FYI, I had to correct a number of trivial errors in both tests you've added, simply in order to make them compile. Please try to exercise more caution next time you check things in.
Thanks,
--
Aleksey Gurtovoy
MetaCommunications Engineering

"Aleksey Gurtovoy" <agurtovoy@meta-comm.com> escribió en el mensaje news:cje93g$vh1$1@sea.gmane.org...
Fernando Cacciola writes:
According to the regression summary, the new numeric conversion library is in pretty good shape (thanks to Joaquín Muñoz for his latest fixes!).
I want to test how far the new "numeric_cast<>" can replace the old one.
Fernando,
The newly checked-in tests significantly disturbed the regression picture --
http://www.meta-comm.com/engineering/boost-regression/developer/summary_rele....
A number of people are working hard clearing up all these red and yellow cells so that we can branch for release, and any new failures, even easy-to-fix ones, are not helping.
FYI, I had to correct a number of trivial errors in both tests you've added, simply in order to make them compile. Please try to exercise more caution next time you check things in.
Grrr, I ran the tests locally before committing, but against another copy. I just saw that they all **passed** and missed the fact that the new test was not there. But then I committed the right copy. Anyway, these are just excuses; I'll pay more attention to what I'm doing next time. Really sorry.
BTW: I've updated to pick up your fixes and run the tests locally (correctly this time), and they pass now. So I'll just leave it at that.
Fernando Cacciola SciSoft
participants (3):
- Aleksey Gurtovoy
- Fernando Cacciola
- Gennadiy Rozental