[safe_numerics] Last three days
Hi Everyone,

We have the last three days remaining for the safe_numerics review (until March 11th). If you like working under time pressure, this is the perfect moment for doing your review. If you would like to submit a review but feel you will not make it by March 11th, let me know; we can try to extend the review period.

Currently, I have recorded three reviews: two with a yes/no call, from Paul A. Bristow and Steven Watanabe, and one without a yes/no call, from Vicente J. Botet Escriba. If you have submitted a review but were not mentioned in my list, please let me know: it means I must have missed it.

-----

safe_numerics is a set of drop-in replacements for the built-in integer types which guarantee that the results of operations are either correct according to the rules of integer arithmetic or report a diagnosable error (rather than offering modulo arithmetic results, resulting in undefined behavior, or returning incorrect results as in the case of signed vs. unsigned comparisons).

The library can be found on GitHub: https://github.com/robertramey/safe_numerics/
Online docs: http://htmlpreview.github.io/?https://github.com/robertramey/safe_numerics/master/doc/html/index.html

Formal review comments can be posted publicly on the mailing list or the Boost Library Incubator http://blincubator.com, or sent privately to the review manager, that is, myself.

Here are some questions you might want to answer in your review:

- What is your evaluation of the design?
- What is your evaluation of the implementation?
- What is your evaluation of the documentation?
- What is your evaluation of the potential usefulness of the library?
- Did you try to use the library? With what compiler? Did you have any problems?
- How much effort did you put into your evaluation? A glance? A quick reading? In-depth study?
- Are you knowledgeable about the problem domain?

And most importantly: Do you think the library should be accepted as a Boost library?
For more information about Boost Formal Review Process, see: http://www.boost.org/community/reviews.html Regards, &rzej;
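To make the announced guarantee concrete, here is a minimal sketch (illustrative only, NOT safe_numerics' actual API) of what "correct result or diagnosable error" means for addition:

```cpp
#include <limits>
#include <stdexcept>

// Illustrative only -- not library code. A checked addition that either
// returns the mathematically correct result or reports a diagnosable
// error, instead of wrapping modulo 2^n or invoking undefined behavior.
int checked_add(int a, int b) {
    if (b > 0 && a > std::numeric_limits<int>::max() - b)
        throw std::overflow_error("a + b overflows int");
    if (b < 0 && a < std::numeric_limits<int>::min() - b)
        throw std::overflow_error("a + b underflows int");
    return a + b;  // provably in range here, so no undefined behavior
}
```

The library's safe<int> wraps this kind of check behind the ordinary operators, so existing expressions need not change.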
-----Original Message----- From: Boost [mailto:boost-bounces@lists.boost.org] On Behalf Of Andrzej Krzemienski via Boost Sent: 10 March 2017 08:25 To: boost@lists.boost.org Cc: Andrzej Krzemienski Subject: [boost] [safe_numerics] Last three days
FWIW, I feel that this is a complex library and people clearly need more time to consider it. So I favour extending the review period.

Paul

---
Paul A. Bristow
Prizet Farmhouse, Kendal, UK LA8 8AB
+44 (0) 1539 561830
FWIW, I feel that this is a complex library and people clearly need more time to consider it.
For me personally, this is a library really far from my experience; I've never used anything like it before. It looks fine: the design looks good, the implementation the same, and the docs reasonable. But I'm also aware there are alternatives out there in open source, and I don't know how this compares to those. I also know proposals in this domain have gone before WG21 in the past, and I don't know how those compare to this either.

This is why I, and I suspect many others, haven't given a review nor intend to. I have such low confidence in my opinion of this library that I feel it is worthless to say anything about it at all other than "it looks okay". Hence I suspect extending the review period will make little difference.

Niall

--
ned Productions Limited Consulting
http://www.nedproductions.biz/
http://ie.linkedin.com/in/nialldouglas/
2017-03-10 11:00 GMT+01:00 Niall Douglas via Boost
Hence extending the review period I suspect will make little difference.
We'll see if anyone requests for more time. Regards, &rzej;
On 10/03/2017 08:24, Andrzej Krzemienski via Boost wrote:
I'm going to try and write a review, but as well as being short on time, I'm hitting a lot of issues - most notably, at present, that almost all the tests include cxxabi.h, which is a GCC libstdc++ internal header. Is there a reason for this?

John.
On 3/10/17 5:16 AM, John Maddock via Boost wrote:
On 10/03/2017 08:24, Andrzej Krzemienski via Boost wrote:
I'm going to try and write a review, but as well as being short on time, I'm hitting a lot of issues - most notably, at present, that almost all the tests include cxxabi.h, which is a GCC libstdc++ internal header. Is there a reason for this?
Hmm - I just looked into this. Most of them are superfluous. I built and tested with Clang and GCC. I wasn't able to get AppVeyor to work to test on VC; that's why I didn't pick this up sooner.
John.
2017-03-10 16:00 GMT+01:00 Robert Ramey via Boost
Hmm - I just looked into this. Most of them are superfluous. I built and tested with Clang and GCC. I wasn't able to get AppVeyor to work to test on VC; that's why I didn't pick this up sooner.
Is it possible to apply a quick fix to unblock John? Regards, &rzej;
On 3/10/17 12:24 AM, Andrzej Krzemienski via Boost wrote:
Currently, I have recorded three reviews: two with a yes/no call, from Paul A. Bristow and Steven Watanabe, and one without a yes/no call, from Vicente J. Botet Escriba. If you have submitted a review but were not mentioned in my list, please let me know: it means I must have missed it.
The review process has been extremely helpful to me. Besides the multitude of dumb errors, documentation oversights, spelling errors, etc., it has smoked out a number of really fundamental errors and issues. Of course this is discouraging, but none of them are too difficult to fix - though a little time consuming.

Summary of the review so far:

Steven Watanabe went over the whole damn thing with a fine-tooth comb and generated a long list of stuff to fix and address. I seriously doubt anyone will surpass that in level of detail and understanding. Still, he recommended acceptance subject to a long list of fixes, corrections, clarifications, etc. being addressed. I'm in agreement with all of them, and I've been working on these.

Vicente brought up a number of issues related to the library design, concept, and purpose - in particular the difference of approach between this library and other proposals presented to the C++ committee. These approaches aren't really reconcilable, but there's always more than one way to skin the C++ cat. (Note: this is a well-worn saying not meant to be taken literally. Specifically, I'm not referring to the actual CPP cat described here: https://twitter.com/CppCon/status/779074829303504896 and https://www.facebook.com/pg/CppConference/photos/?tab=album&album_id=411120545751971) This is an important but underappreciated topic. It's totally OK for different views to be implemented; in fact, it's actually a necessity. Note that there is a GSoC project mentored by Boost which implements the proposals before the committee. I'm happy to let Darwin's theory sort it all out.

Paul's review was very cursory, but it's clear that I've been able to communicate the idea of the library, how it is meant to be used, and its potential for addressing real-world problems.

I'm still waiting on John Maddock's review. Now that I know from Steven's review that there are a number of serious bugs and oversights, I'm sort of embarrassed to have it reviewed.
But I'm sure he will have something valuable to contribute.

I don't know that we'll get many more reviews on this. Apparently the topic is kind of a turn-off. (It is, actually, if you think about it.) Niall expressed this well. So I would suggest that the review period be extended just through this weekend to Monday. The review manager, on his own initiative, can just state that the review is officially closed but that he's happy to receive reviews through Monday. I don't think that this would be a big problem for anyone.

I'm very, very concerned that there are only a very few reviews (actually really just one!!!). In the past I've railed against the acceptance of libraries with only two reviews!!! I don't really know what else to say about this. I'll just punt to the review manager.

I'm gratified that the review hasn't pivoted off into space with huge discussions about names (aka bikeshedding), library redesign, and other mostly distracting and irrelevant topics. Except for having my personal blunders pointed out in a public forum, it's been a pretty pleasant and enlightening experience. It's things like this that make me love Boost and give me hope for the future of C++ and our craft in general - in spite of massive amounts of bad-quality code and products.

Robert Ramey
I'm very, very concerned that there are only a very few reviews (actually really just one !!!). In the past I've railed against the acceptance of libraries with only two reviews !!! I don't really know what else to say about this. I'll just punt to the review manager.
I think the problem is this: normally we review largely based on the interface and the design - get the design right and the internals usually take care of themselves. However, in this case the design is (hopefully) exceptionally uncontroversial - it looks like an int, smells like an int, and behaves like an int. There really isn't much to get your teeth into there. What really matters is that:

* It's functionally correct.
* It truly is a drop-in replacement for type int, with no nasty surprises.
* Its performance compared to int isn't so dreadful that no one uses it.

Unfortunately reviewing these points requires some exceptionally detailed work: the internals of the library are sufficiently complex, and use enough unfamiliar (to me at least) C++14 features, that this is not an easy task. I confess at present to be deeply surprised at how complex the internals are...

Best, John.
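The "no nasty surprises" point is easy to illustrate with one of the classic traps mentioned in the announcement, the signed/unsigned comparison. This is a sketch of the trap itself, not library code:

```cpp
// Sketch of the signed/unsigned comparison trap (not library code).
// In the built-in expression `s < u` the int operand is converted to
// unsigned, so -1 becomes a huge value; the explicit cast below just
// makes that implicit conversion visible.
bool builtin_less(int s, unsigned u) {
    return static_cast<unsigned>(s) < u;  // what plain `s < u` really does
}

// What a "safe" comparison has to do instead: handle the sign first.
bool correct_less(int s, unsigned u) {
    return s < 0 || static_cast<unsigned>(s) < u;
}
```

A safe integer type must silently do the `correct_less` thing while still "looking like an int" at the call site; that is precisely the kind of behavior a reviewer has to verify case by case.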
On 3/10/17 9:22 AM, John Maddock via Boost wrote:
I'm very, very concerned that there are only a very few reviews (actually really just one !!!). In the past I've railed against the acceptance of libraries with only two reviews !!! I don't really know what else to say about this. I'll just punt to the review manager.
I think the problem is this: normally we review largely based on the interface and the design - get the design right and the internals usually take care of themselves. However, in this case the design is (hopefully) exceptionally uncontroversial - it looks like an int, smells like an int, and behaves like an int. There really isn't much to get your teeth into there. What really matters is that:
* It's functionally correct.
* It truly is a drop-in replacement for type int, with no nasty surprises.
* Its performance compared to int isn't so dreadful that no one uses it.
Unfortunately reviewing these points requires some exceptionally detailed work: the internals of the library are sufficiently complex, and use enough unfamiliar (to me at least) C++14 features, that this is not an easy task.
This is a very believable explanation.
I confess at present to be deeply surprised at how complex the internals are...
If it makes any difference - it started out a lot simpler. Then it became apparent that the issue of performance could not be ignored in the real world. This meant detecting and filtering out redundant checking, on which I was stuck - until C++14 generalized constexpr, which permitted implementation of compile-time integer arithmetic (which actually could/should be a separate library). This in turn motivated making the checked integer routines constexpr. My interest in and concerns for embedded systems and compile-time usage of checked integer operations introduced constexpr check_result - basically a kind of monad one can use at compile time.

All in all it's the composition of several libraries, each of which could/should be a separate Boost library: check_result, checked integer arithmetic, integer interval arithmetic. At the top is a layer which defines a safe_integer type and uses enable_if to overload all the binary and unary operations involving safe integers. The implementation of this last part would be more concise using concepts-lite in C++20 or Paul Fultz's Tick library, but neither of these is in Boost or yet in the standard. All of the above is implemented via constexpr where possible.

So as one can see it's actually pretty simple. It's only now that I've "finished" it that I see how simple it is.

Steven's exhaustive line-by-line review of the code is going to be a very tough act to follow. This appeared the second day of the review. Maybe that intimidated people - it would me. He's pointed out errors which I've agreed to fix, so repeating this process, though it wouldn't hurt, might be somewhat redundant.

Perhaps you might take a different tack. I've spent a little time looking at the Boost multiprecision library with an eye to incorporating it into the safe_numerics testing. In the course of doing this, a couple of interesting things occurred to me.

a) Would safe numeric types inter-operate with safe numeric types? They should - but I haven't actually tested this. Whenever I fail to test something, there's almost always a bug in it.

b) Would safe<T> work if T is one of the types defined by Boost multiprecision? This is unclear to me. safe_numerics presumes that the largest types available are std::uintmax_t and std::intmax_t. It's easy to imagine altering this presumption to use types defined by the user, like boost::uint512_t or ... . I think this would work with minor changes - such a combination would open up whole new territory.

Maybe if you confined the scope of your review to issues such as the above, you could save a lot of time and bring up issues that others are not in a position to do.

Finally, my biggest disappointment in all this has been my inability to get people to take this whole effort seriously - that is, to even admit that there is a problem. It's inexplicable and disheartening to me that one can write something like x + y and not be confident that the result actually equals x + y. Especially in light of the fact that we're using C++ to make flying cars - not just websites. I feel that for the first time in 60 years we're in the position of making demonstrably correct software - and no one cares. It's very frustrating.

I so much appreciate the interaction I have with my Boost soulmates. I'm not sure what I would do without it.

Robert Ramey
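The "filtering out redundant checking" idea described above can be sketched with a constexpr interval computation. This is my own minimal sketch of the principle, not the library's actual machinery: if the result interval of an operation provably fits in the result type, the runtime check can be elided at compile time.

```cpp
#include <cstdint>
#include <limits>

// Sketch (assumption: not the library's real interval code): decide at
// compile time whether the sum of two bounded operands always fits in int,
// in which case no runtime overflow check is needed.
constexpr bool sum_fits_in_int(std::int64_t lo_a, std::int64_t hi_a,
                               std::int64_t lo_b, std::int64_t hi_b) {
    return lo_a + lo_b >= std::numeric_limits<int>::min()
        && hi_a + hi_b <= std::numeric_limits<int>::max();
}

// Adding two int16_t values can never overflow an int: check is redundant.
static_assert(sum_fits_in_int(-32768, 32767, -32768, 32767),
              "int16 + int16 always fits in int");

// Adding two full-range ints may overflow: a runtime check is required.
static_assert(!sum_fits_in_int(std::numeric_limits<int>::min(),
                               std::numeric_limits<int>::max(),
                               std::numeric_limits<int>::min(),
                               std::numeric_limits<int>::max()),
              "int + int may overflow");
```

The same reasoning, propagated through expression trees, is what lets a safe type keep its checks out of the hot path when they are provably unnecessary.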
Best, John.
Update: I now have a version which builds and runs all the tests with VC2017; the fixes required are here: https://github.com/robertramey/safe_numerics/pull/33 Hopefully this will encourage a few other people to give this a try. The one exception is test_cast.cpp, which triggers the MSVC runtime checks due to the use of an uninitialized variable in the test suite; this should obviously be fixed. I also noticed that test_z.cpp has no content: it's all #if'ed out.
Perhaps you might take a different tack. I've spent a little time looking at the boost multi-precision library with the eye of incorporating it into the safe numerics testing. In the course of doing this a couple of interesting things occurred to me.
a) would safe numeric types inter-operate with safe numerics types? They should - but I haven't actually tested this. Whenever I fail to test something - there's almost always a bug in it.
b) Would safe<T> work if T is one of the types defined by Boost multiprecision? This is unclear to me. safe_numerics presumes that the largest types available are std::uintmax_t and std::intmax_t. It's easy to imagine altering this presumption to use types defined by the user, like boost::uint512_t or ... . I think this would work with minor changes - such a combination would open up whole new territory.
There is overlap between the two libraries here: class cpp_int already has built-in overflow checking that allows it to operate as a checked integer type. The issue here is that a generic test for possible overflow is likely to be ruinously expensive compared to built-in support - in fact, built-in support has basically zero cost, since the presence of overflow is already right there in the multiprecision algorithm, as in "we have more bits in the result but nowhere left to put them". One thing I want to at least try to do is figure out whether your extension APIs are suitable for hooking into this, along with assembly-code overflow checks for native types.
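The cost argument can be made concrete with a small illustration (my own, under the assumption that the GCC/Clang overflow intrinsics are available): a portable generic pre-check for multiplication overflow needs a division up front, whereas an intrinsic exposes the hardware overflow flag at roughly the cost of the multiply itself.

```cpp
#include <cstdint>
#include <stdexcept>

// Portable generic pre-check: a division before every multiply.
std::uint64_t checked_mul_portable(std::uint64_t a, std::uint64_t b) {
    if (b != 0 && a > UINT64_MAX / b)  // the division is the expensive part
        throw std::overflow_error("mul overflow");
    return a * b;
}

// GCC/Clang intrinsic: the overflow condition falls out of the multiply
// itself, essentially for free (assumption: compiler supports it).
std::uint64_t checked_mul_intrinsic(std::uint64_t a, std::uint64_t b) {
    std::uint64_t r;
    if (__builtin_mul_overflow(a, b, &r))
        throw std::overflow_error("mul overflow");
    return r;
}
```

This is the same gap John describes between a generic wrapper's check and cpp_int's built-in "no bits left to put them" detection.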
Maybe if you confined the scope of your review to issues such as the above you could save a lot of time and bring up issues that others are not in a position to do.
Finally, my biggest disappointment in all this has been my inability to get people to take this whole effort seriously - that is, to even admit that there is a problem. It's inexplicable and disheartening to me that one can write something like x + y and not be confident that the result actually equals x + y. Especially in light of the fact that we're using C++ to make flying cars - not just websites. I feel that for the first time in 60 years we're in the position of making demonstrably correct software - and no one cares. It's very frustrating.
I so much appreciate the interaction I have with my Boost soulmates. I'm not sure what I would do without it.
Build it and they will come ;)

Best, John.
On 3/10/17 11:25 AM, John Maddock via Boost wrote:
Update: I now have a version which builds and runs all the tests with VC2017, the fixes required are here: https://github.com/robertramey/safe_numerics/pull/33
Hopefully this will encourage a few other people to give this a try.
The one exception is test_cast.cpp, which triggers the MSVC runtime checks due to the use of an uninitialized variable in the test suite; this should obviously be fixed.
I also noticed that test_z.cpp has no content: it's all #if'ed out.
I use test_z as a place to put temporary code used to track down specific errors. Doing it this way means that I don't have to constantly re-make my IDE project (which always ends up a minor project; sorry, CMake fans - CMake doesn't really work, it has to be made to work). Sometimes these mini-projects get promoted into their own test. Other times I just comment them out so I don't run them - but I'm loath to just throw them away.
There is overlap between the two libraries here: class cpp_int already has built-in overflow checking that allows it to operate as a checked integer type. The issue here is that a generic test for possible overflow is likely to be ruinously expensive compared to built-in support - in fact, built-in support has basically zero cost, since the presence of overflow is already right there in the multiprecision algorithm, as in "we have more bits in the result but nowhere left to put them". One thing I want to at least try to do is figure out whether your extension APIs are suitable for hooking into this, along with assembly-code overflow checks for native types.
My interest has been piqued by my current efforts to fix up automatic type promotion. Currently it seems that many compilers support 128-bit integer operations - but std::uintmax_t tops out at 64. Maybe multiprecision specializes the 128-bit integers to use the compiler built-ins. Once I pull on that thread, it's hard to avoid just permitting all the operations at 256, 512 bits, etc., even if they are used only at compile time to figure out when value checking can be skipped. Of course this presupposes that multiprecision is implemented in a constexpr way. ... I'm not in a hurry about this, though. Things are already complicated enough considering the libraries separately.
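The "promote to a wider type and check" idea can be sketched with the GCC/Clang __int128 extension. This is an illustration under my own assumptions (that the extension is available), not library code:

```cpp
#include <cstdint>

// Sketch: do a 64x64 multiply exactly in a 128-bit type (a GCC/Clang
// extension, wider than std::uintmax_t), then test whether the exact
// result fits back into 64 bits.
bool mul_fits_u64(std::uint64_t a, std::uint64_t b, std::uint64_t* out) {
    unsigned __int128 wide = static_cast<unsigned __int128>(a) * b;
    if (wide > UINT64_MAX)
        return false;              // exact result does not fit in 64 bits
    *out = static_cast<std::uint64_t>(wide);
    return true;
}
```

With user-supplied wide types (e.g. from multiprecision), the same pattern could extend past 128 bits, which is the thread Robert is pulling on.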
Build it and they will come ;)
LOL - that's not the way I see the world. My view is that one has to flog the hell out of even the simplest, most obvious ideas to make them happen. Robert Ramey
AMDG On 03/10/2017 12:25 PM, John Maddock via Boost wrote:
Update: I now have a version which builds and runs all the tests with VC2017, the fixes required are here: https://github.com/robertramey/safe_numerics/pull/33
It's a bit more complex than it needs to be:
- msvc_max isn't needed; msvc accepts max(x, y).
- boost::core::demangle does all the #ifdef'ing for you.

In Christ, Steven Watanabe
On 10/03/2017 20:32, Steven Watanabe via Boost wrote:
AMDG
On 03/10/2017 12:25 PM, John Maddock via Boost wrote:
Update: I now have a version which builds and runs all the tests with VC2017, the fixes required are here: https://github.com/robertramey/safe_numerics/pull/33
It's a bit more complex than it needs to be: - msvc_max isn't needed. msvc accepts max(x, y).
Not quite: that was my first fix, but many tests failed with internal compiler errors inside std::max :(
- boost::core::demangle does all the #ifdef'ing for you.
I wish I'd known that! Thanks, John.
2017-03-10 19:56 GMT+01:00 Robert Ramey via Boost
b) Would safe<T> work if T is one of the types defined by Boost multiprecision? This is unclear to me. safe_numerics presumes that the largest types available are std::uintmax_t and std::intmax_t. It's easy to imagine altering this presumption to use types defined by the user, like boost::uint512_t or ... . I think this would work with minor changes - such a combination would open up whole new territory.
FWIW, this is something different, but I tested safe<T> with boost::rational, and at least a simple example works as expected: https://github.com/robertramey/safe_numerics/issues/34 Maybe it is worth adding it to the tests, examples, and docs.
Finally, my biggest disappointment in all this has been my inability to get people to take this whole effort seriously - that is, to even admit that there is a problem. It's inexplicable and disheartening to me that one can write something like x + y and not be confident that the result actually equals x + y. Especially in light of the fact that we're using C++ to make flying cars - not just websites. I feel that for the first time in 60 years we're in the position of making demonstrably correct software - and no one cares. It's very frustrating.
Maybe this is advertised incorrectly, or maybe there is confusion about the expectations from this library. If one sees `safe<int>` one might think, "with this library, using ints will be safer". But what does this mean?

1. Can I use this library to *test* if my operations overflow? - Yes, it can do that.
2. Can I use it like asserts, redefining its semantics in "Debug" and "Release" builds by swapping the policies? - Yes, it can do that; but does the documentation say so?
3. When I use this library, do I no longer have to care about overflow bugs, because the library will take care of them for me (like with GC and memory management)? - No, but is the library clear about this?
4. When I use this library, will I never have an overflow again? - No, but will the users understand that when getting in touch with your library?
5. When I use this library, do I no longer have to check if I am dividing by zero? - OK, the library will do the check, but that would actually reduce the quality if you do not check for this yourself.

There exists a trend where a piece of software is considered "safe" or "correct" only because every function has an artificially widened contract and does an if-statement up front, throwing an exception if its (hidden) narrow contract is violated. Does your library take or promote this approach or not? One cannot immediately see the answer from the documentation.

Regards, &rzej;
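Point 2 above (swapping policies between build configurations) can be sketched like this. The policy names and the NDEBUG dispatch are my own illustration, not the library's actual interface:

```cpp
#include <stdexcept>

// Hypothetical policies (names are mine): what to do when a check fails.
struct throw_on_error {                 // "Debug" style: diagnose loudly
    static void overflow() { throw std::overflow_error("overflow"); }
};
struct ignore_error {                   // "Release" style: trust the code
    static void overflow() {}
};

// A checked addition parameterized on the error policy; user code writes
// add<checked>(a, b) and never changes when the policy is swapped.
template <class Policy>
int add(int a, int b) {
    long long r = static_cast<long long>(a) + b;
    if (r > 2147483647LL || r < -2147483648LL)
        Policy::overflow();
    return static_cast<int>(r);
}

// Choose the policy per build configuration, assert-style:
#ifdef NDEBUG
using checked = ignore_error;
#else
using checked = throw_on_error;
#endif
```

Whether this assert-like usage is intended, and how it is documented, is exactly the question raised above.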
AMDG On 03/11/2017 06:55 AM, Andrzej Krzemienski via Boost wrote:
2017-03-10 19:56 GMT+01:00 Robert Ramey via Boost
4. When I use this library, will I never have an overflow again? - No, but will the users understand that when getting in touch with your library?
In theory, this is what safe
Le 10/03/2017 à 18:22, John Maddock via Boost a écrit :
* It's functionally correct.
* It truly is a drop-in replacement for type int, with no nasty surprises.
* Its performance compared to int isn't so dreadful that no one uses it.
There is one major difference between int and safe<int>: while int operations don't throw and are seen as noexcept, safe<int> operations can throw and cannot be declared noexcept. It would be different if the policies were required to be noexcept, but I know that a lot of people want to be able to throw exceptions. For my part, I prefer to assert in these cases.

Best, Vicente
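Vicente's point can be demonstrated with a small policy sketch (the policy names are hypothetical, not safe_numerics' actual policies): whether a safe operation can be noexcept follows directly from the error policy chosen.

```cpp
#include <cstdlib>
#include <stdexcept>

// Hypothetical error policies (names are mine, not the library's).
struct throwing_policy {
    static void on_overflow() { throw std::overflow_error("overflow"); }
};
struct trapping_policy {
    static void on_overflow() noexcept { std::abort(); }  // assert-like
};

// The noexcept-ness of the operation is derived from the policy.
template <class P>
int safe_add(int a, int b) noexcept(noexcept(P::on_overflow())) {
    long long r = static_cast<long long>(a) + b;
    if (r > 2147483647LL || r < -2147483648LL)
        P::on_overflow();
    return static_cast<int>(r);
}

static_assert(!noexcept(safe_add<throwing_policy>(1, 2)),
              "throwing policy: the operation cannot be noexcept");
static_assert(noexcept(safe_add<trapping_policy>(1, 2)),
              "non-throwing policy: the operation is noexcept, like int");
```

So a noexcept-policy requirement would restore int-like exception specifications, at the price of ruling out the throwing style many users want.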
On 3/10/17 12:24 AM, Andrzej Krzemienski via Boost wrote:
I appreciate the numerous PRs, comments, etc. They are almost all correct, useful, and reasonable, and will be incorporated into the documentation and/or code. Right now I'm doing this locally - there's a lot to do. I haven't updated the repository. This is intentional: I think it's important that we're all working from the same page. This is irksome, as I now know that I'm flogging code which has some pretty serious bugs in it. But for purposes of the review this is the correct decision.

Some time (shortly) after the review ends, I'll update the develop branch. If nothing blows up within the next 10 days or so, I'll merge to master.

I'm making this note so that people who have taken the trouble to submit PRs, reviews, etc. will know that the lack of obvious movement on these fixes is not a sign that they are being ignored. I just want to get this right.

Robert Ramey
participants (7)
- Andrzej Krzemienski
- John Maddock
- Niall Douglas
- Paul A. Bristow
- Robert Ramey
- Steven Watanabe
- Vicente J. Botet Escriba