
On Sat, Oct 22, 2005 at 09:21:08PM +0100, Bronek Kozicki wrote:
Oliver Kullmann wrote:
I have the C99 standard (as a book), while the C90 standard (I thought it would be C89?) seems to cost a fortune.
C++ explicitly refers to ISO/IEC 9899:1990. You may call it C89; I will consistently call it C90. The point remains that C++ does not refer to any newer version of the C standard.
I forgot about that technical corrigendum issue, so you are right, it should be C90. The problem with the C++ reference to the C90 standard is that the C90 standard currently costs $103.10 (http://webstore.ansi.org/ansidocstore/product.asp?sku=AS+3955%2D1991) (actually, I believe the price used to be even higher). There are preliminary versions out there, for example http://monolith.consiste.dimap.ufrn.br/~david/ENSEIGNEMENT/SUPPORT/n843.pdf . I looked up the documentation of the fscanf function there, and it looks as I expected: the C99 standard only makes things more precise.
I believe that the C90 standard, which the C++ standard refers to, does not mention UB here.
But then shouldn't it be the case that the C99 standard only makes precise what the older standard left open?
No. The wording in C90 standard is actually important.
Sure, for the language lawyers. But my point here is to argue that it is NOT POSSIBLE in C++ to read integers from an input stream without risking undefined behaviour (if we do not have perfect control over the size of the numbers), and in this respect the C90 standard is simply worse than the C99 standard.
It simply leaves this question not standardized, and the C++ standard does not add anything in this respect.
The only difference between "not standardized" and "undefined behaviour" is that in the latter case we are at least conscious of it, while in the former case we have no clue (and closing one's eyes to a problem doesn't usually solve it).
However, if (or rather when) C++ is updated to refer to the newer version of the C standard, it will have to be considered.
I think we should rather consider it now: if we rely only on the standard, and not on the compiler, then int n; std::cin >> n; should not be used. Within the Boost library, perhaps the most appropriate place to address this issue would be boost::lexical_cast. Perhaps one could have a checked version; or at least the test suite of the Boost library could contain checks whether the compiler correctly handles reading of integers (and other fundamental types) too big to be represented.
Oliver