
Andrey Semashev wrote:
On Wednesday 17 June 2009 14:40:20 Ilya Bobir wrote:
Andrey Semashev wrote:
Because apparently not everyone is happy with the current strategy. If the time period input and/or output facet does error reporting in a non-standard way, wouldn't that be confusing? I thought that if I'm using a type with a "standard" input/output stream, it should behave in a "standard" way. Why would I want to go into the details of time period error reporting? What makes it so different that, as a user, I really should spend my time reading the docs, and why should I break my "normal" usage patterns for time periods or any other date-time object? This is what I'm missing. What is so special about time periods? :)
I'm not sure what you're advocating for. What error reporting strategy do you call "standard"? In the case being discussed an assert is used - is that "standard" or not, in your terms?
I thought there is a "standard" error reporting strategy defined by the C++ standard itself. There is the ios_base::iostate type (ISO 14882-2003 27.4.2.1.3) and a number of member functions of basic_ios (ISO 14882-2003 27.4.4.3) that deal with iostate values. Paragraph 2 of ISO 14882-2003 22.2 says the following:
The put() members make no provision for error reporting. (Any failures of the OutputIterator argument must be extracted from the returned iterator.) The get() members take an ios_base::iostate& argument whose value they ignore, but set to ios_base::failbit in case of a parse error.
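For illustration (this is my own sketch, not code from the library): with the default num_get facet, a parse error is reported by setting failbit through the iostate& argument, and the caller observes it via the stream state:

#include <iostream>
#include <sstream>

int main()
{
    std::istringstream in("not-a-number");
    int value = 0;

    // operator>> delegates to num_get<>::get(), which receives an
    // ios_base::iostate& and sets ios_base::failbit on a parse error.
    in >> value;

    if (in.fail())
        std::cout << "parse error reported through failbit\n";
    else
        std::cout << "parsed: " << value << '\n';
}

The point is that a time period facet following this convention would be usable with the same "if (stream >> value)" idiom as any other type.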
It means that all default input facets set failbit in case of an error. Unless I'm missing something, this is off-topic for this thread. The title of this thread is "Change for the default duration format", yet the question here is how an input facet should report errors. This is why I'm asking what is so special about time periods - I mean, why are these quite distinct things being discussed in the same thread?