
Andy Little writes:
"Aleksey Gurtovoy" <agurtovoy@meta-comm.com> wrote in message news:m13bcp56mg.fsf@meta-comm.com...
Andy Little writes:
IMO the Integral Constant Concept is over specified in the mpl docs. What is the rationale behind the next<..> and prior<..> requirements?
The corresponding operations are handy and available for every integral type?
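For concreteness, here is roughly what those requirements buy for an existing model such as int_ (a minimal sketch, assuming the usual Boost.MPL headers):

#include <boost/mpl/int.hpp>
#include <boost/mpl/next_prior.hpp>
#include <boost/mpl/assert.hpp>
#include <boost/type_traits/is_same.hpp>

using namespace boost;

// next/prior step an integral constant up or down by one
BOOST_MPL_ASSERT(( is_same< mpl::next< mpl::int_<2> >::type, mpl::int_<3> > ));
BOOST_MPL_ASSERT(( is_same< mpl::prior< mpl::int_<2> >::type, mpl::int_<1> > ));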
I am surely no expert on Concepts, but my understanding is that a Concept should specify the minimum requirements,
There is no absolute notion of the minimum set of requirements. For one, the abstract minimum is of course having no requirements at all, and obviously that would hardly give us a useful set of concepts. A meaningful minimum can only be derived from the relevant use cases, and it's pointless to discuss one without having those on the table.
not the maximum or try to include anything that *might* come in handy, which would anyway be impossible.
I didn't say these are included because they *might* come in handy. They are included because they are proven to be handy, are heavily exercised throughout the library, and are almost exclusively the only arithmetic requirements the library itself puts on the concept's models.
next and prior can be trivially specified in terms of the math operations of addition and subtraction.
Assuming that Integral Constant is required to provide the latter, which it doesn't. We can discuss whether it should, but I want to make sure you understand that you are actually arguing towards putting a heavier set of requirements on the concept.
IOW if a type t is an Integral Constant and is Addable then next<t>::type can be generated automatically,
At the price of a (much higher) performance overhead, depending on what Addable is.
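For example, a next built on top of Addition (generic_next below is only an illustrative name, not an MPL metafunction) has to instantiate the whole mpl::plus machinery, whereas int_<N>::next is a single nested typedef:

#include <boost/mpl/plus.hpp>
#include <boost/mpl/int.hpp>

// hypothetical: derive next from Addition instead of requiring it directly;
// works for any T that mpl::plus accepts, but at a higher instantiation cost
template< typename T >
struct generic_next : boost::mpl::plus< T, boost::mpl::int_<1> > {};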
but the basic math, comparison and logical operations aren't in the Integral Constant requirements, at least in boost-1.33.1.
Yep, they are not.
A bool_ is stated to be a model of Integral Constant, but it patently doesn't, and can never, meet the next/prior requirements.
Of course it can.
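For instance, nothing prevents next/prior from being specialized for bool_; a hypothetical sketch (the stock bool_ does not currently provide this, which is the non-conformance discussed later in this thread):

#include <boost/mpl/bool.hpp>
#include <boost/mpl/next_prior.hpp>

namespace boost { namespace mpl {

// hypothetical specializations making bool_ meet the next/prior requirements
template<> struct next< false_ > { typedef true_ type; };
template<> struct prior< true_ > { typedef false_ type; };

}}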
It is arguable whether a bool should be a model of Integral Constant anyway (in fact I believe that there should be a separate Boolean Constant concept, and that the Boost type traits functions should use a model of that where appropriate rather than integral_constant), but that depends on how the Concept is specified. If Addition is specified, for example, then the semantics are different for boolean types (and next and prior are then ambiguous for a bool_),
How so?
a good argument for not making bool_ a model of Integral Constant in that case.
Or a good argument not to burden Integral Constant with Addition.
If a bool_ is a model of Integral Constant, then Addition should not be specified for Integral Constant, and by implication next/prior should not be specified either, as they are just special cases of the more general Concepts.
The current use cases indicate that they are in fact very important special cases, important enough to warrant support on their own.
In my use of integral constants, comparison for equality is the most used operation, followed by arithmetic. I have never used the next/prior functions. To me, math, comparison and logic operations are more likely candidates for Integral Constant requirements.
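For the record, those operations are already available in MPL as free-standing metafunctions; the question is only whether the Integral Constant concept itself should require them. For example:

#include <boost/mpl/int.hpp>
#include <boost/mpl/plus.hpp>
#include <boost/mpl/equal_to.hpp>
#include <boost/mpl/assert.hpp>

using namespace boost::mpl;

// comparison and arithmetic via separate metafunctions, outside the concept
BOOST_MPL_ASSERT(( equal_to< int_<3>, int_<3> > ));
BOOST_MPL_ASSERT(( equal_to< plus< int_<1>, int_<2> >::type, int_<3> > ));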
IMO they are too heavy-weight. I understand your desire to have a concept encompassing these, but so far I don't see a compelling reason why Integral Constant should be such a concept (and I find it somewhat amusing that the title of this thread is "Integral Constant is over specified" :).
Some of these are required by both bools and integers and some not. The point where the requirements or semantic differ is probably the level of detail that the Concepts should be specified at.
These seem to have little to do with an Integral Constant.
IMO that's equivalent to saying that operations of increment and decrement have little to do with a concept of an integer number.
increment and decrement are more precise terms than next and prior, but they can be trivially specified in terms of the usual math operations, addition and subtraction. That is, given that a number is Addable and Subtractable, it is automatically Incrementable and Decrementable.
I believe I've commented on this point earlier.
It should also be stated that these constants aren't perfect models of integers: what happens in the case of math on constants of different value_types, whether a given math operation is possible (without overflow), and so on.
Stated where?
Maybe they should rather be another Concept , such as Iterable?
They _could be_ factored out into a separate concept, but that doesn't automatically mean denying users the guarantee of having convenient access to basic operations like increment/decrement.
They aren't denied anything if, in places where such functionality is required, it is stated that X is required to be a model of Iterable.
I don't think that such a level of granularity will improve the library's usability. Every integral constant _is_ Iterable; whether X exercises that capability or not is often an implementation detail. (That's not to say that having an Iterable concept won't be an improvement.)
FWIW my use has never needed such a requirement. I do however usually need comparisons and math, but again not math in the case of boolean constants.
Where N is a model of Iterable:
next<N>::type
prior<N>::type
Currently, for example, mpl::bool_ is stated to be a model of Integral Constant, but it fails to meet the currently stated requirements, nor can it ever AFAICS.
Surely this could work and would make sense, no?
BOOST_MPL_ASSERT(( is_same< next<false_>::type, true_ > ));
BOOST_MPL_ASSERT(( is_same< prior<true_>::type, false_ > ));
How often do you need to do this?
Rarely, but I'm yet to see a compelling reason to disallow it either.
and what does next<true_>::type mean?
It's unspecified?
true and false aren't numbers. false is not less than true. (boolean != binary, before anyone makes the link)
So it should be impossible to sort a container of booleans, then?
Further the requirements currently include a member ::value_type. According to the TMP book, this is a classic example of a traits blob. (section 2.2).
In a way.
OK. The book goes on to say, "we will avoid this idiom at all costs, since it creates major problems."
I'd have gladly gone with a 'value_type' metafunction if the name wasn't already taken. I'm not a big fan of the 'value_type_of' convention (and there is no precedent for it in the library). Suggestions are welcome.
Surely access should be specified using value_type<C>::type?
It could be (ignoring for the moment the fact that the 'value_type' name is already taken), but it has nothing to do with the presence of the requirement, in whatever form, in the concept.
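Just to make the two forms concrete (value_type_of is used here only as a placeholder name, not an existing MPL metafunction):

#include <boost/mpl/int.hpp>

// current requirement: a nested member
typedef boost::mpl::int_<5>::value_type t1; // int

// hypothetical metafunction-style access wrapping the same nested member
template< typename C >
struct value_type_of
{
    typedef typename C::value_type type;
};

typedef value_type_of< boost::mpl::int_<5> >::type t2; // int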
It was suggested to me that I use the mpl Integral Constant Concept in my own work. I am also attempting to use the mpl Concepts as a guide to good style. I am examining Integral Constant from this viewpoint.
Understood.
I am currently trying to write Concepts for my own types' parameters. Bearing in mind that the inadequate Concept documentation was one of the main reasons that PQS was rejected, I am now trying to get it right. One concept in the quan library is currently called StaticAbstractQuantity:
One goal of this Concept is that it should be possible to make a raw mpl::vector a model of StaticAbstractQuantity. In order to do this I have opted to use 'free' metafunctions for the associated types. For other developers working on the project I need to explain why this is a good idea. Of course developers can point to examples like this and say, mpl doesn't do that. Why should we?
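Roughly what I have in mind, as a sketch only (the names below are illustrative, not the actual quan interface):

#include <boost/mpl/vector_c.hpp>

// associated types are reached through a 'free' metafunction that defaults to
// an intrusive nested member but can be specialized non-intrusively
template< typename Q >
struct dimension_of
{
    typedef typename Q::dimension type;
};

// adapting a raw mpl::vector_c of exponents without touching its definition
typedef boost::mpl::vector_c< int, 1, 1, -2 > force_dimension;

template<>
struct dimension_of< force_dimension >
{
    typedef force_dimension type;
};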
I'm more than willing to work with you on fixing these cases.
Theoretically, we could factor out every single requirement into its own concept, but that isn't necessarily going to improve the quality of the resulting concept language.
Mixing orthogonal Concepts has resulted in the problems with bool_.
Personally, I think that the only problem with bool_ is that its implementation doesn't conform to the docs.
By factoring out the separate Concepts it is possible to see which apply to bool_ and which don't.
Finally, the runtime evaluation requirement could be removed and a refinement of ValueType such as Runtime Evaluable be stated in terms such as:
Where T is a type modelling Runtime Evaluable
value_type<T>::type = t();
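The existing int_ already supports a close analogue of this via default construction and implicit conversion, e.g. (using the current nested ::value_type for illustration):

#include <boost/mpl/int.hpp>

int main()
{
    // runtime evaluation of a compile-time constant
    boost::mpl::int_<5>::value_type const x = boost::mpl::int_<5>();
    return x - 5; // 0
}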
Again, I don't see any value in denying users of the concept access to the functionality that is by definition available for every possible model of the concept.
By making this a separate Concept, I could write algorithms which only have the Runtime Evaluable requirement. They don't need to be models of Integral Constant.
Right; I'd be happy to work on such concept once we have at least one use case for it. In any case, you said that "the runtime evaluation requirement could be removed", and that's what I was (and am) objecting to, not factoring it out into a separate concept per se.
A compile time rational could be evaluated at runtime for example, but it is not a model of Integral Constant.
This value_type<T>::type = t(); is not going to work for a compile time rational. Finding out what will is a job that requires both time and actual, not hypothetical, cases at hand. If you have them, please present and we can work on it.
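For instance, a hypothetical compile-time rational (not an MPL type) is naturally evaluated to a floating-point value at run time, so there is no single value_type to plug into the expression above:

// hypothetical compile-time rational: runtime-evaluable, yet its natural
// result type (double) is not a value_type in the Integral Constant sense
template< long N, long D >
struct static_rational
{
    static double value() { return static_cast<double>(N) / D; }
};

double const one_third = static_rational< 1, 3 >::value();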
If the only requirement is Runtime Evaluable, both rational and int_ are models.
For a type N, the current Integral Constant spec would then read, in those places where these requirements are needed:
N is a model of Iterable, Integral Constant and Runtime Evaluable.
I don't see how this is an improvement, given that IMO every integral constant is by definition Iterable and Runtime Evaluable.
Sure and there is a lot to be gained by identifying these as separate Concepts.
Proposals are welcome!

--
Aleksey Gurtovoy
MetaCommunications Engineering