
Yes. Currently it's:
is_function_type<const_member_function_pointer,T>::value
However, I will change this. So it will be:
is_const< function_type_class<T>::type >::value
which is what you propose and way more practical. I use a wrapper to model this in many places and I really want to get rid of it.
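To illustrate with a concrete type (sketch only; I'm assuming here that 'function_type_class' yields the cv-qualified class type of a member function pointer):

    // illustration only
    struct X { void f() const; };
    typedef void (X::*mfp)() const;

    // current interface:
    //   is_function_type< const_member_function_pointer, mfp >::value   // true
    // planned interface:
    //   is_const< function_type_class<mfp>::type >::value               // X const -> true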
OK that looks cleaner.
Reading the docs, I don't understand what function_type_signature is for, especially when its members are not recommended for general use. Should this be an implementation detail?
No, but the docs need to be fixed here.
'function_type_signature' is a model of an MPL Random Access Sequence (even an "extensible" one, if the synthesis part has been enabled by #including 'function_type.hpp').
The 'kind' member is the "missing piece of information" to fully describe the encapsulated type (in combination with the subtypes stored in the sequence).
The 'representee' member is necessary to get back the type which it (e.g. a modified sequence) describes.
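A small sketch of how the pieces fit together (illustration only; I assume here that the sequence layout is result type first, then the class type for member function pointers, then the parameter types):

    // illustration only -- plus the appropriate FunctionTypes and MPL headers
    struct X { };
    typedef int (X::* mfp)(long) const;

    typedef function_type_signature<mfp> sig;

    // as an MPL Random Access Sequence (assumed layout):
    //   mpl::at_c<sig, 0>::type   // int  - result type
    //   mpl::at_c<sig, 1>::type   // the (possibly cv-qualified) class type
    //   mpl::at_c<sig, 2>::type   // long - first parameter
    //
    // the extra members:
    //   sig::kind                 // tag: "const member function pointer"
    //   sig::representee          // int (X::*)(long) const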
Further (what follows is a preliminary attempt to improve this recommendation):
'function_type_signature' can be used to achieve similar functionality as provided by the decomposition components.
This application, however, is not recommended (*), because there is no implied assertion that its template argument really is a function type, and because its type members form a less expressive interface for decomposition.
(*) It can be an opportunity for optimization in heavily repetitive situations, to reduce the number of template instantiations required, or to avoid depending on the synthesis part of the library if an extensible sequence is needed that does not have to be reassembled into a function type.
Understood.
function_type: OK I understand what it does, but I'm having a hard time figuring out what it would be for. It's one thing to be able to create a specific function type, but then you have to be able to do something with it ;-)
Like taking the address of an overloaded function or function template instantiation ??
Surely you can always just declare your function type as a typedef in that case, without going through function_type? function_type is only useful in that it unpacks an mpl sequence defining the argument list into the function type, but that's only useful if you have the mpl sequence to begin with. It's those situations I'm finding hard to imagine at present.
Or another one:
Let's say we have some function type and want to create a template argument for Boost.Function with an optimal forwarding signature (i.e. use call_traits<T>::param_type to compute the parameter types). For member function pointers we will parametrize the transformation applied to the class type (and set the default to add a reference):
OK, you mean take a function sig, decompose it to an mpl sequence, transform the sequence, and recompose again? I think I can understand that, but doesn't this get complicated: presumably the transform should be applied only to the arguments and not the return type and class type (if present), which complicates the code no end unless I'm mistaken.
// MPL friendly wrapper for call_traits<T>::param_type
template<typename T>
struct param_type
{
    typedef typename call_traits<T>::param_type type;
};
// Metafunction to compute an optimized forwarding signature
template
< typename FunctionType
, typename ClassTypeTransform = add_reference<_1>
>
struct forward_signature_function_type
  : function_type
    < plain_function
    , typename mpl::copy
      < function_type_signature<FunctionType>
      , mpl::inserter
        < mpl::vector<>
        , mpl::push_back
          < _1
          , mpl::if_
            < mpl::greater
              < mpl::size<_1>
              , typename mpl::if_
                < is_function_type<member_function_pointer,FunctionType>
                , mpl::size_t<1>
                , mpl::size_t<0>
                >::type
              >
            , param_type<_2>
            , mpl::if_
              < mpl::equal_to< mpl::size<_1>, mpl::size_t<1> >
              , mpl::apply1
                < typename mpl::lambda<ClassTypeTransform>::type, _2 >
              , _2
              >
            >
          >
        >
      >::type
    >
{ };
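Assuming the above compiles as intended, usage would look roughly like this (sketch only):

    // with <boost/function.hpp> and <string> included
    typedef forward_signature_function_type< void (std::string, long) >::type sig;
    // sig is (roughly) void (std::string const &, long)
    boost::function<sig> callback;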
OK, OK - I admit, this implementation is ugly.
It's hard to grok. It's basically a whole mpl tutorial in its own right ;-)
What is your evaluation of the implementation?
Looks OK as far as I've been able to judge.
Some comments on the tests: I see lots of tests for is_function_type and very little for the composition and decomposition functions. I'd like to see the same sort of thoroughness that's gone into arity_and_type.cpp applied to the other templates (probably one test for each).
As I plan to extend portability, there will be no way around writing proper regression tests.
Type synthesis is what needs better testing most badly.
Currently, there is not that much to test in terms of decomposition. Correct arity and tag means 'signature_impl' (the heart of all classification and decomposition) works. Of course I could test every type, but it's really just template specialization and there is little chance for typos in generated code.
The "tag logic" is based on binary arithmetic and the part of the encoding that is edited by hand is quite small. Errors in there are caught pretty well by the "example/test hybrids". But they are by no means perfect, yet.
Maybe, but my experience with type traits is that: "If the tests all pass, then you haven't written enough of them yet." Of course a lot of the failures that show up are compiler problems :-(
I agree. In fact, I'm already collecting more illustrative code snippets.
Yes, I spotted that, looks like you've got it in hand.
What is your evaluation of the potential usefulness of the library?
This is the real problem, I can see how a function's return type and arity may be useful, but I'm having a hard time figuring out what access to the type of specific function arguments is good for,
OK. Let's start with the conservative side of the story:
Of course we can always spell things out like this:
template<typename R>
void do_something_with_function(R (*)());

template<typename R, typename A1>
void do_something_with_function(R (*)(A1));

template<typename R, typename A1, typename A2>
void do_something_with_function(R (*)(A1,A2));

//...
And in this case we know all parameter types, of course.
When doing complex things we end up writing several of these cascades (as we usually don't get all the work done in one function), writing code generators to help us with it, and adding configuration options for __stdcall, __fastcall, etc. (or forgetting them). The neighbouring library does the same thing.
What's "neighbouring library"?
If we can completely decompose function types we won't need to do this. We can just use a simple template function instead:
template<typename FuncPtr>
void do_something_with_function(FuncPtr f)
{
    // maybe asserts f really is a function pointer
}
Well, we won't get rid of repetitive parts entirely, because of the invocation.
However, we can reduce them. This code reduction is especially significant when we are supporting cv-qualified member function pointers or several different calling conventions. We don't have to deal with this stuff anymore at all because the invocation doesn't care. The decomposition facility handles it for us.
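A sketch of how the single template might then look (illustration only; the exact decomposition interface may differ, and I assume the result type is the first element of the signature sequence):

    // sketch only -- plus the appropriate FunctionTypes and MPL headers
    template<typename FuncPtr>
    void do_something_with_function(FuncPtr f)
    {
        // (assert here that FuncPtr really is a function or member function pointer)

        typedef function_type_signature<FuncPtr> sig;

        // works regardless of cv qualification or calling convention:
        typedef typename mpl::at_c<sig, 0>::type result_type;
        std::size_t const arity = mpl::size<sig>::value - 1;

        // ... dispatch on arity, invoke f, etc.
    }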
I can see that argument for decomposing the arity, I'm less sure about the others, I've got no imagination I guess ;-)
But it's not only about reducing thousands of lines of code, supporting exotic things like variadic functions or non-standard calling conventions, and having a common point of configuration for all of it - it's about software design: making design choices is much easier when you know there is a choice, so the existence of the library is going to influence future design decisions.
OK.
That's a nice cue to shift over to some more progressive things, like (what I call) "intelligent" callback facilities: the basic idea is to generate functionality that feeds a function with arguments based on its signature.
The interpreter example tries to illustrate this by providing a limited command line interpreter that allows calling previously registered C++ functions and simple functors (lexical_cast is applied to input from a token_iterator to feed the function). Taking this idea a bit further - making it e.g. a Spirit parser with parametrized (per-parameter-type) argument parsing, also handling the result - would make it a matter of minutes to layer a functional scripting language over existing C++ code.
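To show the basic mechanism in isolation (my simplified sketch, not the interpreter example itself; 'invoke_from_tokens' is just a made-up name, and with full decomposition this would be written once for any arity instead of once per arity):

    #include <string>
    #include <vector>
    #include <boost/lexical_cast.hpp>

    // feed a binary function its arguments by converting string tokens
    // to the parameter types taken from its signature
    template<typename R, typename A0, typename A1>
    R invoke_from_tokens(R (*f)(A0, A1), std::vector<std::string> const & args)
    {
        return f( boost::lexical_cast<A0>(args.at(0))
                , boost::lexical_cast<A1>(args.at(1)) );
    }

    // e.g. for  double add(double, double)  and tokens {"2", "3.5"}:
    //   invoke_from_tokens(&add, tokens)   // yields 5.5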
An event processing framework could use the same technique to hand event handler functions the data they are interested in, for example.
OK.
likewise what the mpl-list based composition/decomposition functions are for.
E.g.:
o Setting up a table of function signatures and taking the address of a function overloaded this way by index (see the sketch after this list).
o Concept checking (does my function take "key_t, value_t" n-times ?).
o Doing transformations (like the weird code above does - I guess there are better examples out there).
o Writing a traits class to probe the number of default arguments of a function with a particular name (operator() would be a good one, I guess).
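For the first point, a minimal sketch (plain MPL is enough to show the idea; FunctionTypes would be used to build or transform the signature table):

    #include <boost/mpl/vector.hpp>
    #include <boost/mpl/at.hpp>

    void f(int);
    void f(double, double);

    // a table of function signatures
    typedef boost::mpl::vector< void(int), void(double, double) > signatures;

    // taking the address of the overload selected by index:
    boost::mpl::at_c<signatures, 1>::type * pf = &f;   // picks void f(double, double)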
Better motivating examples may help; however, there seems to be substantial overlap between the functionality offered here and boost::bind, boost::function etc. Is this library intended to simplify those libraries?
Not exclusively.
There is potential to improve existing Boost code, already. But this library must become more portable to be applied universally.
Not much to do for Function, since it doesn't really do a lot with the types (it either uses MemFn or a cast to void*). Once the portability matches, three lines (or so) can be changed to use this library for classification (it'll work for __stdcall and friends then, given FunctionTypes is configured to support them).
Lambda and Phoenix will benefit from using this library - and the ResultOf utility too, of course.
I'm not sure on Bind as a whole, yet. boost::mem_fn (which seems to be a part of Bind) is a candidate, though.
If so what do those libraries authors think?
I'd really love to know!
In fact, AFAIK you are one of them (maintainer, at least) as there is quite some overlap with TypeTraits. However, FunctionTypes is not portable enough for this, yet. And when it is, it will require a lot of precision not to end up with circular dependencies. Currently only remove_cv is used directly so there is a chance it's doable...
The main overlap is with function_traits I guess: that was always a bit of a stop-gap so that function and signals could just get the job done.

I'm not sure how the overlap with is_function/is_member_function_pointer goes, and/or which implementation is the more "lightweight" (probably not much in it). If function_types could produce versions of these traits that passed all the existing tests, then that would be a big portability thumbs up for function_types. I've certainly no objection to recasting those traits in terms of function_types if it simplifies things.

One thing I've been meaning to look into, but haven't had the time yet: how do you handle __stdcall/__fastcall etc.? Last time I tried to write partial specialisations that would select function types with specific calling conventions I couldn't get it to work.

There's also the horrible __thiscall thing: you can't explicitly declare a member function pointer type with __thiscall, but it is usually (but not always) the default. So whether or not void (myclass::*)(); and void (__cdecl myclass::*)(); are the same type depends upon the command line arguments to VC, and you can't tell what those options are at compile time as far as I know.

John.