Interest? Virtual preprocessor functions and more

This email actually introduces a few different concepts, so at least skim the following paragraphs if the first couple of topics don't capture your interest.

As a summary, I am introducing manipulatable preprocessor function and function template descriptions; a macro which automatically generates a "result" template for use with result_of that does not rely on typeof (based on such function descriptions); the possibility of macros which use function descriptions to generate template metafunctions that can tell you whether a function object call is ambiguous or whether there is no match for the provided parameter types; and finally an "object" concept for preprocessor metaprogramming which allows for the simulation of "virtual" or "overloaded" macros. That concept could potentially be used to unify the Boost.Preprocessor containers behind "overloaded" macros, exposing fewer named preprocessor macros to the user and opening up the possibility of more generic preprocessor algorithms.

_____

Preprocessor function descriptions:

While developing my current library, I had to come up with a way to describe functions, member functions, function templates, and member function templates in such a way that they could be examined and manipulated easily by the preprocessor, ideally all with the same generic code. Such descriptions contain named template parameters in the case of a template, a specifier, a return type, named function parameters, and a qualifier in the case of member functions. Using macros, programmers can then easily change the name of the function, change the return type, transform parameters in order to change their types and/or names, and automatically generate function headers.

First, is there any interest in such macros outside of my library? For clarity, usage is as follows:

// where int is the return type, test is the name, float left is the first
// parameter, and double right is the second parameter
#define A_FUNCTION BOOST_FUN( (int), test, ((float),left)((double),right) )

BOOST_FUN_IMPL_HEADER( A_FUNCTION )
{
  return static_cast< int >( left + right );
}

Uses for this primarily include things such as the automatic creation of forwarding functions which may alter parameter types, but it also allows for several more interesting applications.

___________

That brings me to perhaps a more common use. Using this little sublibrary of functions, I am making a result template generator for function objects which allows a user to very easily create a result metafunction which is automatically defined, without the need for typeof. For instance:

struct fun_obj
{
  BOOST_RESULT
  ( ( BOOST_MEM_FUN( (int), operator (), ((float),left)((double),right) ) )
    ( BOOST_CONST_MEM_FUN( (int), operator (), ((float),left)((double),right) ) )
    ( BOOST_MEM_FUN_T( ((typename),L)((typename),R), (float), operator (), ((L),left)((R),right) ) )
  )

  // rest of object body
};

// result is equal to float
typedef result_of< fun_obj( float, float ) >::type result;
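[For readers unfamiliar with the pre-decltype result_of protocol, here is a rough, hand-written sketch of the kind of nested "result" template such a BOOST_RESULT macro would need to generate for the overloads above. The macro's actual output is not shown in this thread, so the specializations below are an illustration only, not its real expansion.]

#include <boost/utility/result_of.hpp>

struct fun_obj
{
    // the three overloads described by the function descriptions above
    int operator ()( float left, double right );
    int operator ()( float left, double right ) const;

    template< typename L, typename R >
    float operator ()( L left, R right );

    // roughly what the generated "result" template boils down to
    template< typename Signature > struct result;

    template< typename This >
    struct result< This ( float, double ) > { typedef int type; };

    template< typename This, typename L, typename R >
    struct result< This ( L, R ) > { typedef float type; };
};

// Without typeof/decltype, boost::result_of consults fun_obj::result.
// (float, float) does not exactly match (float, double), so it falls to the
// template specialization and yields float, as in the example above.
typedef boost::result_of< fun_obj ( float, float ) >::type result;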
____________

Going further, a macro could easily be created which also produces a metafunction that tells you if a function object call is ambiguous, and a macro could be created which produces a metafunction that tells you if a function object call has no match for the provided argument types. Such metafunctions could potentially improve error reporting, particularly for forwarding functions.

___________

Finally, if none of that is of immediate interest, the underlying mechanism for all of this may be.

What I do in order to get polymorphic behavior for different kinds of functions (normal functions, member functions, and templates) is to use a form of object system. The concept is simple, but it is applicable to a large number of areas in preprocessor metaprogramming, and I offer it as a potential fundamental change to the way Boost.Preprocessor is interfaced with: by simulating macro overloading, it allows for fewer named macros with the same functionality and for simpler creation of generic preprocessor algorithms.

The concept is simply this: an object is represented by the form

( OBJECT_ID, (object_data) )

where OBJECT_ID is a unique "type" for the object and object_data is the internal implementation of the container. You "construct" an object by calling a macro which takes in the data and assembles it into the representation just described. Then, internally, when calling "overloaded" macros, the arguments are forwarded to a macro which concatenates the function name to the OBJECT_ID and forwards the arguments once more along with the internal representation of the object. The result is a seemingly "overloaded" or "virtual" macro.

As an example of how this could be extremely useful for Boost.Preprocessor, it allows for an easy way to represent consistent container concepts. As Boost.Preprocessor stands, common container operations for arrays and sequences use different names, making it difficult to work with containers generically as you would in the STL or even in MPL. However, this could all be changed, and done so by merely building on top of code that is already there. As an example, here is a BOOST_PP_FRONT macro that works with either BOOST_PP_ARRAYs or BOOST_PP_SEQs, and could easily be made to work with lists as well. (The BOOST_PP_NULLARY_VFUN dispatcher these macros rely on is sketched after this message.)

// Seq constructor
#define BOOST_PP_SEQ( seq ) ( BOOST_PP_SEQ_OBJ, (seq) )

// Seq FRONT implementation
#define BOOST_PP_SEQ_OBJ_FRONT( seq ) BOOST_PP_SEQ_HEAD( seq )

// Array constructor
#define BOOST_PP_ARRAY( size, tuple ) ( BOOST_PP_ARRAY_OBJ, ((size,tuple)) )

// Array FRONT implementation
#define BOOST_PP_ARRAY_OBJ_FRONT( array ) BOOST_PP_ARRAY_ELEM( 0, array )

// FRONT dispatcher
#define BOOST_PP_FRONT( obj ) BOOST_PP_NULLARY_VFUN( obj, FRONT )

// And here is how the code is used:
#define SOME_ARRAY BOOST_PP_ARRAY( 5, (a,b,c,d,e) )
#define SOME_SEQ BOOST_PP_SEQ( (a)(b)(c)(d)(e) )

int main()
{
  // int a;
  int BOOST_PP_FRONT( SOME_ARRAY );

  // a = 5;
  BOOST_PP_FRONT( SOME_SEQ ) = 5;
}

__________

Of course, FRONT is a simple operation, but the same technique can be used to unify all of the common container operations. For instance, a generic BOOST_PP_FOR_ALL macro could be made, as well as generic transformation algorithms, etc. No longer would you have to be concerned about which preprocessor container type you are dealing with, and you would not have to explicitly attach the container name to the macros you call. With this functionality, Boost.Preprocessor could be described in a much more familiar sense to generic C++ programmers: through concepts, models, and overloaded macros.

--
-Matt Calabrese
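[The BOOST_PP_NULLARY_VFUN dispatcher is not defined in the message above. A minimal sketch of how it could be written on top of existing Boost.Preprocessor primitives is shown here; the macro name and the _I helper are assumptions made for illustration, not part of the library.]

#include <boost/preprocessor/cat.hpp>
#include <boost/preprocessor/tuple/elem.hpp>

// Pull the OBJECT_ID and the parenthesized data out of the object, paste the
// "virtual" function name onto the id, and apply the result to the data.  The
// extra indirection expands the tuple elements first, so that the pasted macro
// name ends up directly next to its argument list when it is rescanned.
#define BOOST_PP_NULLARY_VFUN( obj, name )        \
  BOOST_PP_NULLARY_VFUN_I(                        \
    BOOST_PP_TUPLE_ELEM( 2, 0, obj ), name,       \
    BOOST_PP_TUPLE_ELEM( 2, 1, obj )              \
  )

#define BOOST_PP_NULLARY_VFUN_I( id, name, args ) \
  BOOST_PP_CAT( id, BOOST_PP_CAT( _, name ) ) args

// e.g. BOOST_PP_FRONT( SOME_ARRAY )
//   -> BOOST_PP_NULLARY_VFUN( ( BOOST_PP_ARRAY_OBJ, ((5,(a,b,c,d,e))) ), FRONT )
//   -> BOOST_PP_ARRAY_OBJ_FRONT ((5,(a,b,c,d,e)))
//   -> BOOST_PP_ARRAY_ELEM( 0, (5,(a,b,c,d,e)) )   // a

Each additional "overloaded" operation then only requires one <OBJECT_ID>_<NAME> implementation macro per container type.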

-----Original Message-----
From: boost-bounces@lists.boost.org [mailto:boost-bounces@lists.boost.org] On Behalf Of Matt Calabrese
Everything above here is not suited to being part of the pp-lib. Not that there is anything wrong with it (necessarily), just that it is a *use* of the preprocessor library rather than a potential *part* of it.
Finally, if none of that is of immediate interest, the underlying mechanism for all of this may be. What I do in order to get polymorphic behavior for different kinds of functions (normal functions, member functions, and templates) is to use a form of object system. The concept is simple, but it is applicable to a large number of areas in preprocessor metaprogramming, and I offer it as a potential fundamental change to the way Boost.Preprocessor is interfaced with by simulating macro overloading, allowing for fewer named macros with the same functionality and for simpler creation of generic preprocessor algorithms.
The concept is simply this: An object is represented by the form
( OBJECT_ID, (object_data) )
Where OBJECT_ID is a unique "type" for the object and object_data is the internal implementation of the container.
Chaos does something similar to this with container data types. E.g.

CHAOS_PP_SIZE( (CHAOS_PP_ARRAY) (3, (a, b, c)) )      // 3
CHAOS_PP_SIZE( (CHAOS_PP_LIST) (a, (b, (c, ...))) )   // 3
CHAOS_PP_SIZE( (CHAOS_PP_SEQ) (a)(b)(c) )             // 3
CHAOS_PP_SIZE( (CHAOS_PP_STRING) a b c )              // 3
CHAOS_PP_SIZE( (CHAOS_PP_TUPLE) (a, b, c) )           // 3
The way it works is you "construct" an object by calling a macro which takes in the data and assembles it into the object representation just described; then, internally, when calling "overloaded" macros, arguments are forwarded to a macro
Chaos also provides facilities to simulate overloaded macros, default arguments, and optional arguments. E.g.

#define MACRO_1(a) -a
#define MACRO_2(a, b) a - b
#define MACRO_3(a, b, c) a - b - c

#define MACRO(...) \
    CHAOS_PP_QUICK_OVERLOAD(MACRO_, __VA_ARGS__)(__VA_ARGS__) \
    /**/

MACRO(1)       // -1
MACRO(1, 2)    // 1 - 2
MACRO(1, 2, 3) // 1 - 2 - 3
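[For readers without Chaos at hand, the mechanism behind this kind of overload selection can be sketched with plain variadic macros; the helper names below are made up for illustration and are not part of Chaos or Boost.Preprocessor.]

// count the arguments (here up to three) and paste the count onto the
// macro's base name to pick the matching overload
#define PP_ARG_N( _1, _2, _3, N, ... ) N
#define PP_NARG( ... ) PP_ARG_N( __VA_ARGS__, 3, 2, 1 )

#define PP_CAT_I( a, b ) a ## b
#define PP_CAT( a, b ) PP_CAT_I( a, b )

#define MACRO_1( a ) -a
#define MACRO_2( a, b ) a - b
#define MACRO_3( a, b, c ) a - b - c

#define MACRO( ... ) PP_CAT( MACRO_, PP_NARG( __VA_ARGS__ ) )( __VA_ARGS__ )

MACRO( 1 )       // -1
MACRO( 1, 2 )    // 1 - 2
MACRO( 1, 2, 3 ) // 1 - 2 - 3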
which internally concatenates the function name to the OBJECT_ID and forwards the arguments once more along with the internal representation of the object. The result is a seemingly "overloaded" or "virtual" macro. As an example of how this could be extremely useful for Boost.Preprocessor, it allows for an easy way to represent consistent container concepts.
The problems with using something like this in the pp-lib are efficiency and scalability. It takes time to process sequences of elements generically; there can be significant overhead compared to just directly using an algorithm designed to operate on a particular data type. Nor can you have a simple direct dispatch mechanism, as this requires you to write many algorithms each time that you add a new data type. It also doesn't deal with algorithms that take multiple sequences of elements as input:

CHAOS_PP_APPEND( (CHAOS_PP_SEQ) (a)(b)(c), (CHAOS_PP_LIST) (x, (y, (z, ...))) )
// (CHAOS_PP_SEQ) (a)(b)(c)(x)(y)(z)

The way that Chaos does it is that it defines the algorithms themselves generically, relying only on a small set of core primitives per data type (i.e. HEAD, TAIL, IS_CONS, etc.). Then, when there is a significantly more efficient (or interesting) way that a particular algorithm can be designed for a specific data type, it provides that algorithm non-generically.

The main problem is efficiency. The kinds of programming (relative to the complexity of the problem to be solved--not the complexity of the solution) where generics would be useful are the kinds of programming where the inefficiencies really add up.

Regards,
Paul Mensonides

"Matt Calabrese" <rivorus@gmail.com> writes:
This email actually introduces a few different concepts, so at least skim the following paragraphs if the first couple of topics don't capture your interest.
Suggestion: make your troll-for-interest post much shorter and more concise if you want a higher response rate.
As a summary, I am introducing manipulatable preprocessor function and function template descriptions,
Useful.
a macro which automatically generates a "result" template for use with result_of that does not rely on typeof (based on such function descriptions),
Useful.
the possibility of macros which use function descriptions to generate template metafunctions which can tell you whether a function object call is ambiguous or whether there is no match for the provided parameter types,
Not sure about that one. If it *is* useful, perhaps you can generate the enable_if that will simply make the operator() nonexistent... or are they not always templates? Anyway, this sounds like it overlaps with the parameter lib quite a lot.

--
Dave Abrahams
Boost Consulting
www.boost-consulting.com
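[A hand-written illustration of that suggestion, assuming the operator() in question is a template; this is not output of the proposed macros.]

#include <boost/utility/enable_if.hpp>
#include <boost/type_traits/is_convertible.hpp>

struct fun_obj
{
    // If the arguments cannot convert to the described parameter types,
    // enable_if removes this overload from the overload set instead of
    // producing an error deep inside the call.
    template< typename L, typename R >
    typename boost::enable_if_c<
        boost::is_convertible< L, float >::value &&
        boost::is_convertible< R, double >::value
      , int
    >::type
    operator ()( L left, R right ) { return static_cast< int >( left + right ); }
};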
participants (3)

- David Abrahams
- Matt Calabrese
- Paul Mensonides