
On Sun, Aug 23, 2009 at 9:45 PM, Nicholas Howe<nicholas.howe@gmail.com> wrote:
On Sat, Aug 22, 2009 at 7:57 PM, OvermindDL1 <overminddl1@gmail.com> wrote:
Hmm, interesting thought. What if all user overloads were put in an mpl map, and you registered all overloadable functions in a similar way? For the main function caller you could probably do something like variant does for its visitors: recurse down through the types based on the typeid (or more likely an mpl tag, since it will be stuffed into an mpl map anyway), using fusion to resolve the type at runtime and then fusion::invoke or one of its kin to make the call. This would allow it to be very efficient and would remove basically all the indirection costs. It could be made even simpler and more efficient if the user were willing to add a single macro to their class definition. Hmm...
I'm not certain, but it sounds like you might be suggesting resolving the mapping from dynamic parameter types to available methods at compile-time. If so, I explained why I don't like that above. If not, you'll have to go into more depth, as I don't follow. My implementation caches the result of lookups, so I'm not particularly concerned by the cost of the resolution.
I've looked over their documentation, but I'm not very familiar with variant and fusion. I was aware that boost has tuple libraries, but I didn't use them because I didn't think I was doing anything very complicated and, of course, it's easier not to learn. Then there's the excuse that keys have to cross library boundaries, so I want them to be part of the multimethod library, rather than a dependency, so they won't change unless they have to.
I might be able to whip up an example if you want. What it would do is populate a tree of type resolvers (this would actually allow for unlimited type resolution, not just two), with plenty of use of type_traits (is_virtual_child and so forth), all stuffed into templated structs that resolve the lookups. At compile time the tree is created; at the point of a call into the multimethod, a lookup specialized on the types passed to the caller is instantiated; and at runtime that specialized caller walks the tree, comparing the compile-time types against the runtime types. If a type is equal, or is a proper child, it counts as a match and the necessary method is called. It will not fail at runtime, since the base-most test can be checked at compile time as well: if the argument is not a child of that base, it fails at compile time, but otherwise it resolves to a specific function at runtime. I have been creating a lot of things like this recently for binding functions between scripting languages and C++; it makes things so easy once you figure out how to create it. I love Boost.Fusion. :)
On Sat, Aug 22, 2009 at 11:01 PM, Jeffrey Bosboom <jbosboom@uci.edu> wrote:
There's a Stroustrup et al paper on the feasibility of adding multimethods to C++ here: http://research.att.com/~bs/multimethods.pdf . I'm not sure how relevant it is to the current proposal, but it contains information on an experimental implementation that was faster than double dispatch.
I will read that. I believe my proposal defines a simple interface for defining type relationships and registering methods, while leaving plenty of room for implementation changes.
Just a note: if you do read the PDF, he describes a method that makes multimethods cost less than two virtual function calls, and in many cases equal to a single virtual function call program-wide; so basically, much faster than your implementation, but his approach looks like it might require a compiler back-end change.