[proto] minor issue: unnecessary typedefs _arg*

Hi!

This is a minor issue, but nevertheless I'd like to share my opinion on this: the longer I think about _arg0 and its brothers and sisters, the more I get the impression that they should be banned from proto.

1. Especially the typedefs for _arg, _left and _right should be removed from the library, because they duplicate the interface and bring more confusion than convenience. While rereading the docs I tried to replace those typedefs with their arg_c<> substitutes before my inner eye - et voila, the approach to the design is much easier now. I think _left and _right are in for historic reasons (you always start thinking about ET as binary trees, e.g. while reading Veldhuizen's paper), but one thing should have one name in a general purpose library. Also, it is a very good approach to proto to repeat 100 times every morning that any operator is nothing but an oddly written function call. For function calls it makes no sense to talk about left and right. So a minor argument is didactic clarity. The major argument is: no fat interface here! This is not std::string, this is boost::prior(std::etl). Summary: I strongly vote against _arg, _left and _right.

2. From a user's point of view I still cannot see the advantage of using argSOME_NUMBER instead of arg_c<SOME_NUMBER>. Couldn't you make arg_c<XX> a first class citizen of namespace proto? Even if it makes it easier to write the boilerplate parts of proto itself, names based on MACROS are still evil enough to remove them from the _interface_, and for me _argSOME_NUMBER is part of the interface. IMHO templates with integer arguments scale well enough not to introduce PP garbage here. I have the suspicion that _argSOME_NUMBER is only inside because you have all those BOOST_PP_MAKE_MY_CODE_WRITE_ONCE_NEVER_UNDERSTAND_LATER.

Note that this is not a show stopper for me, I propose the ban, but in contrast to _arg, _left and _right I can live with those, since transform::arg_c<I> is at hand and I will use those right from the beginning and ban _arg0 from my own code consequently. <kidding>Eric, you will only have to pay for all those extra "transform::" I have to type all the time. Let's say $0.02 for every occurrence ... OK?</kidding>

Eric, please classify right from the beginning whether issue #2 takes on "bike shed characteristics" and/or disclose your rationale for _argSOME_NUMBER.

Markus

Markus Werle wrote:
Hi!
This is a minor issue, but nevertheless I'd like to share my opinion on this:
The longer I think about _arg0 and its brothers and sisters the more I get the impression that they should be banned from proto.
1. Especially the typedefs for _arg, _left and _right should be removed from the library, because they duplicate the interface and bring more confusion than convenience. <snip>
Summary: I strongly vote against _arg, _left and _right.
I tend to disagree about _arg, _left, and _right. Should I also remove the free functions proto::arg(), proto::left() and proto::right()? It's true, you can use arg_c() for everything, but IMO you have to look closely to see and appreciate the difference between "arg_c<0>(expr)" and "arg_c<1>(expr)", but left(expr) and right(expr) are immediately recognizable and readable. And most C++ operators are binary.
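(To make the comparison concrete: a minimal sketch using only names mentioned in this thread. The header path and the address-based check are my assumptions, not something taken from the Proto docs.)

    #include <boost/proto/proto.hpp>   // assumed header path; the sandbox layout may differ
    #include <cassert>

    namespace proto = boost::proto;

    // For a binary expression, left/right and arg_c<0>/arg_c<1> should name
    // the same children; left(e) and right(e) are just the readable spellings.
    template<typename Expr>
    void check(Expr const &e)
    {
        assert(&proto::left(e)  == &proto::arg_c<0>(e));
        assert(&proto::right(e) == &proto::arg_c<1>(e));
    }

    int main()
    {
        proto::terminal<int>::type a = {1}, b = {2};
        check(a + b);   // a + b is a binary plus expression
        return 0;
    }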
2. From a user's point of view I still cannot see the advantage of using argSOME_NUMBER instead of arg_c<SOME_NUMBER>. Couldn't you make arg_c<XX> a first class citizen of namespace proto?
I can't because proto::arg_c already exists, and it's a free function. However, there is proto::_arg_c<>, which is a synonym for transform::arg_c<>. A word about that. Proto reuses names. proto::X is a function, proto::result_of::X is a metafunction that computes the result type of proto::X, proto::functional::X is the function object equivalent of X, and proto::transform::X is the primitive transform equivalent of X. You can't import all these namespaces without making X ambiguous. For the most part, you don't want to, but transforms are different. You very often want to be able to refer to arg_c the function and arg_c the transform in the same bit of code. Qualification like "transform::arg_c" is tedious and makes transforms hard to read. That's why the transforms have equivalents in the proto namespace with a leading underscore. I've always felt this is a little unsavory, but I haven't thought of anything better. I'm open to suggestions.
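(To spell the scheme out as code: a hedged sketch using left as the X, following the description above; the exact signatures are my assumptions, only the namespace layout comes from the post.)

    #include <boost/proto/proto.hpp>   // assumed header path

    namespace proto = boost::proto;

    template<typename Expr>
    void demo(Expr &e)
    {
        // proto::left -- the free function
        proto::left(e);

        // proto::result_of::left -- metafunction computing the result type of proto::left
        typedef typename proto::result_of::left<Expr>::type left_type;

        // proto::functional::left -- the function object equivalent
        proto::functional::left get_left;
        get_left(e);

        // proto::transform::left -- the primitive transform equivalent, usable
        // inside grammars/transforms; per this thread it is aliased as proto::_left.
    }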
So even if it makes it easier to write the boilerplate parts of proto itself, names based on MACROS are still evil enough to remove them from the _interface_ and for me _argSOME_NUMBER is part of the interface.
Macros? These are not macros, they're typedefs.
IMHO templates with integer arguments scale well enough not to introduce PP garbage here. I have the suspicion that _argSOME_NUMBER is only inside because you have all those BOOST_PP_MAKE_MY_CODE_WRITE_ONCE_NEVER_UNDERSTAND_LATER.
???
Note that this is not a show stopper for me, I propose the ban, but in contrast to _arg, _left and _right I can live with those, since transform::arg_c<I> is at hand and I will use those right from the beginning and ban _arg0 from my own code consequently. <kidding>Eric, you will only have to pay for all those extra "transform::" I have to type all the time. Let's say $0.02 for every occurrence ... OK?</kidding>
So you also feel that typing transform:: over and over is a pain!
Eric, please classify right from the beginning, whether issue #2 takes on "bike shed characteristics" and/or disclose your rationale for _argSOME_NUMBER.
I'm really sorry now about that bike shed comment. :-( I may be wrong, but it's possible that after you write some transforms, your opinion on this issue will change. The qualification and the angle brackets really obscure what a transform is doing. Consider the difference between these two:

Use only transform::arg_c<>:

    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(
                transform::arg_c<0>(transform::arg_c<1>)
              , transform::arg_c<0>(transform::arg_c<2>)
            )
        >
    {};

Use shorter typedefs:

    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(_arg(_arg1), _arg(_arg2))
        >
    {};

They mean the same thing, but the second is a vast improvement, IMO. Proto uses function types to compose primitive transforms. Too many ::'s and <>'s break it up and obscure the meaning. If I get rid of the _argN typedefs, the examples in the documentation will look very imposing.

--
Eric Niebler
Boost Consulting
www.boost-consulting.com

Eric Niebler wrote:
Markus Werle wrote:
Summary: I strongly vote against _arg, _left and _right.
I tend to disagree about _arg, _left, and _right. Should I also remove the free functions proto::arg(), proto::left() and proto::right()?
Yes.
It's true, you can use arg_c() for everything, but IMO you have to look closely to see and appreciate the difference between "arg_c<0>(expr)" and "arg_c<1>(expr)", but left(expr) and right(expr) are immediately recognizable and readable. And most C++ operators are binary.
OTOH I really got lost on my first reading due to the aliases everywhere.
2. From a user's point of view I still cannot see the advantage of using argSOME_NUMBER instead of arg_c<SOME_NUMBER>. Couldn't you make arg_c<XX> a first class citizen of namespace proto?
I can't because proto::arg_c already exists, and it's a free function. However, there is proto::_arg_c<>, which is a synonym for transform::arg_c<>. A word about that. Proto reuses names.
I tend to feel uncomfortable about this, but I have no perfect proposal to make it better. _arg_c<> is fine for me.
proto::X is a function, proto::result_of::X is a metafunction that computes the result type of proto::X, proto::functional::X is the function object equivalent of X, and proto::transform::X is the primitive transform equivalent of X. You can't import all these namespaces without making X ambiguous. For the most part, you don't want to, but transforms are different. You very often want to be able to refer to arg_c the function and arg_c the transform in the same bit of code.
I guess that is why I get lost all the time: I read the code in the docs and cannot guess from the name whether this is a typename or a function or whatever, and which namespace it belongs to. Just a daydream: What happens if it is this way: proto::X is a function, proto::result_of::X_r is a metafunction that computes the result type of proto::X, proto::functional::X_f is the function object equivalent of X, and proto::transform::X_t is the primitive transform equivalent of X. Do we lose anything? Do we run into trouble?
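(Spelled out as code, the daydream might look like the sketch below. The suffixed names are Markus's hypothetical proposal, not part of proto; left stands in for X, they are collected into one namespace only to show that the suffixes alone would disambiguate the three variants, and the underlying proto names are used as described upthread, so the exact spellings may differ from the library.)

    #include <boost/proto/proto.hpp>   // assumed header path

    namespace proto = boost::proto;

    namespace daydream
    {
        // the function itself keeps its name: proto::left(e)

        // "_r" suffix: the metafunction computing the result type of proto::left
        template<typename Expr>
        struct left_r : proto::result_of::left<Expr> {};

        // "_f" suffix: the function object equivalent
        typedef proto::functional::left left_f;

        // "_t" suffix: the primitive transform equivalent
        typedef proto::transform::left left_t;
    }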
Qualification like "transform::arg_c" is tedious and makes transforms hard to read.
I agree. arg_c_t<1> then? hmm.
That's why the transforms have equivalents in the proto namespace with a leading underscore.
I've always felt this is a little unsavory, but I haven't thought of anything better. I'm open to suggestions.
Let me first try this week to live with proto as it is. It is obvious that proto is the 70th stage of rethinking the same thing over and over again, say: approximately perfect, so probably it is more a matter of getting used to it and of how to document it. OTOH I really feel uncomfortable about those identical names in nested namespaces ... I want to rethink this. I once had some nightmare with ADL in a similar context, but I cannot remember exactly how it appeared, and I am too stupid to invent an example where an ambiguity drops in, so probably what I am saying here is FUD only and a matter of taste.
So even if it makes it easier to write the boilerplate parts of proto itself, names based on MACROS are still evil enough to remove them from the _interface_ and for me _argSOME_NUMBER is part of the interface.
Macros? These are not macros, they're typedefs.
I assumed they were generated by a macro up to BOOST_PROTO_MAX_ARITY ... I was preaching against that (nonexistent?) macro. Now I took a look at proto_fwd.hpp and found they are made by hand and stop at 9. So now I really have a problem: what happens if I need an arity of 120?
So you also feel that typing transform:: over and over is a pain!
OTOH today I always explicitly qualify with the namespace in my own code, using a namespace alias if it gets too long ...
Eric, please classify right from the beginning, whether issue #2 takes on "bike shed characteristics" and/or disclose your rationale for _argSOME_NUMBER.
I'm really sorry now about that bike shed comment. :-(
Never mind. It is stored for eternity now, so we can come back to it whenever we want :-P
I may be wrong, but it's possible that after you write some transforms, your opinion on this issue will change. The qualification and the angle brackets really obscure what a transform is doing. Consider the difference between these two:
Use only transform::arg_c<>:

    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(
                transform::arg_c<0>(transform::arg_c<1>)
              , transform::arg_c<0>(transform::arg_c<2>)
            )
        >
    {};
Use shorter typedefs:

    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(_arg(_arg1), _arg(_arg2))
        >
    {};
    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(
                _arg_c<0>(_arg_c<1>)
              , _arg_c<0>(_arg_c<2>)
            )
        >
    {};

is OK for me, but I understand your intention. Those transforms kill me, but I need them. The advantage of those numbers is: proto is a typelist tool. If you see the index you do not mix it up. With _arg I always have this split second where I translate it back to _arg_c<0>, etc. At least during my approach to the docs those aliases did not help me (but I do not fear nested templates).
They mean the same thing, but the second is a vast improvement, IMO. Proto uses function types to compose primitive transforms. Too many ::'s and <>'s break it up and obscure the meaning.
Yes, templates should use [] instead of <>.
If I get rid of the _argN typedefs, the examples in the documentation will look very imposing.
If those _argN scale up to N=300 or more I will not vote against them. Otherwise if 9 is the maximum, I have a problem with that solution. Markus

Markus Werle wrote:
Eric Niebler wrote:
proto::X is a function, proto::result_of::X is a metafunction that computes the result type of proto::X, proto::functional::X is the function object equivalent of X, and proto::transform::X is the primitive transform equivalent of X.
I guess that is why I get lost all the time: I read the code in the docs and cannot guess from the name whether this is a typename or a function or whatever, and which namespace it belongs to.
The docs should use qualification where it could be unclear what a name refers to. I can make a pass through the docs to double-check, but if you can point to the places where a lack of qualification led you into trouble, I'll fix them.
Just a daydream: What happens if it is this way:
proto::X is a function, proto::result_of::X_r is a metafunction that computes the result type of proto::X, proto::functional::X_f is the function object equivalent of X, and proto::transform::X_t is the primitive transform equivalent of X.
Do we lose anything? Do we run into trouble?
Well, it would be inconsistent with established practice within Boost. Fusion, for one, uses a similar naming scheme. Spirit-1 used a naming scheme like the one you suggest (trailing "_p" means parser, trailing "_a" means action, etc.). Spirit-2 is dropping this in favor of namespaces, like Fusion and Proto. I believe that was in response to feedback from Dave Abrahams. Perhaps Dave or Joel could chime in on this issue.

<snip>

Re: the _argN transforms ...
I assumed they were generated by a macro up to BOOST_PROTO_MAX_ARITY ... I was preaching against that (nonexistent?) macro. Now I took a look at proto_fwd.hpp and found they are made by hand and stop at 9. So now I really have a problem: what happens if I need an arity of 120?
Ha! I imagine you'll need longer than the heat death of the universe to compile your program, but ...

... you're absolutely right, for the _argN transforms, N should go up to at least BOOST_PROTO_MAX_ARITY. I'll need to add a little PP magic.
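(For what it's worth, the "little PP magic" could be as small as the fragment below: generating the _argN typedefs from transform::arg_c<N> up to BOOST_PROTO_MAX_ARITY instead of writing them out by hand. The macro name is made up, and the assumption that _argN is simply a typedef for transform::arg_c<N> is based on Eric's earlier description; the fragment would have to sit where transform::arg_c<> and BOOST_PROTO_MAX_ARITY are visible, e.g. in proto_fwd.hpp.)

    #include <boost/preprocessor/cat.hpp>
    #include <boost/preprocessor/repetition/repeat.hpp>

    // hypothetical generator: defines _arg0, _arg1, ..., up to BOOST_PROTO_MAX_ARITY - 1
    #define BOOST_PROTO_DEFINE_ARG_N(z, n, data)                              \
        typedef transform::arg_c< n > BOOST_PP_CAT(_arg, n);

    BOOST_PP_REPEAT(BOOST_PROTO_MAX_ARITY, BOOST_PROTO_DEFINE_ARG_N, ~)
    #undef BOOST_PROTO_DEFINE_ARG_N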
    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(
                _arg_c<0>(_arg_c<1>)
              , _arg_c<0>(_arg_c<2>)
            )
        >
    {};
is OK for me, but I understand your intention. Those transforms kill me, but I need them.
I don't want my code to hurt or kill anybody. What about transforms is killing you? Is it something fixable, in your opinion? Or just the steep learning curve?
The advantage of those numbers is: proto is a typelist tool. If you see the index you do not mix it up. With _arg I always have this split second where I translate it back to _arg_c<0>, etc. At least during my approach to the docs those aliases did not help me (but I do not fear nested templates).
I see your POV, but I see it a different way ... Proto provides a domain-specific language for processing C++ expression trees. Within that domain, unary and binary expressions are the norm, so I provide abstractions for dealing with them in a more natural (IMHO) way. I'd be interested in other people's opinions.
If those _argN scale up to N=300 or more I will not vote against them. Otherwise if 9 is the maximum, I have a problem with that solution.
Agreed.

--
Eric Niebler
Boost Consulting
www.boost-consulting.com

Eric Niebler wrote:
Markus Werle wrote:
Eric Niebler wrote:
proto::X is a function, proto::result_of::X is a metafunction that computes the result type of proto::X, proto::functional::X is the function object equivalent of X, and proto::transform::X is the primitive transform equivalent of X.
I guess that is why I get lost all the time: I read the code in the docs and cannot guess from the name whether this is a typename or a function or whatever, and which namespace it belongs to.
The docs should use qualification where it could be unclear what a name refers to. I can make a pass through the docs to double-check, but if you can point to the places where a lack of qualification led you into trouble, I'll fix them.
I am not sure about the reason why I do not understand transforms. Please give me some more time for this.
Just a daydream: What happens if it is this way:
proto::X is a function, proto::result_of::X_r is a metafunction that computes the result type of proto::X, proto::functional::X_f is the function object equivalent of X, and proto::transform::X_t is the primitive transform equivalent of X.
Do we lose anything? Do we run into trouble?
Well, it would be inconsistent with established practice within Boost. Fusion, for one, uses a similar naming scheme.
Spirit-1 used a naming scheme like the one you suggest (trailing "_p" means parser, trailing "_a" means action, etc.). Spirit-2 is dropping this in favor of namespaces, like Fusion and Proto. I believe that was in response to feedback from Dave Abrahams. Perhaps Dave or Joel could chime in on this issue.
Yes, please! I think this is a very important argument. So this probably is really only a documentation issue, which leads me to what I stated already some time ago: the value of knowing why something is done in boost the way it is done is higher than the mere existence of the library. A document "The design and evolution of boost::XXX" would boost C++ development even more than using the library itself. This is why boost rules say the documentation must contain a design rationale (wink, wink). Summary: you keep your names and I get used to them. Put this part of the library into the "accepted" state, unless someone finds a way to choke the whole thing.
<snip>
Re: the _argN transforms ...
I assumed they were generated by a macro up to BOOST_PROTO_MAX_ARITY ... I was preaching against that (nonexistent?) macro. Now I took a look at proto_fwd.hpp and found they are made by hand and stop at 9. So now I really have a problem: what happens if I need an arity of 120?
Ha! I imagine you'll need longer than the heat death of the universe to compile your program,
Can you remember 1998, when it took the whole night to compile stuff that runs through in 5 minutes today? I remember generating fractals with fractint on a 386 DX 40 with a math coprocessor (boost!). One week, one image, my little son switching off the power supply on Saturday just before a very nice one had finished. Today, bootstrapping gcc 5 times a day is nothing ... So I personally already have plans to push proto to its limits. Just waiting for what Intel or IBM have on their backburner. This is why I am a little bit concerned about the deeply nested namespaces, since names will become very long, but that's probably also a minor problem. Compilers evolve at the same speed.
but ...
... you're absolutely right, for the _argN transforms, N should go up to at least BOOST_PROTO_MAX_ARITY. I'll need to add a little PP magic.
Now I am the one to blame that a clean section of the code gets its MACRO! ;-)
    struct MakePair
      : when<
            function<terminal<make_pair_tag>, terminal<_>, terminal<_> >
          , make_pair(
                _arg_c<0>(_arg_c<1>)
              , _arg_c<0>(_arg_c<2>)
            )
        >
    {};
is OK for me, but I understand your intention. Those transforms kill me, but I need them.
I don't want my code to hurt or kill anybody. What about transforms is killing you? Is it something fixable, in your opinion? Or just the steep learning curve?
The steep learning curve only. I am one of those who appreciates lovely step by step explanations like your FOREACH paper. That's how all things should be explained ... yes, I am asking too much here. I see that transforms will help to implement simplify algorithms, so I really need them. What kills me is to know it is there, but I get no grip on it.
The advantage of those numbers is: proto is a typelist tool. If you see the index you do not mix it up. With _arg I always have this split second where I translate it back to _arg_c<0>, etc. At least during my approach to the docs those aliases did not help me (but I do not fear nested templates).
I see your POV, but I see it a different way ... Proto provides a domain-specific language for processing C++ expression trees.
What I really like in proto is the expr<optag, typelist> architecture. No tree at the front door.
Within that domain, unary and binary expressions are the norm, so I provide abstractions for dealing with them in a more natural (IMHO) way.
OTOH I was just about to request the ban of binary_expr from the docs. For you it is probably nice to have the shortcut, but again this is a shortcut from good old times when expressions had a maximum arity of 2. IMHO the basic expr<> typedef should be provided everywhere in order to ease mental access to the library. I found myself forcing compiler error messages, just to obtain the structure. Please rework "Expression Construction Utilities" such that you simply add the expr<> version where it is missing, e.g. at

    // expr_type is the same as this type:
    typedef proto::binary_expr<
        MyTag
      , proto::terminal<int>::type
      , proto::terminal<char>::type
    >::type expr_type2;

    BOOST_MPL_ASSERT((is_same<expr_type2, expr_type>));

PUT ANOTHER TYPEDEF AND ASSERT HERE OR BETTER: REPLACE
I'd be interested in other people's opinions.
Me, too.
If those _argN scale up to N=300 or more I will not vote against them. Otherwise if 9 is the maximum, I have a problem with that solution.
Agreed.
I think this is the beginning of a beautiful friendship :-) Markus

Markus Werle wrote:
A document "The design and evolution of boost::XXX" would boost C++ development even more than using the library itself. This is why boost rules say the documentation must contain a design rationale (wink, wink).
You mean like this? http://boost-sandbox.sourceforge.net/libs/proto/doc/html/boost_proto/appendi... (wink, wink) I guess I could add a blurb there about the naming convention. <snip>
What about transforms is killing you? Is it something fixable, in your opinion? Or just the steep learning curve?
The steep learning curve only. I am one of those who appreciates lovely step by step explanations like your FOREACH paper. That's how all things should be explained ... yes, I am asking too much here.
I see that transforms will help to implement simplify algorithms, so I really need them. What kills me is to know it is there, but I get no grip on it.
OK, I'll see if I can rework the transformation section to make it more approachable. This is an inherently hard topic, though.
I was just about to request the ban of binary_expr from the docs. For you it is probably nice to have the shortcut, but again this is a shortcut from good old times when expressions had a maximum arity of 2.
That happens not to be the case. And what's wrong with shortcuts?
IMHO the basic expr<> typedef should be provided everywhere in order to ease mental access to the library. I found myself forcing compiler error messages, just to obtain the structure.
Why? There are a lot of largely irrelevant details in the actual type of an expression, like the use of argsN<> and ref_<>. I don't see any advantage in exposing users to all that in the docs, or forcing users to type that in their code.
Please rework "Expression Construction Utilities" such that you simply add the expr<> version where it is missing, e.g. at
    // expr_type is the same as this type:
    typedef proto::binary_expr<
        MyTag
      , proto::terminal<int>::type
      , proto::terminal<char>::type
    >::type expr_type2;
BOOST_MPL_ASSERT((is_same<expr_type2, expr_type>));
PUT ANOTHER TYPEDEF AND ASSERT HERE OR BETTER: REPLACE
I could do that, but these utilities are provided expressly so that users never have to be troubled with the actual expr<> type, which can be complicated. Like I said earlier, in NONE of the examples does the expr<> type appear. If you're dealing with expr<> directly in your code, you're working at a very low abstraction level. And your code is probably wrong ... there are expression types that are not expr<>, such as a type that uses extends<>. Is it possible that your considerable experience with expression templates is leading you to take too keen an interest in the proto::expr<> type? In Proto, it's largely a detail.
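(For readers who have not met extends<> yet: a minimal sketch of the kind of non-expr<> expression type Eric means, i.e. a wrapper built with extends<>. The names calculator_expr and calculator_domain are placeholders of mine, and the domain/generator spelling follows the released Proto documentation, so the version under review may differ in detail.)

    #include <boost/proto/proto.hpp>   // assumed header path

    namespace proto = boost::proto;

    template<typename Expr> struct calculator_expr;

    // a domain that tells proto to wrap every expression it builds in calculator_expr<>
    struct calculator_domain
      : proto::domain< proto::generator<calculator_expr> >
    {};

    // an expression type that is not proto::expr<>, but extends one
    template<typename Expr>
    struct calculator_expr
      : proto::extends<Expr, calculator_expr<Expr>, calculator_domain>
    {
        typedef proto::extends<Expr, calculator_expr<Expr>, calculator_domain> base_type;

        calculator_expr(Expr const &expr = Expr())
          : base_type(expr)
        {}
    };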
I think this is the beginning of a beautiful friendship :-)
Round up the usual suspects. :-)

--
Eric Niebler
Boost Consulting
www.boost-consulting.com

AMDG

Markus Werle wrote:
<snip>
Re: the _argN transforms ...
I assumed they were generated by a macro up to BOOST_PROTO_MAX_ARITY ... I was preaching against that (nonexistent?) macro. Now I took a look at proto_fwd.hpp and found they are made by hand and stop at 9. So now I really have a problem: what happens if I need an arity of 120?
Ha! I imagine you'll need longer than the heat death of the universe to compile your program,
If you have an arity greater than 120 you should be using sequence algorithms like fold. _arg1, _arg2, &c. work fine up to a certain point, but I would run screaming in terror if I saw a program that hard-coded _arg412, e.g.

    struct EvilTransform
      : or_<
            when< nary_expr<tag::plus, vararg<_> >, _make_plus(_arg231, _arg125) >
          , when< nary_expr<tag::minus, vararg<_> >, _make_minus(_arg34, _arg371) >
          , otherwise< _make_modulus(_arg19, _arg98) >
        >
    {};

In Christ,
Steven Watanabe
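(A hedged sketch of the fold-based alternative Steven is pointing at: counting the leaves of an expression of arbitrary arity without naming any _argN. The spelling follows the later released Proto docs, so the sandbox version under review may use different names.)

    #include <boost/proto/proto.hpp>   // assumed header path
    #include <boost/mpl/int.hpp>
    #include <boost/mpl/plus.hpp>

    namespace proto = boost::proto;
    namespace mpl = boost::mpl;

    // a terminal counts as one leaf; any other node folds over all of its
    // children, however many there are, summing the per-child counts
    struct CountLeaves
      : proto::or_<
            proto::when< proto::terminal<proto::_>, mpl::int_<1>() >
          , proto::otherwise<
                proto::fold< proto::_
                           , mpl::int_<0>()
                           , mpl::plus<CountLeaves, proto::_state>() >
            >
        >
    {};

    // usage: CountLeaves()(a + b + c) yields (an object of type) mpl::int_<3>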
participants (3):
- Eric Niebler
- Markus Werle
- Steven Watanabe