[phoenix] not playing nice with other libs

The following trivial program fails to compile:

#include <boost/phoenix/core/limits.hpp>
#include <boost/numeric/conversion/converter.hpp>

It generates the following:

In file included from ../../../../branches/release/boost/numeric/conversion/detail/is_subranged.hpp:17,
                 from ../../../../branches/release/boost/numeric/conversion/detail/conversion_traits.hpp:21,
                 from ../../../../branches/release/boost/numeric/conversion/conversion_traits.hpp:13,
                 from ../../../../branches/release/boost/numeric/conversion/converter.hpp:13,
                 from main.cpp:2:
../../../../branches/release/boost/mpl/multiplies.hpp:38: error: wrong number of template arguments (12, should be 5)
../../../../branches/release/boost/mpl/aux_/preprocessed/gcc/times.hpp:68: error: provided for ‘template<class N1, class N2, class N3, class N4, class N5> struct boost::mpl::times’

Phoenix is changing the following fundamental constants:

BOOST_PROTO_MAX_ARITY
BOOST_MPL_LIMIT_METAFUNCTION_ARITY
BOOST_PROTO_MAX_LOGICAL_ARITY
BOOST_RESULT_OF_NUM_ARGS

IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.

--
Eric Niebler
BoostPro Consulting
www.boostpro.com

On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
The following trivial program fails to compile:
#include <boost/phoenix/core/limits.hpp> #include <boost/numeric/conversion/converter.hpp>
It generates the following:
In file included from ../../../../branches/release/boost/numeric/conversion/detail/is_subranged.hpp:17,
                 from ../../../../branches/release/boost/numeric/conversion/detail/conversion_traits.hpp:21,
                 from ../../../../branches/release/boost/numeric/conversion/conversion_traits.hpp:13,
                 from ../../../../branches/release/boost/numeric/conversion/converter.hpp:13,
                 from main.cpp:2:
../../../../branches/release/boost/mpl/multiplies.hpp:38: error: wrong number of template arguments (12, should be 5)
../../../../branches/release/boost/mpl/aux_/preprocessed/gcc/times.hpp:68: error: provided for ‘template<class N1, class N2, class N3, class N4, class N5> struct boost::mpl::times’
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY BOOST_MPL_LIMIT_METAFUNCTION_ARITY BOOST_PROTO_MAX_LOGICAL_ARITY BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric,

This problem is well known. As of now I have no clue how to fix it properly. Let me sketch why I changed these constants:

1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr.
2) Boost.Bind can take up to 10 parameters in the call to boost::bind.

The default BOOST_PROTO_MAX_ARITY is 5.

The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.

I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).

Anybody got any ideas? One idea that comes to my mind is having a phoenix::proto_expr, which is basically a proto::basic_expr. Not sure if that would work, though.

On Mon, May 2, 2011 at 1:18 PM, Thomas Heller <thom.heller@googlemail.com> wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY BOOST_MPL_LIMIT_METAFUNCTION_ARITY BOOST_PROTO_MAX_LOGICAL_ARITY BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
Let me sketch why I changed these constants: 1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
The default BOOST_PROTO_MAX_ARITY is 5.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
Anybody got any ideas?
One idea that comes to my mind is having a phoenix::proto_expr, which is basically a proto::basic_expr. Not sure if that would work, though.
Another solution could be to not define these constants in the accompanying C++ headers but to define them in the Jamfiles of the different projects (if they need to be changed). Boost.Build could then magically select the maximum and use it to build the current TU. However, this would require a major change, probably in every library, and the usage of Boost headers wouldn't be as straightforward as it is now ...

Message of 02/05/11 13:19, from "Thomas Heller", to "Eric Niebler", cc "Boost mailing list", subject: Re: [boost] [phoenix] not playing nice with other libs
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler wrote:
The following trivial program fails to compile:
#include <boost/phoenix/core/limits.hpp> #include <boost/numeric/conversion/converter.hpp>
It generates the following:
In file included from ../../../../branches/release/boost/numeric/conversion/detail/is_subranged.hpp:17,
                 from ../../../../branches/release/boost/numeric/conversion/detail/conversion_traits.hpp:21,
                 from ../../../../branches/release/boost/numeric/conversion/conversion_traits.hpp:13,
                 from ../../../../branches/release/boost/numeric/conversion/converter.hpp:13,
                 from main.cpp:2:
../../../../branches/release/boost/mpl/multiplies.hpp:38: error: wrong number of template arguments (12, should be 5)
../../../../branches/release/boost/mpl/aux_/preprocessed/gcc/times.hpp:68: error: provided for ‘template<class N1, class N2, class N3, class N4, class N5> struct boost::mpl::times’
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY BOOST_MPL_LIMIT_METAFUNCTION_ARITY BOOST_PROTO_MAX_LOGICAL_ARITY BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
Let me sketch why I changed these constants: 1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
The default BOOST_PROTO_MAX_ARITY is 5.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
I agree. A library using another library that has a maximum number of parameters must redefine it if the value is not big enough for its needs.
Anybody got any ideas?
Is there a way to avoid using the preprocessed MPL files (BOOST_MPL_PREPROCESSING_MODE)?

Best,
Vicente

AMDG

On 05/02/2011 06:54 AM, Vicente BOTET wrote:
I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
I agree. A library using another library that has a maximum number of parameters must redefine it if the value is not big enough for its needs.
No. It's impossible to do this correctly.

In Christ,
Steven Watanabe

On Mon, May 2, 2011 at 3:54 PM, Vicente BOTET <vicente.botet@wanadoo.fr> wrote:
Anybody got any ideas? Is there a way to avoid using the preprocessed MPL files (BOOST_MPL_PREPROCESSING_MODE)?
Yes there is; unfortunately, that mode seems to be a little buggy (aka I couldn't get it to work). And I doubt it will solve the problem: the interoperability problem will stay.

--- On Mon, 5/2/11, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY BOOST_MPL_LIMIT_METAFUNCTION_ARITY BOOST_PROTO_MAX_LOGICAL_ARITY BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
What I usually do instead of changing the values of existing constants is to start each header file with, for each fundamental constant used, what I'll dub a "constant guard":

#if BOOST_PP_LESS(BOOST_PROTO_MAX_ARITY, SOME_MINIMUM_VALUE)
#error Please set BOOST_PROTO_MAX_ARITY to SOME_MINIMUM_VALUE or higher
#endif

IOW, I'm just passing the buck to the library user or to the application developer.

HTH,
Cromwell D. Enage

On 02/05/2011 13:18, Thomas Heller wrote:
Let me sketch why I changed these constants: 1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
The default BOOST_PROTO_MAX_ARITY is 5.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What is it in Phoenix that intrinsically requires this? Is it just for the unit tests? Then manually set the right values just for the unit tests.

On Mon, May 2, 2011 at 2:31 PM, Mathias Gaunard < mathias.gaunard@ens-lyon.org> wrote:
On 02/05/2011 13:18, Thomas Heller wrote:
Let me sketch why I changed these constants:
1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
The default BOOST_PROTO_MAX_ARITY is 5.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What is it in Phoenix that intrinsically requires this?
Is it just for the unit tests? Then manually set the right values just for the unit tests.
I'm guessing it was probably a documented default setting in v2, hence I'm guessing this is an effort to maximize backward compatibility.

I can think of 2 reasonable and safe things:
- #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested); or
- define the default Boost.Phoenix preprocessor constants in terms of the other Boost library preprocessor constants, and warn about the backward incompatibility in the documentation.

- Jeff

On 5/3/2011 10:36 AM, Jeffrey Lee Hellrung, Jr. wrote:
On Mon, May 2, 2011 at 2:31 PM, Mathias Gaunard< mathias.gaunard@ens-lyon.org> wrote:
On 02/05/2011 13:18, Thomas Heller wrote:
Let me sketch why I changed these constants:
1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
The default BOOST_PROTO_MAX_ARITY is 5.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What is it in Phoenix that intrinsically requires this?
Is it just for the unit tests? Then manually set the right values just for the unit tests.
I'm guessing it was probably a documented default setting in v2, hence I'm guessing this is an effort to maximize backward compatibility.
I can think of 2 reasonable and safe things:
- #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested); or
- define the default Boost.Phoenix preprocessor constants in terms of the other Boost library preprocessor constants, and warn about the backward incompatibility in the documentation.
I'm with the first: #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested).

Regards,
--
Joel de Guzman
http://www.boostpro.com
http://boost-spirit.com

On Tuesday, May 03, 2011 07:08:54 AM Joel de Guzman wrote:
On 5/3/2011 10:36 AM, Jeffrey Lee Hellrung, Jr. wrote:
On Mon, May 2, 2011 at 2:31 PM, Mathias Gaunard< mathias.gaunard@ens-lyon.org> wrote:
On 02/05/2011 13:18, Thomas Heller wrote:
Let me sketch why I changed these constants:
1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
The default BOOST_PROTO_MAX_ARITY is 5.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What is it in Phoenix that intrinsically requires this?
Is it just for the unit tests? Then manually set the right values just for the unit tests.
It's not just for the unit tests. It's for being API compatible with Boost.Bind and backwards compatible with Phoenix V2.
I'm guessing it was probably a documented default setting in v2, hence I'm guessing this is an effort to maximize backward compatibility.
I can think of 2 reasonable and safe things: - #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested); or - define the default Boost.Phoenix preprocessor constants in terms of the other Boost library preprocessor constants, and warn about the backward incompatibility in the documentation.
I'm with the first: #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested);
How should this work? Which library is going to emit the error? I needed to redefine these constants to make Phoenix work as it is. The "#error"s for when the constants do not satisfy the needs are already there; apparently that doesn't solve the problem. Working with the default LIMITs already predefined by default only shifts the problem. If a library needs to increase the limit, the topic "[boost][library] not playing nice with other libs" will pop up ...

On 03/05/2011 07:36, Thomas Heller wrote:
It's not just for the unit tests. It's for being API compatible with Boost.Bind and backwards compatible with Phoenix V2.
How about making a file <boost/phoenix/compatibility_limits.hpp> that redefines the values necessary to be compatible with those APIs? Then whoever needs compatibility needs to include this file explicitly.
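Such a header might look something like the sketch below. The file itself is hypothetical (only the macro names are the real ones discussed in the thread), and the values are Phoenix V2's limits as described by Thomas; the guard at the top is one way to catch the include-order mistake early:

```cpp
// Hypothetical <boost/phoenix/compatibility_limits.hpp>: raises the limits
// needed for Boost.Bind / Phoenix V2 compatibility. It only works if it is
// included before any other Boost header that reads these constants, since
// the defaults are baked in at first inclusion.
#ifndef BOOST_PHOENIX_COMPATIBILITY_LIMITS_HPP
#define BOOST_PHOENIX_COMPATIBILITY_LIMITS_HPP

// Catch at least the case where the user (or another header) already
// chose a value; a too-late include with the default value cannot be
// detected this way.
#if defined(BOOST_PROTO_MAX_ARITY)
#error compatibility_limits.hpp must be included before any Proto header
#endif

#define BOOST_PROTO_MAX_ARITY 10
#define BOOST_MPL_LIMIT_METAFUNCTION_ARITY 10

#endif
```

The advantage over the status quo is that only translation units that explicitly opt in to V2/Bind compatibility pay the price; everyone else gets the stock defaults.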

On 5/3/2011 9:16 PM, Mathias Gaunard wrote:
On 03/05/2011 07:36, Thomas Heller wrote:
It's not just for the unit tests. It's for being API compatible with Boost.Bind and backwards compatible with Phoenix V2.
How about making a file <boost/phoenix/compatibility_limits.hpp>
that redefines the values necessary to be compatible with those APIs?
Then whoever needs compatibility needs to include this file explicitly.
Very good suggestion. You may also have a PP define to auto-include it:

#define BOOST_PHOENIX_V2_COMPATIBILITY

Regards,
--
Joel de Guzman
http://www.boostpro.com
http://boost-spirit.com

On Mon, May 2, 2011 at 10:36 PM, Thomas Heller <thom.heller@googlemail.com>wrote:
On Tuesday, May 03, 2011 07:08:54 AM Joel de Guzman wrote:
On 5/3/2011 10:36 AM, Jeffrey Lee Hellrung, Jr. wrote:
[...]
I can think of 2 reasonable and safe things:
- #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested); or
- define the default Boost.Phoenix preprocessor constants in terms of the other Boost library preprocessor constants, and warn about the backward incompatibility in the documentation.
I'm with the first: #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested);
How should this work? Which library is going to emit the error?
Boost.Phoenix.
I needed to redefine these constants to make Phoenix work as it is.
I don't think it is a viable solution for Phoenix to define/redefine the preprocessor constants of other Boost libraries (as Steven pointed out). Mathias suggested having a dedicated header for this, presumably to be #include'd before any other headers in a cpp, but I don't know if this is a precedent that should be set. I can imagine quite a dependency mess if other libraries start following this same practice.
The "#error"s if the constants do not satisfy the needs are already there. Apparently that doesn't solve the problem.
How doesn't it solve the problem? I would assume you are #include'ing the appropriate MPL, Fusion, etc. headers prior to checking these constants.

Working with the default LIMITs already predefined by default only shifts the problem.

...to the user. An inconvenience, but better than the current solution, I think...
If a library needs to increase the limit the topic "[boost][library] not playing nice with other libs" will pop up ...
Correction: If a library *attempts* to increase the limit ... (as Boost.Phoenix currently attempts to do). Ideally, the #error message would indicate what the user needs to do to get everything working in harmony. I think, in this case, so that Boost.Phoenix works out of the box, we might have to accept a breaking change in the default preprocessor constants...

- Jeff

Jeffrey Lee Hellrung, Jr. wrote:
I'm guessing it was probably a documented default setting in v2, hence I'm guessing this is an effort to maximize backward compatibility.
I can think of 2 reasonable and safe things: - #error if the preprocessor constants from other libraries aren't satisfactory (as already suggested); or
Sounds a bit inconvenient for the user, but maybe I just misunderstood.
- define the default Boost.Phoenix preprocessor constants in terms of the other Boost library preprocessor constants, and warn about the backward incompatibility in the documentation.
Sounds fine to me, but that doesn't mean much. It depends a bit on how easily the user can fix this himself once he is aware of it, and on the impact on compile time.

Regards,
Thomas

(cross-posting to the Proto list and cc'ing Hartmut.) On 5/2/2011 6:18 PM, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
The following trivial program fails to compile:
#include <boost/phoenix/core/limits.hpp> #include <boost/numeric/conversion/converter.hpp>
It generates the following:
In file included from ../../../../branches/release/boost/numeric/conversion/detail/is_subranged.hpp:17,
                 from ../../../../branches/release/boost/numeric/conversion/detail/conversion_traits.hpp:21,
                 from ../../../../branches/release/boost/numeric/conversion/conversion_traits.hpp:13,
                 from ../../../../branches/release/boost/numeric/conversion/converter.hpp:13,
                 from main.cpp:2:
../../../../branches/release/boost/mpl/multiplies.hpp:38: error: wrong number of template arguments (12, should be 5)
../../../../branches/release/boost/mpl/aux_/preprocessed/gcc/times.hpp:68: error: provided for ‘template<class N1, class N2, class N3, class N4, class N5> struct boost::mpl::times’
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY BOOST_MPL_LIMIT_METAFUNCTION_ARITY BOOST_PROTO_MAX_LOGICAL_ARITY BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
Let me sketch why I changed these constants: 1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
It's still not clear to me why you're changing BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_PROTO_MAX_LOGICAL_ARITY.
The default BOOST_PROTO_MAX_ARITY is 5.
I see. So this is inherently a limitation in Proto. I set Proto's max arity to 5 because more than that causes compile time issues. That's because there are N*M proto::expr::operator() overloads, where N is Proto's max arity and M is Proto's max function call arity. However:

- IIRC, Phoenix doesn't use proto::expr. It uses proto::basic_expr, a lighter weight expression container that has no member operator overloads.
- Compile time could be improved by pre-preprocessing, like MPL. That's something I've been meaning to do anyway.
- The max function-call arity can already be set separately from the max number of child expressions.
- The compile-time problem is a temporary one. Once more compilers have support for variadic templates, all the operator() overloads can be replaced with just one variadic one. Which should be done anyway.

The solution then is in some combination of (a) allowing basic_expr to have a greater number of child expressions than expr, (b) bumping the max arity while leaving the max function call arity alone, (c) pre-preprocessing, (d) adding a variadic operator() for compilers that support it, and (e) just living with worse compile times until compilers catch up with C++0x.

Not sure where the sweet spot is, but I'm pretty sure there is some way we can get Proto to support 10 child expressions for Phoenix's usage scenario. It'll take some work on my end though. Help would be appreciated.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What workaround did you have in mind?
I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
By "user" I meant "end-user" ... a user of Boost. You have to consider that someone may want to use Phoenix and MPL and Numeric and ... all in the same translation unit. We shouldn't make that hard. This proliferation of interdependent constants is a maintenance nightmare. I tend to agree with Jeff Hellrung who said that Phoenix should make do with the defaults and document any backwards incompatibilities and how to fix them. But we should make every effort such that the defaults Just Work.
Anybody got any ideas?
One idea that comes to my mind is having a phoenix::proto_expr, which is a proto::basic_expr, basically. Not sure if that would work though
I don't like special-casing for Phoenix. Other libraries have the same problem.

Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?

--
Eric Niebler
BoostPro Computing
http://www.boostpro.com

On Wed, May 4, 2011 at 10:58 AM, Eric Niebler <eric@boostpro.com> wrote:
(cross-posting to the Proto list and cc'ing Hartmut.) On 5/2/2011 6:18 PM, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote: <snip>
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY BOOST_MPL_LIMIT_METAFUNCTION_ARITY BOOST_PROTO_MAX_LOGICAL_ARITY BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
Let me sketch why I changed these constants: 1) Phoenix V2 has a composite limit of 10: this is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr. 2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
It's still not clear to me why you're changing BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_PROTO_MAX_LOGICAL_ARITY.
I don't remember the exact reasons anymore ... I just checked the Proto code again, and it seems there have been some changes regarding these macros. At the time I wrote the code for these macro redefinitions, it was necessary to make Phoenix compile.
The default BOOST_PROTO_MAX_ARITY is 5.
I see. So this is inherently a limitation in Proto. I set Proto's max arity to 5 because more than that causes compile time issues. That's because there are N*M proto::expr::operator() overloads, where N is Proto's max arity and M is Proto's max function call arity. However:
- IIRC, Phoenix doesn't use proto::expr. It uses proto::basic_expr, a lighter weight expression container that has no member operator overloads.
Correct. But we also need the arity in proto::call, proto::or_, and maybe some others.
- Compile time could be improved by pre-preprocessing, like MPL. That's something I've been meaning to do anyway.
Yes, as we (Hartmut and I) have been saying for quite some time now.
- The max function-call arity can already be set separately from the max number of child expressions.
- The compile-time problem is a temporary one. Once more compilers have support for variadic templates, all the operator() overloads can be replaced with just one variadic one. Which should be done anyway.
Right.
The solution then is in some combination of (a) allowing basic_expr to have a greater number of child expressions than expr, (b) bumping the max arity while leaving the max function call arity alone, (c) pre-preprocessing, (d) adding a variadic operator() for compilers that support it, and (e) just living with worse compile times until compilers catch up with C++0x.
Not sure where the sweet spot is, but I'm pretty sure there is some way we can get Proto to support 10 child expressions for Phoenix's usage scenario. It'll take some work on my end though. Help would be appreciated.
Yes, I was thinking of possible solutions:

1) Splitting the expressions in half, something like this:

proto::basic_expr<
    tag
  , proto::basic_expr<
        sub_tag
      , Child0, ..., Child(BOOST_PROTO_MAX_ARITY)
    >
  , proto::basic_expr<
        sub_tag
      , Child(BOOST_PROTO_MAX_ARITY), ..., Child(BOOST_PROTO_MAX_ARITY * 2)
    >
>

This would only need some additional work on the Phoenix side. Not sure if it's actually worth it ... or even working.

2) Have some kind of completely variadic proto expression. Not by having variadic templates, but by creating the list of children with some kind of cons list. This might require a quite substantial change in Proto; I haven't fully investigated that option.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What workaround did you have in mind?
Calling F::template result<...> directly, basically reimplementing result_of with Phoenix's own limits.
I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
By "user" I meant "end-user" ... a user of Boost. You have to consider that someone may want to use Phoenix and MPL and Numeric and ... all in the same translation unit. We shouldn't make that hard. This proliferation of interdependent constants is a maintenance nightmare.
I agree. I don't think there really is a general solution to that. There have been reports by Michael Caisse of some macro definition nightmare while using MSM together with Spirit. If I remember the details correctly, MSM changes the Proto constants as well. This problem is not really Phoenix specific!
I tend to agree with Jeff Hellrung who said that Phoenix should make do with the defaults and document any backwards incompatibilities and how to fix them. But we should make every effort such that the defaults Just Work.
I agree. One simple solution is to add a big fat warning to the docs saying to include phoenix/core/limits.hpp before anything else. However, that will not solve your reported problem.
Anybody got any ideas?
One idea that comes to my mind is having a phoenix::proto_expr, which is basically a proto::basic_expr. Not sure if that would work, though.
I don't like special-casing for Phoenix. Other libraries have the same problem.
Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?
Yes. Hartmut implemented partial preprocessing for Phoenix using Wave. As an example of how to use it, see this file:

http://svn.boost.org/svn/boost/trunk/boost/phoenix/object/detail/new.hpp

To preprocess Phoenix, call:

wave -o- -DPHOENIX_LIMIT=10 libs/phoenix/preprocess/preprocess_phoenix.cpp

You need to have a file called wave.cfg in your current directory. An example configuration can be found at:

http://svn.boost.org/svn/boost/trunk/libs/phoenix/preprocess/wave.cfg

The reason we hold this solution back is that we think there should be some more generic process for invoking the Wave preprocessing, probably through bjam. The critical point here is where to find the system include files.

On 5/4/2011 6:25 PM, Thomas Heller wrote:
On Wed, May 4, 2011 at 10:58 AM, Eric Niebler <eric@boostpro.com> wrote:
On 5/2/2011 6:18 PM, Thomas Heller wrote:
The default BOOST_PROTO_MAX_ARITY is 5.
I see. So this is inherently a limitation in Proto. I set Proto's max arity to 5 because more than that causes compile time issues. That's because there are N*M proto::expr::operator() overloads, where N is Proto's max arity and M is Proto's max function call arity. However:
- IIRC, Phoenix doesn't use proto::expr. It uses proto::basic_expr, a lighter weight expression container that has no member operator overloads.
Correct. But we also need the arity in: proto::call, proto::or_ and maybe some others
I'd like more details here, please. You never really *need* to increase BOOST_PROTO_MAX_LOGICAL_ARITY because you can nest multiple proto::or_'s and proto::and_'s. And if you need that many, you might think about refactoring your grammar. Proto::or_ can be more efficiently rewritten as proto::switch_, for instance. <snip>
The solution then is in some combination of (a) allowing basic_expr to have a greater number of child expressions than expr, (b) bumping the max arity while leaving the max function call arity alone, (c) pre-preprocessing, (d) adding a variadic operator() for compilers that support it, and (e) just living with worse compile times until compilers catch up with C++0x.
Not sure where the sweet spot is, but I'm pretty sure there is some way we can get Proto to support 10 child expressions for Phoenix's usage scenario. It'll take some work on my end though. Help would be appreciated.
Yes, I was thinking of possible solutions: 1) splitting the expressions in half, something like this:

proto::basic_expr<
    tag
  , proto::basic_expr<
        sub_tag
      , Child0, ..., Child(BOOST_PROTO_MAX_ARITY)
    >
  , proto::basic_expr<
        sub_tag
      , Child(BOOST_PROTO_MAX_ARITY), ..., Child(BOOST_PROTO_MAX_ARITY * 2)
    >
>
This would only need some additional work on the Phoenix side. Not sure if it's actually worth it ... or even working.
Not this. It's like that early prototype of Phoenix where every expression was a terminal and the value was a Fusion sequence of other Proto expressions. You can't use Proto's transforms to manipulate such beasts. Admittedly, Proto is rather inflexible when it comes to how children are stored. My excuse is that I do it to bring down compile times.
2) Have some kind of completely variadic proto expression. Not by having variadic templates but by creating the list of children through some kind of cons list. This might require a quite substantial change in Proto; I haven't fully investigated that option.
You would go from instantiating 1 template per node to instantiating N templates, where N is the number of child nodes. This is then multiplied by the number of nodes in an expression tree. Not good.
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I needed to provide 11 arguments in a "call" to boost::result_of. But I guess a workaround can be found in this specific case.
What workaround did you have in mind?
Calling F::template result<...> directly, basically reimplementing result_of with Phoenix's own limits.
As an implementation detail? Sure, no problem.
I wonder what qualifies as "User". Phoenix is certainly a user of mpl, result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
By "user" I meant "end-user" ... a user of Boost. You have to consider that someone may want to use Phoenix and MPL and Numeric and ... all in the same translation unit. We shouldn't make that hard. This proliferation of interdependent constants is a maintenance nightmare.
I agree. I don't think there really is a general solution to that. There have been reports by Michael Caisse of some macro definition nightmare while using MSM together with Spirit. If I remember the details correctly, MSM changes the Proto constants as well. This problem is not really Phoenix-specific!
Oh, yeah. MSM changes Proto's max arity to be 7. OK, I can see that 5 is too low for folks. Proto needs some work.
I tend to agree with Jeff Hellrung who said that Phoenix should make do with the defaults and document any backwards incompatibilities and how to fix them. But we should make every effort such that the defaults Just Work.
I agree. One simple solution is to add a big fat warning to the docs saying to include phoenix/core/limits.hpp before anything else. However, that will not solve your reported problem.
Right, that's not what I had in mind. The warning would be like, "By default, phoenix::bind only accepts up to X arguments, whereas boost::bind accepts 10. If you want phoenix::bind to accept 10 also, you have to do XYZ." Ditto for any incompatibilities with Phoenix v2.
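In a user's translation unit, the kind of documented workaround under discussion would amount to something like this (a sketch only; the macro spellings are taken from this thread and should be treated as assumptions to be checked against your Boost version's docs):

```cpp
// Raise the limits yourself, before ANY Boost header is seen, so that
// Phoenix and the rest of Boost agree on one consistent set of values.
#define PHOENIX_LIMIT 10            // max arity of Phoenix expressions
#define BOOST_PROTO_MAX_ARITY 10    // must be >= PHOENIX_LIMIT

// Per the warning proposed above: include Phoenix's limits header first.
#include <boost/phoenix/core/limits.hpp>

// Only now pull in other libraries that depend on the same constants.
#include <boost/numeric/conversion/converter.hpp>
```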
Anybody got any ideas?
One idea that comes to my mind is having a phoenix::proto_expr, which is a proto::basic_expr, basically. Not sure if that would work though
I don't like special-casing for Phoenix. Other libraries have the same problem.
Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?
Yes. Hartmut implemented partial preprocessing for phoenix using wave. As an example on how to use it see this file: http://svn.boost.org/svn/boost/trunk/boost/phoenix/object/detail/new.hpp
To preprocess Phoenix, call: wave -o- -DPHOENIX_LIMIT=10 libs/phoenix/preprocess/preprocess_phoenix.cpp
You need to have a file called wave.cfg in your current directory. An example configuration can be found at: http://svn.boost.org/svn/boost/trunk/libs/phoenix/preprocess/wave.cfg
OK, I'll dig into this, hopefully this weekend. FYI, I've committed a change that makes proto::expr::operator() use variadics when they're available. With any luck, pre-preprocessing will get us to the point where I can just bump Proto's max arity to 10 and not suffer any degradation in compile times. I'll also need to investigate why Proto depends on BOOST_MPL_LIMIT_METAFUNCTION_ARITY. -- Eric Niebler BoostPro Computing http://www.boostpro.com

On Wed, May 4, 2011 at 1:32 PM, Eric Niebler <eric@boostpro.com> wrote:
On 5/4/2011 6:25 PM, Thomas Heller wrote:
On Wed, May 4, 2011 at 10:58 AM, Eric Niebler <eric@boostpro.com> wrote:
On 5/2/2011 6:18 PM, Thomas Heller wrote:
The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because i needed to provide 11 arguments in a "call" to boost::result_of. But i guess a workaround can be found in this specific case.
What workaround did you have in mind?
Calling F::template result<...> directly, basically reimplementing result_of for our phoenix' own limits.
As an implementation detail? Sure, no problem.
If you guys like, I could up the default BOOST_RESULT_OF_NUM_ARGS to, say, 15. I don't anticipate that it would cause any problems. Daniel Walker

On 5/5/2011 2:54 AM, Daniel Walker wrote:
If you guys like, I could up the default BOOST_RESULT_OF_NUM_ARGS to, say, 15. I don't anticipate that it would cause any problems.
Hey, that'd be swell. Not sure why this solution didn't occur to me. Looks like for the purpose of Phoenix, you'd only need to bump it by 1 to 11. I'm not sure why 11 instead of 10. Thomas? -- Eric Niebler BoostPro Computing http://www.boostpro.com

On Wed, May 4, 2011 at 11:41 PM, Eric Niebler <eric@boostpro.com> wrote:
On 5/5/2011 2:54 AM, Daniel Walker wrote:
If you guys like, I could up the default BOOST_RESULT_OF_NUM_ARGS to, say, 15. I don't anticipate that it would cause any problems.
Hey, that'd be swell. Not sure why this solution didn't occur to me. Looks like for the purpose of Phoenix, you'd only need to bump it by 1 to 11. I'm not sure why 11 instead of 10. Thomas?
OK, done. I bumped it up to 16 just for good measure, so there'll be plenty of elbow room in the future. Daniel Walker

On 06/05/2011 22:06, Daniel Walker wrote:
On Wed, May 4, 2011 at 11:41 PM, Eric Niebler<eric@boostpro.com> wrote:
On 5/5/2011 2:54 AM, Daniel Walker wrote:
If you guys like, I could up the default BOOST_RESULT_OF_NUM_ARGS to, say, 15. I don't anticipate that it would cause any problems.
Hey, that'd be swell. Not sure why this solution didn't occur to me. Looks like for the purpose of Phoenix, you'd only need to bump it by 1 to 11. I'm not sure why 11 instead of 10. Thomas?
OK, done. I bumped it up to 16 just for good measure, so there'll be plenty of elbow room in the future.
Did you check it didn't increase compile times significantly? boost::result_of is used in a lot of places.

On Sat, May 7, 2011 at 5:48 AM, Mathias Gaunard <mathias.gaunard@ens-lyon.org> wrote:
On 06/05/2011 22:06, Daniel Walker wrote:
On Wed, May 4, 2011 at 11:41 PM, Eric Niebler<eric@boostpro.com> wrote:
On 5/5/2011 2:54 AM, Daniel Walker wrote:
If you guys like, I could up the default BOOST_RESULT_OF_NUM_ARGS to, say, 15. I don't anticipate that it would cause any problems.
Hey, that'd be swell. Not sure why this solution didn't occur to me. Looks like for the purpose of Phoenix, you'd only need to bump it by 1 to 11. I'm not sure why 11 instead of 10. Thomas?
OK, done. I bumped it up to 16 just for good measure, so there'll be plenty of elbow room in the future.
Did you check it didn't increase compile times significantly? boost::result_of is used in a lot of places.
I just ran some tests and didn't notice any increase in compile time. Daniel Walker

On Wed, May 4, 2011 at 10:58 AM, Eric Niebler <eric@boostpro.com> wrote:
(cross-posting to the Proto list and cc'ing Hartmut.)
On 5/2/2011 6:18 PM, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY
BOOST_MPL_LIMIT_METAFUNCTION_ARITY
BOOST_PROTO_MAX_LOGICAL_ARITY
BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
Let me sketch why I changed these constants:
1) Phoenix V2 has a composite limit of 10. This is equivalent to the number of child expressions an expression can hold, and is controlled by BOOST_PROTO_MAX_ARITY, the number of template arguments for proto::expr and proto::basic_expr.
2) Boost.Bind can take up to 10 parameters in the call to boost::bind.
It's still not clear to me why you're changing BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_PROTO_MAX_LOGICAL_ARITY.
BOOST_MPL_LIMIT_METAFUNCTION_ARITY:

./boost/proto/matches.hpp:56:8: error: #error BOOST_MPL_LIMIT_METAFUNCTION_ARITY must be at least as large as BOOST_PROTO_MAX_ARITY

and BOOST_PROTO_MAX_LOGICAL_ARITY:

./boost/proto/proto_fwd.hpp:326: error: provided for ‘template<class G0, class G1, class G2, class G3, class G4, class G5, class G6, class G7> struct boost::proto::and_’

But I guess this can be fixed by simply splitting the call to proto::and_ ....

Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?
I trust Thomas has answered that already. Just ask if you need more information. Regards Hartmut --------------- http://boost-spirit.com

On 5/5/2011 9:36 PM, Hartmut Kaiser wrote:
Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?
I trust Thomas has answered that already. Just ask if you need more information.
Thanks, Hartmut. It works beautifully. One question: what if I don't want the generated files to be *completely* preprocessed? As a trivial example, what if I wanted to leave in simple compiler work-arounds like BOOST_STATIC_CONSTANT and BOOST_DEDUCED_TYPENAME? -- Eric Niebler BoostPro Computing http://www.boostpro.com

On 5/5/2011 9:36 PM, Hartmut Kaiser wrote:
Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?
I trust Thomas has answered that already. Just ask if you need more information.
Thanks, Hartmut. It works beautifully.
One question: what if I don't want the generated files to be *completely* preprocessed? As a trivial example, what if I wanted to leave in simple compiler work-arounds like BOOST_STATIC_CONSTANT and BOOST_DEDUCED_TYPENAME?
Short answer: generally no. Long answer: if we know what we're doing, and we're sure that those macros do not influence any #ifdef's later on, we could suppress the expansion of a predefined list of macros. This is easily possible with the Wave library, but it is not exposed by the Wave driver program. Let me see what I can do to add a corresponding command line option allowing you to tell the Wave driver not to expand a certain macro. Regards Hartmut --------------- http://boost-spirit.com

On 5/5/2011 9:36 PM, Hartmut Kaiser wrote:
Hartmut, you have done some work on a Wave-based tool to help with the pre-preprocessing grunt-work, is that right?
I trust Thomas has answered that already. Just ask if you need more information.
Thanks, Hartmut. It works beautifully.
One question: what if I don't want the generated files to be *completely* preprocessed? As a trivial example, what if I wanted to leave in simple compiler work-arounds like BOOST_STATIC_CONSTANT and BOOST_DEDUCED_TYPENAME?
Short answer: generally no.
Long answer: if we know what we're doing, and we're sure that those macros do not influence any #ifdef's later on, we could suppress the expansion of a predefined list of macros. This is easily possible with the Wave library, but it is not exposed by the Wave driver program. Let me see what I can do to add a corresponding command line option allowing you to tell the Wave driver not to expand a certain macro.
Ok, that was easier than I thought it would be. I added a new command line option --noexpand/-N to Wave, allowing you to specify a macro name which will not be expanded in any case. Its invocation will be copied verbatim to the generated output, just as if it was not defined in the first place. Function-like macros will be copied verbatim with all of their arguments. No macro expansion will be done for any of the arguments in that case, even if the argument list contains macros which would be expanded normally. Use more than one -N for more macro names.

Be careful, though: this not only suppresses the expansion of the macro, it removes the macro from any consideration for Wave itself. Two problems can be caused by this: a) If you suppress macros which are part of an #if expression, you're likely to see preprocessing errors. b) If the suppressed macro influences the evaluation of #if expressions down the road, you might get surprises as well. Overall it should be safe to suppress macros which are used outside of other preprocessor constructs, like BOOST_STATIC_ASSERT(...) et al. Regards Hartmut --------------- http://boost-spirit.com

On 5/7/2011 10:22 PM, Hartmut Kaiser wrote:
Ok, that was easier than I thought it would be. I added a new command line option --noexpand/-N to Wave allowing to specify a macro name which will not be expanded in any case. Its invocation will be copied verbatim to the generated output, just like if it was not defined in the first place. <snip>
Works like a charm. Thanks, Hartmut. -- Eric Niebler BoostPro Computing http://www.boostpro.com

On 5/7/2011 10:22 PM, Hartmut Kaiser wrote:
Function like macros will be copied verbatim with all of their arguments. No macro expansion will be done for any of the arguments in that case, even if the argument list contains macros which would be expanded normally.
Oh, and since I needed macro arguments to be expanded, but the macro itself not to be, I found a simple work-around that you might want to document. I rewrote this:

#define FOO(X) X

to be:

#define FOO2(X) X
#define FOO FOO2

Now I can pass -NFOO -DN=1 to Wave, and FOO(N) will be partially expanded to FOO(1). :-) -- Eric Niebler BoostPro Computing http://www.boostpro.com

On 5/2/2011 6:18 PM, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY
BOOST_MPL_LIMIT_METAFUNCTION_ARITY
BOOST_PROTO_MAX_LOGICAL_ARITY
BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly. <snip>
The Proto pre-preprocessing work on trunk has progressed to the point where code with all the arities at 10 now compiles *faster* than unpreprocessed Proto with the arities at 5. So I've bumped everything to 10. A few things:

1) Phoenix is now broken. My Proto work involved pruning some unnecessary headers, and Phoenix isn't including everything it needs. Thomas, I'll leave this for you to fix.

2) Phoenix is actually setting Proto's max arity to 11, not to 10. I think this is unnecessary. Locally, I un-broke Phoenix and ran its tests with 10, and only one test broke. That was due to a bug in Phoenix. I'm attaching a patch for that.

3) After the patch is applied, Phoenix should be changed such that it includes proto_fwd.hpp and then acts accordingly based on the values of the constants. IMO, that should mean graceful degradation of behavior with lower arities, until a point such that Phoenix cannot function at all, in which case it should #error out.

4) Phoenix no longer needs to change BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_RESULT_OF_NUM_ARGS, but BOOST_RESULT_OF_NUM_ARGS should be given the same treatment as (3).

5) MSM should do the same.

My pre-preprocessing work continues, and all EDSLs that use Proto will benefit from faster compiles. I'd like to thank Hartmut for his work on Wave and Thomas for getting me set up. -- Eric Niebler BoostPro Computing http://www.boostpro.com
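Point (3) might be sketched like this (the macro names and the minimum arity of 3 are assumptions for illustration, not the actual Phoenix source):

```cpp
// Include only Proto's forward header to learn the configured limits,
// then degrade gracefully or bail out.
#include <boost/proto/proto_fwd.hpp>

#ifndef PHOENIX_LIMIT
// Graceful degradation: default Phoenix's own limit to whatever Proto
// was configured with, rather than forcing Proto's constants upward.
# define PHOENIX_LIMIT BOOST_PROTO_MAX_ARITY
#endif

#if PHOENIX_LIMIT > BOOST_PROTO_MAX_ARITY
# error "PHOENIX_LIMIT must not exceed BOOST_PROTO_MAX_ARITY; \
define BOOST_PROTO_MAX_ARITY before including any Boost header."
#endif

#if BOOST_PROTO_MAX_ARITY < 3  // threshold assumed for illustration
# error "Phoenix cannot function with so few child expressions."
#endif
```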

On Sunday, May 08, 2011 02:05:43 PM Eric Niebler wrote:
On 5/2/2011 6:18 PM, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY
BOOST_MPL_LIMIT_METAFUNCTION_ARITY
BOOST_PROTO_MAX_LOGICAL_ARITY
BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly. <snip>
The Proto pre-preprocessing work on trunk has progressed to the point where compiling with all the arities at 10 now compiles *faster* than unpreprocessed Proto with the arities at 5. So I've bumped everything to 10.
A few things:
1) Phoenix is now broken. My Proto work involved pruning some unnecessary headers, and Phoenix isn't including everything it needs. Thomas, I'll leave this for you to fix.
2) Phoenix is actually setting Proto's max arity to 11, not to 10. I think this is unnecessary. Locally, I un-broke Phoenix and ran its tests with 10, and only one test broke. That was due to a bug in Phoenix. I'm attaching a patch for that.
3) After the patch is applied, Phoenix should be changed such that it includes proto_fwd.hpp and then acts accordingly based on the values of the constants. IMO, that should mean graceful degradation of behavior with lower arities, until a point such that Phoenix cannot function at all, in which case it should #error out.
4) Phoenix no longer needs to change BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_RESULT_OF_NUM_ARGS, but BOOST_RESULT_OF_NUM_ARGS should be given the same treatment as (3).
5) MSM should do the same.
My pre-preprocessing work continues, and all EDSLs that use Proto will benefit from faster compiles. I'd like to thank Hartmut for his work on Wave and Thomas for getting me set up.
Eric, thanks for doing the dirty work here! This is awesome! With the latest changes, Phoenix V3 now compiles faster than V2! Wee! Just about to fix everything ...

On Sunday, May 08, 2011 02:05:43 PM Eric Niebler wrote:
On 5/2/2011 6:18 PM, Thomas Heller wrote:
On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY
BOOST_MPL_LIMIT_METAFUNCTION_ARITY
BOOST_PROTO_MAX_LOGICAL_ARITY
BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
Eric, This problem is well known. As of now I have no clue how to fix it properly.
On Sunday, May 08, 2011 02:48:19 PM Thomas Heller wrote:
<snip>
The Proto pre-preprocessing work on trunk has progressed to the point where compiling with all the arities at 10 now compiles *faster* than unpreprocessed Proto with the arities at 5. So I've bumped everything to 10.
Phoenix is up and running now again and should play nice with other libs! Eric, thanks for pointing it all out and actively working on a fix for this!

On 5/8/2011 9:54 PM, Thomas Heller wrote:
Phoenix is up and running now again and should play nice with other libs!
Proto is now fully preprocessed on trunk, and I've just merged it all to release. That means that the corresponding changes to Phoenix and MSM also should be merged now also. Christophe, there's a good chance that you won't see the promised compile-time improvements from this. Compile times for xpressive didn't budge. I think it's because xpressive's compile times are determined primarily by the large number of templates it instantiates, which swamps the PP time. I'm guessing MSM is the same. Oh, well. At least we've fixed the mess with the predefined limits. -- Eric Niebler BoostPro Computing http://www.boostpro.com

2011/5/9 Eric Niebler <eric@boostpro.com>:
On 5/8/2011 9:54 PM, Thomas Heller wrote:
Phoenix is up and running now again and should play nice with other libs!
Proto is now fully preprocessed on trunk, and I've just merged it all to release. That means that the corresponding changes to Phoenix and MSM also should be merged now also.
Christophe, there's a good chance that you won't see the promised compile-time improvements from this. Compile times for xpressive didn't budge. I think it's because xpressive's compile times are determined primarily by the large number of templates it instantiates, which swamps the PP time. I'm guessing MSM is the same. Oh, well. At least we've fixed the mess with the predefined limits.
You're right, there is no change in compile-time, which is still ok for me because I got rid of the limits at no cost. So it's still a noticeable gain for me. Thanks a lot! Christophe

On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler@gmail.com> wrote:
Phoenix is changing the following fundamental constants:
BOOST_PROTO_MAX_ARITY
BOOST_MPL_LIMIT_METAFUNCTION_ARITY
BOOST_PROTO_MAX_LOGICAL_ARITY
BOOST_RESULT_OF_NUM_ARGS
IMO, Phoenix shouldn't be touching these. It should work as best it can with the default values. Users who are so inclined can change them.
On 5/2/2011 6:18 PM, Thomas Heller wrote:
Eric, This problem is well known. As of now I have no clue how to fix it properly. <snip>
The Proto pre-preprocessing work on trunk has progressed to the point where compiling with all the arities at 10 now compiles *faster* than unpreprocessed Proto with the arities at 5. So I've bumped everything to 10.
A few things:
1) Phoenix is now broken. My Proto work involved pruning some unnecessary headers, and Phoenix isn't including everything it needs. Thomas, I'll leave this for you to fix.
2) Phoenix is actually setting Proto's max arity to 11, not to 10. I think this is unnecessary. Locally, I un-broke Phoenix and ran its tests with 10, and only one test broke. That was due to a bug in Phoenix. I'm attaching a patch for that.
3) After the patch is applied, Phoenix should be changed such that it includes proto_fwd.hpp and then acts accordingly based on the values of the constants. IMO, that should mean graceful degradation of behavior with lower arities, until a point such that Phoenix cannot function at all, in which case it should #error out.
4) Phoenix no longer needs to change BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_RESULT_OF_NUM_ARGS, but BOOST_RESULT_OF_NUM_ARGS should be given the same treatment as (3).
5) MSM should do the same.
My pre-preprocessing work continues, and all EDSLs that use Proto will benefit from faster compiles. I'd like to thank Hartmut for his work on Wave and Thomas for getting me set up.
I'm glad you did the changes because this improves compile times for everybody using Proto! This will further lessen the burden for people currently scared off. Just a question: what's your rationale of limiting the generated pp headers to an arity of 10? MPL and Phoenix have it set up for higher arities as well (as you probably know). Regards Hartmut --------------- http://boost-spirit.com

On 5/8/2011 11:08 PM, Hartmut Kaiser wrote:
Just a question: what's your rationale of limiting the generated pp headers to an arity of 10? MPL and Phoenix have it set up for higher arities as well (as you probably know).
Phoenix doesn't have it set higher. Or, it did, but it was a bug. Perhaps you meant Fusion. Yes, it's higher for Fusion and MPL. The reason for 10 and not something higher (yet) is that there are N^2 overloads of expr::operator() on compilers that don't support variadic templates. And with BLL and Bind and Phoenix, there's a history of supporting arities up to 10 and no more. I'm balancing keeping it fast and light(-ish) and making it useful in the real world. -- Eric Niebler BoostPro Computing http://www.boostpro.com

On 5/8/2011 11:08 PM, Hartmut Kaiser wrote:
Just a question: what's your rationale of limiting the generated pp headers to an arity of 10? MPL and Phoenix have it set up for higher arities as well (as you probably know).
Phoenix doesn't have it set higher. Or, it did, but it was a bug. Perhaps you meant Fusion. Yes, it's higher for Fusion and MPL. The reason for 10 and not something higher (yet) is because there is N^2 overloads of expr::operator() on compilers that don't support variadic templates. And with BLL and Bind and Phoenix, there's a history of supporting arities up to 10 and no more. I'm balancing keeping it fast and light(-ish) and making it useful in the real world.
Hmmm, maybe I misunderstand the situation, but MPL and Phoenix have 5 different versions of preprocessed headers which will be used depending on the LIMITs specified by the user. No unnecessary overhead is created this way. For any LIMIT <= 10 Phoenix uses one set of pp files, for LIMITs <= 20 the next set, etc. This surely creates some additional burden for the author as you have to run wave 5 times, but that's it. Regards Hartmut --------------- http://boost-spirit.com
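The tiered scheme Hartmut describes might look like this (file names hypothetical; MPL's numbered preprocessed headers use the same trick):

```cpp
// Pick the smallest pregenerated header that covers the user's limit,
// so the default case pays only for arity 10 while larger limits still
// get preprocessed (not on-the-fly) headers.
#if PHOENIX_LIMIT <= 10
# include <boost/phoenix/detail/preprocessed/expression_10.hpp>
#elif PHOENIX_LIMIT <= 20
# include <boost/phoenix/detail/preprocessed/expression_20.hpp>
#elif PHOENIX_LIMIT <= 30
# include <boost/phoenix/detail/preprocessed/expression_30.hpp>
#elif PHOENIX_LIMIT <= 40
# include <boost/phoenix/detail/preprocessed/expression_40.hpp>
#elif PHOENIX_LIMIT <= 50
# include <boost/phoenix/detail/preprocessed/expression_50.hpp>
#else
# error "PHOENIX_LIMIT too large for the pregenerated headers"
#endif
```

This is the "run wave 5 times" burden on the author: one generated set per tier, selected at include time.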

On 5/8/2011 11:36 PM, Hartmut Kaiser wrote:
Hmmm, maybe I misunderstand the situation, but MPL and Phoenix have 5 different versions of preprocessed headers which will be used depending on the LIMITs specified by the user. No unnecessary overhead is created this way. For any LIMIT <= 10 Phoenix uses one set of pp files, for LIMITs <= 20 the next set, etc.
This surely creates some additional burden for the author as you have to run wave 5 times, but that's it.
Ah, I see what you mean. Yes, I could do that. I'll add it to the list. -- Eric Niebler BoostPro Computing http://www.boostpro.com

My pre-preprocessing work continues, and all EDSLs that use Proto will benefit from faster compiles. I'd like to thank Hartmut for his work on Wave and Thomas for getting me set up.
First results: Spirit/Phoenix V2, MSVC, speedup ~8% (no opt), ~6% (opt) Spirit/Phoenix V3, MSVC, speedup ~5% (no opt), ~4% (opt), includes full preprocessed headers for Phoenix Thanks! Regards Hartmut --------------- http://boost-spirit.com

On Sunday, May 08, 2011 07:36:32 PM Hartmut Kaiser wrote:
My pre-preprocessing work continues, and all EDSLs that use Proto will benefit from faster compiles. I'd like to thank Hartmut for his work on Wave and Thomas for getting me set up.
First results:
Spirit/Phoenix V2, MSVC, speedup ~8% (no opt), ~6% (opt) Spirit/Phoenix V3, MSVC, speedup ~5% (no opt), ~4% (opt), includes full preprocessed headers for Phoenix

Just to give you an idea of the impact of the recent events:

g++ -I. libs/phoenix/test/core/primitives_tests.cpp -c
real 0m1.525s user 0m1.370s sys 0m0.137s
----------------------------------------------------------------------------
g++ -I. libs/phoenix/test/core/primitives_tests.cpp -c -DBOOST_PROTO_DONT_USE_PREPROCESSED_FILES -DBOOST_PHOENIX_DONT_USE_PREPROCESSED_FILES
real 0m2.626s user 0m2.463s sys 0m0.157s
----------------------------------------------------------------------------
g++ -I. libs/phoenix/test/core/primitives_tests.cpp -c -DBOOST_PROTO_DONT_USE_PREPROCESSED_FILES
real 0m2.414s user 0m2.283s sys 0m0.123s
----------------------------------------------------------------------------
g++ -I. libs/phoenix/test/core/primitives_tests.cpp -c -DBOOST_PHOENIX_DONT_USE_PREPROCESSED_FILES
real 0m1.729s user 0m1.607s sys 0m0.117s
----------------------------------------------------------------------------
g++ -I. libs/spirit/phoenix/test/core/primitives_tests.cpp -c
real 0m2.291s user 0m2.123s sys 0m0.160s

This means that Phoenix V3 is up to 50% faster to compile than V2! This is all depending on the use case, of course. Other test cases show that V3 is either as fast to compile or a little faster!
Thanks! I double that! Thanks! Thomas
participants (13)
- Christophe Henry
- Cromwell Enage
- Daniel Walker
- Eric Niebler
- Eric Niebler
- Hartmut Kaiser
- Jeffrey Lee Hellrung, Jr.
- Joel de Guzman
- Mathias Gaunard
- Steven Watanabe
- Thomas Heller
- Thomas Klimpel
- Vicente BOTET