
For my thesis work I wrote a DSL framework based on fusion-2. It fulfills a similar purpose to boost::xpressive's proto. Compared to proto, the framework has a rule system, allowing the user to define the language rules of a domain.

Is there any interest in a library like that?

Some additional information: All operators defined by the framework verify the correctness of expressions by querying the domain rules the expression parameters belong to, using EnableIf. Foreign types can be integrated into a domain by specifying adaptation rules.

The framework separates operators (or DSL functions) from language rules. This layer of indirection allows a consistent definition of the language rules.

Expressions have a varying set of attributes stored in a fusion-2 map; e.g. the expression's domain is identified by an attribute inside that map. The set of attributes is controlled by the domain rules. Apart from returning the validity of expressions, rules also have to provide a result type. Hence the rule code tends to grow complex with an increasing number of attributes.

The library lacks proper documentation, and requires some interface changes to simplify attribute update operations. Before working on that, I wanted to see if there is interest at all.

I used the framework to implement a matrix library with a single frontend and multiple backends, which could be picked and combined on a per-expression basis.

Regards
Andreas Pokorny
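For illustration, a minimal sketch of what this EnableIf-based checking looks like from the user's side, using the rule/defined conventions Andreas explains later in this thread; the vec3 types, the domain tag and the encoding type here are hypothetical, not taken from the actual library:

    // hypothetical domain tag and a single rule: addition is valid
    // only between two vectors of the same type
    struct small_vector_domain_tag {};

    template<typename EnableIfT>
    struct rule<small_vector_domain_tag, add_tag, vec3, vec3, EnableIfT>
      : defined, mpl::true_   // mpl::true_: the expression is valid
    {
        typedef vec3_sum_expression result_type; // hypothetical encoding type
        static result_type init(vec3 const& l, vec3 const& r)
        {
            return result_type(l, r);
        }
    };

    // With no matching rule, the framework's operator+ drops out of
    // overload resolution via EnableIf, so e.g. vec3 + mat3 simply
    // fails to compile inside this domain.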

Andreas Pokorny wrote:
For my thesis work I wrote a DSL framework based on fusion-2. It fulfills a similar purpose to boost::xpressive's proto. Compared to proto, the framework has a rule system, allowing the user to define the language rules of a domain.
Is there any interest in a library like that?
Some additional information: All operators defined by the framework verify the correctness of expressions by querying the domain rules the expression parameters belong to, using EnableIf. Foreign types can be integrated into a domain by specifying adaptation rules.
The framework separates operators (or DSL functions) from language rules. This layer of indirection allows a consistent definition of the language rules.
Expressions have a varying set of attributes stored in a fusion-2 map; e.g. the expression's domain is identified by an attribute inside that map. The set of attributes is controlled by the domain rules. Apart from returning the validity of expressions, rules also have to provide a result type. Hence the rule code tends to grow complex with an increasing number of attributes.
The library lacks proper documentation, and requires some interface changes to simplify attribute update operations. Before working on that, I wanted to see if there is interest at all.
I used the framework to implement a matrix library with a single frontend and multiple backends, which could be picked and combined on a per-expression basis.
I'm interested. I'd like to see some examples, especially on how the "rule system" works. I am also interested to know how a domain can be adapted for use by 2 or more libraries, xpressive and spirit being the main examples. IOTW, in the future, spirit and xpressive parts can be shared.

To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review.

At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?

Cheers,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

Joel de Guzman wrote:
Andreas Pokorny wrote:
For my thesis work I wrote a DSL framework based on fusion-2. It fulfills a similar purpose to boost::xpressive's proto. Compared to proto, the framework has a rule system, allowing the user to define the language rules of a domain.
Sounds interesting. Proto would be better with some way of constraining the legal expressions within a domain.
Is there any interest in a library like that?
Yes.
Some additional information: All operators defined by the framework verify the correctness of expressions by querying the domain rules the expression parameters belong to, using EnableIf.
This sounds good. I wonder what the impact is on compile time.
Foreign types can be integrated into a domain by specifying adaptation rules.
The framework separates operators (or DSL functions) from language rules. This layer of indirection allows a consistent definition of the language rules.
I'd like an example.
Expressions have a varying set of attributes stored in a fusion-2 map; e.g. the expression's domain is identified by an attribute inside that map. The set of attributes is controlled by the domain rules. Apart from returning the validity of expressions, rules also have to provide a result type. Hence the rule code tends to grow complex with an increasing number of attributes.
More examples, please!
The library lacks proper documentation, and requires some interface changes to simplify attribute update operations. Before working on that, I wanted to see if there is interest at all.
I used the framework to implement a matrix library with a single frontend and multiple backends, which could be picked and combined on a per-expression basis.
I'm interested. I'd like to see some examples, especially on how the "rule system" works. I am also interested to know how a domain can be adapted for use by 2 or more libraries, xpressive and spirit being the main examples. IOTW, in the future, spirit and xpressive parts can be shared.
Joel, Hartmut Kaiser and I have spent a lot of time working out the details of an expression template system that would work for xpressive, Spirit-2 and Hartmut's Karma library. The discussions were public on spirit-devel. Search the archives for "Spirit-2 and Subversion", "Scanner Busines, R.I.P.", "Proto Compiler Visitors", "Fusion-ifying proto parse trees", and "segmented fusion - a-ha!"

In particular, this is the message where the strategy to unify all the ET libraries really crystallized:

http://permalink.gmane.org/gmane.comp.parsers.spirit.devel/2620

It would be great if you could share your thoughts on how you would approach these problems with your new ET library.
To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review. At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?
I too would be interested in hearing more about your ideas and helping to build a unified ET framework for boost.

--
Eric Niebler
Boost Consulting
www.boost-consulting.com

On 08/06/2006 11:51 PM, Eric Niebler wrote:
In particular, this is the message where the strategy to unify all the ET libraries really crystallized:
http://permalink.gmane.org/gmane.comp.parsers.spirit.devel/2620
It would be great if you could share your thoughts on how you would approach these problems with your new ET library.
I'd like to see how empty, first, and follow attributes for a grammar can be calculated. See:

LewiLL1_code_gen.html

in http://boost-consulting.com/vault/Strings - Text Processing for a definition, and an implementation in:

http://boost.cvs.sourceforge.net/boost-sandbox/boost-sandbox/libs/grammar_pi...

However, that implementation is a run-time calculation; maybe proto would be able to do a compile-time calculation.

On 08/07/2006 10:54 AM, Larry Evans wrote:
On 08/06/2006 11:51 PM, Eric Niebler wrote: [snip]
I'd like to see how empty, first, and follow attributes for a grammar can be calculated. See:
LewiLL1_code_gen.html
in http://boost-consulting.com/vault/Strings - Text Processing for a definition
OOPS. That's just a scheme for generating a parser once the first/follow sets are calculated (and used to define the direction() methods for grammar terms). If anyone's interested, I could check out the book once more and reproduce the scheme for the calculation.
and an implementation in
http://boost.cvs.sourceforge.net/boost-sandbox/boost-sandbox/libs/grammar_pi...
However, I believe the above code (or that in: boost/grammar_pipeline/eff/productions.hpp) directly reflects the scheme in the book.
However, that implementation is a run-time calculation; maybe proto would be able to do a compile-time calculation.

On Sun, Aug 06, 2006 at 09:51:27PM -0700, Eric Niebler <eric@boost-consulting.com> wrote:
The library lacks proper documentation, and requires some interface changes to simplify attribute update operations. Before working on that, I wanted to see if there is interest at all.
I used the framework to implement a matrix library with a single frontend and multiple backends, which could be picked and combined on a per-expression basis.
I'm interested. I'd like to see some examples, especially on how the "rule system" works. I am also interested to know how a domain can be adapted for use by 2 or more libraries, xpressive and spirit being the main examples. IOTW, in the future, spirit and xpressive parts can be shared.
Joel, Hartmut Kaiser and I have spent a lot of time working out the details of an expression template system that would work for xpressive, Spirit-2 and Hartmut's Karma library. The discussions were public on spirit-devel. Search the archives for "Spirit-2 and Subversion", "Scanner Busines, R.I.P.", "Proto Compiler Visitors", "Fusion-ifying proto parse trees", and "segmented fusion - a-ha!"
In particular, this is the message where the strategy to unify all the ET libraries really crystallized:
http://permalink.gmane.org/gmane.comp.parsers.spirit.devel/2620
It would be great if you could share your thoughts on how you would approach these problems with your new ET library.
I have not yet managed to read all the emails, but from what I read, you seem to plan to use segmented sequences to encode the complete expression tree. The framework instead stores the expression tree mainly in a fusion map. But I think it makes sense to support both encoding variants. With segmented sequences, attributes will not work, and are probably not required.
To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review. At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?
I too would be interested in hearing more about your ideas and helping to build a unified ET framework for boost.
Unified ... that's the reason why I intended to call the library Boost.Unification :)

Regards
Andreas Pokorny

Andreas Pokorny wrote:
Joel, Hartmut Kaiser and I have spent a lot of time working out the details of an expression template system that would work for xpressive, Spirit-2 and Hartmut's Karma library. The discussions were public on spirit-devel. Search the archives for "Spirit-2 and Subversion", "Scanner Busines, R.I.P.", "Proto Compiler Visitors", "Fusion-ifying proto parse trees", and "segmented fusion - a-ha!"
In particular, this is the message where the strategy to unify all the ET libraries really crystallized:
http://permalink.gmane.org/gmane.comp.parsers.spirit.devel/2620
It would be great if you could share your thoughts on how you would approach these problems with your new ET library.
I have not yet managed to read all the emails, but from what I read, you seem to plan to use segmented sequences to encode the complete expression tree. The framework instead stores the expression tree mainly in a fusion map. But I think it makes sense to support both encoding variants. With segmented sequences, attributes will not work, and are probably not required.
I guess I'm still not clear about "attributes". What are those, really? And why would attributes not work with segmented sequences, or not be required?
To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review. At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?
I too would be interested in hearing more about your ideas and helping to build a unified ET framework for boost.
unified .. thats the reason why I intended to call the library Boost.Unification :)
Not sure I like the name ;)

Regards,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

On Fri, Aug 11, 2006 at 09:48:08AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
Andreas Pokorny wrote:
Joel, Hartmut Kaiser and I have spent a lot of time working out the details of an expression template system that would work for xpressive, Spirit-2 and Hartmut's Karma library. The discussions were public on spirit-devel. Search the archives for "Spirit-2 and Subversion", "Scanner Busines, R.I.P.", "Proto Compiler Visitors", "Fusion-ifying proto parse trees", and "segmented fusion - a-ha!"
In particular, this is the message where the strategy to unify all the ET libraries really crystallized:
http://permalink.gmane.org/gmane.comp.parsers.spirit.devel/2620
It would be great if you could share your thoughts on how you would approach these problems with your new ET library.
I have not yet managed to read all the emails, but from what I read, you seem to plan to use segmented sequences to encode the complete expression tree. The framework instead stores the expression tree mainly in a fusion map. But I think it makes sense to support both encoding variants. With segmented sequences, attributes will not work, and are probably not required.
I guess I'm still not clear about "attributes". What are those, really? And why would attributes not work with segmented sequences, or not be required?
An attribute is compile-time, or compile-time and runtime, information stored for an expression tree, and identified by a key type. E.g. a matrix data structure could provide information about itself inside of attributes:

    template<typename DerivedType, typename T, typename DimensionType, typename StorageType>
    struct basic_matrix
      : public basic_expression<
            fusion::map<
                // Attributes and their meaning:
                ct_pair<operator_tag, value_tag>               // node is a leaf in the tree
              , ct_pair<domain_tag, linear_algebra_domain_tag> // domain of the node
              , fusion::pair<dimension_tag, DimensionType>     // dimension type of the tree;
                                                               // could be runtime dynamic dimensions,
                                                               // or compile time constants
              , ct_pair<category_tag, matrix_tag>              // node is a matrix
              , ct_pair<value_type_tag, T>                     // the element type of the values
                                                               // returned by this node
            >
        >
    {};

This information is stored inside a container and wrapped inside key-value pairs, to be able to write generic rules that are not coupled to the attributes they process. So a rule can simply take all attributes of either operand and combine them where possible, or update their values/types if required, and store the results inside the expression encoding data structure returned by the rule.

To define an attribute one needs a key type, and some metafunctions which get called by the rule; e.g. if an attribute occurs on both operands of a node, a combine metafunction is called.

My statement about attributes and segmented sequences is probably wrong. Attributes just need a fusion::map and rules that invoke the automated attribute processing.

Regards
Andreas Pokorny
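For illustration, a minimal sketch of what such a combine metafunction could look like; the hook name combine_attribute and its signature are assumptions, since the thread does not spell out the actual interface:

    // hypothetical framework hook, specialized per attribute key
    template<typename Key, typename Left, typename Right>
    struct combine_attribute;

    // For the dimension attribute: elementwise operations require equal
    // dimensions, so the combined attribute is simply the left operand's
    // dimension value, carried over to the result node.
    template<typename Left, typename Right>
    struct combine_attribute<dimension_tag, Left, Right>
    {
        typedef Left type; // both operands carry the same dimension type
        static type apply(Left const& left, Right const&) { return left; }
    };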

Andreas Pokorny wrote:
I guess I'm still not clear about "attributes". What are those, really? And why would attributes not work with segmented sequences, or not be required?
An attribute is compile-time, or compile-time and runtime, information stored for an expression tree, and identified by a key type.
[snip example]
This information is stored inside a container and wrapped inside key-value pairs, to be able to write generic rules that are not coupled to the attributes they process. So a rule can simply take all attributes of either operand and combine them where possible, or update their values/types if required, and store the results inside the expression encoding data structure returned by the rule.
To define an attribute one needs a key type, and some metafunctions which get called by the rule; e.g. if an attribute occurs on both operands of a node, a combine metafunction is called.
My statement about attributes and segmented sequences is probably wrong. Attributes just need a fusion::map and rules that invoke the automated attribute processing.
Ah! Pretty cool!

Regards,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

On Mon, Aug 07, 2006 at 07:47:43AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review. At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?
The whole thing was derived from proto: I changed some type names, added a rule system, then the domain_tag used for rule set dispatching, and finally the fusion-2 map to support attributes. With the attributes a DSL can be defined as an S-attributed grammar.

There is yet another feature I haven't mentioned yet: it is also possible to extend existing grammars, in a way that all existing rules are reused if no rule of the derived domain matches. This might help you define the differences between xpressive, spirit and karma.

I think it is time to explain how the system works. Some helper structures required to explain the system:

    struct defined {};   //< used to mark a rule as 'defined', required for
                         //  domain extension/reuse
    struct undefined {}; //< the opposite; rules marked as undefined do not apply
                         //  in a domain. The default implementations of rule
                         //  metafunctions are marked as undefined.

    template<typename DomainTag>
    struct extends
    {
        typedef mpl::void_ type;
    };
    // Usage:
    // A extends B  ==>  template<> struct extends<A> { typedef B type; };

Rules:
------
Rules are boolean metafunctions with an additional return type called 'result_type' and a static init member function which returns a value of result_type. The default rule:

    template<
        typename DomainTag, typename OperatorTag, typename LeftType,
        typename RightType = void, typename EnableIfT = void
    >
    struct rule : undefined, mpl::false_
    {};

The first parameter identifies the domain the rule will be used for. The OperatorTag is, just like in proto, a type tag to identify operators or functions. LeftType and RightType just hold the operand types of the expression. RightType stays void if OperatorTag is a unary operator. As you can see, the system would also benefit from variadic templates. I considered having rule_1 and rule_2 for unary and binary operators/functions, and rule_3 for functions with three parameters; the rule template above only works for unary and binary functions/operators. This is one of the interface changes I planned for the near future.

Domains:
--------
To define a domain, a domain_tag has to be created:

    struct linear_algebra_domain_tag {};

In linear algebra there are several elementwise operations like add and subtract that only work on objects of equal dimensions. To capture all these operations in a single rule, and with the help of traits, we have to write this rule specialisation:

    template<typename OperatorTag, typename LeftType, typename RightType>
    struct rule<
        linear_algebra_domain_tag, OperatorTag, LeftType, RightType,
        typename enable_if<
            mpl::and_<
                math_lib::requires_equal_dimension<OperatorTag>
              , math_lib::have_equal_dimension<LeftType, RightType>
            >
        >::type
    > : defined, mpl::true_
    {
        typedef /* omitted */ result_type;
        static result_type init(LeftType const& left, RightType const& right)
        {
            return ....;
        }
    };

Note that we now derive from defined, to tell the rule system that this rule has to be applied if the template parameter substitution succeeded. The mpl::true_ is the boolean response of the metafunction, telling the rule system that the expression is valid.

A domain is defined by a set of rules. Attributes are somewhat outside of domains, but can and should be used for validity checks in rules. But there is no coupling between a certain domain or a set of rules and the attributes. The code of have_equal_dimension accesses the dimension attribute of the two operand types.
Operators and rule invocation:
------------------------------
The framework already provides overloads for all operators, and provides a macro to define additional functions which also invoke the rule system. This is a simple definition of operator+:

    template<typename LeftT, typename RightT>
    inline typename lazy_enable_if<
        mpl::and_<
            is_expression<LeftT>
          , is_expression<RightT>
          , is_binary_expression_allowed<add_tag, LeftT, RightT>
        >
      , get_result_type<add_tag, LeftT, RightT>
    >::type
    operator +(LeftT const& left, RightT const& right)
    {
        return get_rule<add_tag, LeftT, RightT>::type::init(left, right);
    }

With the adaptation of foreign types, the operator overload becomes a little more complex. is_binary_expression_allowed calls get_rule<..>::type, which does all the work. In get_rule<Op,L,R> the domain_tag attributes of the left and right operand type are accessed. If the domain_tag types differ, a user-specializable metafunction is called to resolve the conflict. With the final domain_tag a metafunction get_rule_impl is invoked:

    template<typename DomainTag, typename Op, typename L, typename R>
    struct get_rule_impl
      : lazy_if<
            mpl::or_<
                is_defined<rule<DomainTag, Op, L, R> > // true if the instantiated
                                                       // rule derives from defined
              , is_same<typename extends<DomainTag>::type, mpl::void_>
            >
            // then:
          , lazy<rule<DomainTag, Op, L, R> >
            // else:
          , get_rule_impl<typename extends<DomainTag>::type, Op, L, R>
        >
    {};

This code recursively walks through all rules until a 'defined' rule, or no further extended domain, is found. I hope this sheds enough light onto the rule system. There are two further topics to cover: the expression tree encoding and the attributes.

Expression Encoding:
--------------------

    template<typename AttributeMapType>
    struct basic_expression
      : expression_node // expression_node is used inside of is_expression<T> from above
    {
        typedef AttributeMapType attribute_map_type;
        typedef AttributeMapType tuple_type;

        tuple_type attributes;

        basic_expression(tuple_type const& t)
          : attributes(t)
        {}
    };

    template<typename L, typename R, typename AttributeMapType>
    struct binary_expression
      : basic_expression<AttributeMapType>
    {
        typedef typename call_traits<L>::value_type left_operand_type;
        typedef typename call_traits<R>::value_type right_operand_type;
        typedef AttributeMapType tuple_type;

        left_operand_type left_operand;
        right_operand_type right_operand;

        binary_expression(L const& lop, R const& rop, tuple_type const& t)
          : basic_expression<AttributeMapType>(t)
          , left_operand(lop)
          , right_operand(rop)
        {}
    };

    // similar code for unary_expression ...

Right now three structure templates are available to encode nodes in the expression tree. From my current point of view the first one should be sufficient, since all other information can be stored in the attribute map. The information stored in the attributes of the operands can be updated and forwarded to the expression encoding type by calling user-defined update/combine metafunctions.

I think that's enough for this email at least. I hope I find more time to write tomorrow.

Best Regards
Andreas Pokorny

Andreas Pokorny wrote:
On Mon, Aug 07, 2006 at 07:47:43AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review. At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?
The whole thing was derived from proto: I changed some type names, added a rule system, then the domain_tag used for rule set dispatching, and finally the fusion-2 map to support attributes. With the attributes a DSL can be defined as an S-attributed grammar.
Ok... [snip explanations]
I think that's enough for this email at least. I hope I find more time to write tomorrow.
Not sure if I follow. I'm an examples type of guy. It would be enlightening if you could provide a simple example. Say, we want to write a DSEL which has primitives a_, b_ and c_ (objects) of types A, B and C, and operations >> and |. How shall I develop the ET using your library? Example usage:

a_ >> b_ | c_

Regards,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

On Fri, Aug 11, 2006 at 10:02:18AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
Andreas Pokorny wrote:
On Mon, Aug 07, 2006 at 07:47:43AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
To be honest, right now, for Spirit-2, I am inclined to use Eric's proto. It's been there for quite some time now and is quite mature. But, like your library, there are no docs yet (as far as I know). I'm not sure when Eric will have the time to get it outside xpressive as a standalone ET library, for review. At any rate, it would be interesting if you could form a collaboration with him. So, instead of having 2 separate ET entries, we can just have one that has the best of both. Is that possible at all? Or is your library too different from Proto?
The whole thing was derived from proto: I changed some type names, added a rule system, then the domain_tag used for rule set dispatching, and finally the fusion-2 map to support attributes. With the attributes a DSL can be defined as an S-attributed grammar.
Ok...
[snip explanations]
I think that's enough for this email at least. I hope I find more time to write tomorrow.
Not sure if I follow. I'm an examples type of guy. It would be enlightening if you could provide a simple example. Say, we want to write a DSEL which has primitives a_, b_ and c_ (objects) of types A, B and C, and operations >> and |. How shall I develop the ET using your library?
Example usage:
a_ >> b_ | c_
Given that a_ >> b_ invokes an operator of the library (e.g. by deriving at least A from a library type), and given that the domain_tag can be evaluated from the participating data structures, these rules have to be written:

    // domain:
    struct example_domain {};

    // rules:
    template<typename EnableIfT>
    struct rule<example_domain, right_shift_tag, A, B, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_EXPRESSION result_type;
        static result_type init(A const& a, B const& b)
        {
            return result_type(...);
        }
    };

    template<typename EnableIfT>
    struct rule<example_domain, bitor_tag, TYPE_WHICH_ENCODES_THE_EXPRESSION, C, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_WHOLE_TREE result_type;
        static result_type init(TYPE_WHICH_ENCODES_THE_EXPRESSION const& l, C const& c)
        {
            return result_type(...);
        }
    };

This weekend I plan to work on the code, clean things up a bit, and try to reintegrate everything into proto again.

Best Regards,
Andreas Pokorny
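For illustration, the primitives themselves could be defined roughly like this, using the basic_expression and attribute-map encoding shown earlier in the thread; the exact attribute set for a leaf is an assumption:

    // leaf attribute map: a value node belonging to the example domain
    typedef fusion::map<
        ct_pair<operator_tag, value_tag>
      , ct_pair<domain_tag, example_domain>
    > primitive_attributes;

    struct A : basic_expression<primitive_attributes>
    {
        A() : basic_expression<primitive_attributes>(primitive_attributes()) {}
    };
    // B and C are defined the same way

    A a_; B b_; C c_;
    // a_ >> b_ | c_ now first matches the right_shift_tag rule, and the
    // resulting TYPE_WHICH_ENCODES_THE_EXPRESSION matches the bitor_tag rule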

Andreas Pokorny wrote:
Not sure if I follow. I'm an examples type of guy. It would be enlightening if you could provide a simple example. Say, we want to write a DSEL which has primitives a_, b_ and c_ (objects) of types A, B and C, and operations >> and |. How shall I develop the ET using your library?
Example usage:
a_ >> b_ | c_
Given that a_ >> b_ invokes an operator of the library (e.g. by deriving at least A from a library type), and given that the domain_tag can be evaluated from the participating data structures, these rules have to be written:
    // domain:
    struct example_domain {};

    // rules:
    template<typename EnableIfT>
    struct rule<example_domain, right_shift_tag, A, B, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_EXPRESSION result_type;
        static result_type init(A const& a, B const& b)
        {
            return result_type(...);
        }
    };

    template<typename EnableIfT>
    struct rule<example_domain, bitor_tag, TYPE_WHICH_ENCODES_THE_EXPRESSION, C, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_WHOLE_TREE result_type;
        static result_type init(TYPE_WHICH_ENCODES_THE_EXPRESSION const& l, C const& c)
        {
            return result_type(...);
        }
    };
Do we have to write rules for all permutations of A, B and C? What about literals and built-in types? Example:

a_ >> b_ >> "hi" | c_ >> 12345
This weekend I plan to work on the code, clean things up a bit, and try to reintegrate everthing into proto again.
Sounds good. Thanks!

Cheers,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

On Sat, Aug 12, 2006 at 07:02:07AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
Do we have to write rules for all permutations of A, B and C? What about literals and built-in types? Example:
a_ >> b_ >> "hi" | c_ >> 12345
Of course not, this was just for the example. Usually one would separate the different types into categories or concepts, and define rules on top of them. For example, in a matrix-vector library one might define the categories matrix, column vector, row vector and scalar.

Built-in types can be handled by the type adaptation rules:

    template<
        typename DomainTag, typename OperatorTag,
        typename ForeignType, typename EnableIfT = void
    >
    struct adapt_type : undefined, mpl::false_
    {};

There are also two similar metafunctions called adapt_left and adapt_right, specializable by the user, which call adapt_type in the default implementation. adapt_* works exactly like rule; the result_type of adapt_* is then used for a rule invocation.

Best Regards
Andreas Pokorny
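For illustration, adapting a built-in literal into the example domain might look roughly like this, given that adapt_* works exactly like rule; the wrapper type int_literal is hypothetical:

    // sketch: integrate plain int into the example domain
    template<typename OperatorTag, typename EnableIfT>
    struct adapt_type<example_domain, OperatorTag, int, EnableIfT>
      : defined, mpl::true_
    {
        typedef int_literal result_type; // hypothetical leaf type wrapping the int
        static result_type init(int i) { return result_type(i); }
    };

    // c_ >> 12345 then adapts 12345 via adapt_right/adapt_type first,
    // and the ordinary rule lookup proceeds with int_literal as operand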

Andreas Pokorny wrote:
On Sat, Aug 12, 2006 at 07:02:07AM +0800, Joel de Guzman <joel@boost-consulting.com> wrote:
Do we have to write rules for all permutations of A, B and C? What about literals and built-in types? Example:
a_ >> b_ >> "hi" | c_ >> 12345
Of course not, this was just for the example. Usually one would separate the different types into categories or concepts, and define rules on top of them. For example, in a matrix-vector library one might define the categories matrix, column vector, row vector and scalar.
Sorry, that was a dumb question. That's what categories are for, I know. So, anyway, it would really help if you could provide a short close-to-real-world tutorial.

Regards,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

"Andreas Pokorny" <andreas.pokorny@gmx.de> wrote in message <...>
    // rules:
    template<typename EnableIfT>
    struct rule<example_domain, right_shift_tag, A, B, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_EXPRESSION result_type;
        static result_type init(A const& a, B const& b)
        {
            return result_type(...);
        }
    };

    template<typename EnableIfT>
    struct rule<example_domain, bitor_tag, TYPE_WHICH_ENCODES_THE_EXPRESSION, C, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_WHOLE_TREE result_type;
        static result_type init(TYPE_WHICH_ENCODES_THE_EXPRESSION const& l, C const& c)
        {
            return result_type(...);
        }
    };
I don't understand what these TYPE_WHICH_ENCODES_THE macros are, but can't you use Boost.Typeof for this?

regards
Andy Little

Andy Little wrote:
"Andreas Pokorny" <andreas.pokorny@gmx.de> wrote in message <...>
    // rules:
    template<typename EnableIfT>
    struct rule<example_domain, right_shift_tag, A, B, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_EXPRESSION result_type;
        static result_type init(A const& a, B const& b)
        {
            return result_type(...);
        }
    };

    template<typename EnableIfT>
    struct rule<example_domain, bitor_tag, TYPE_WHICH_ENCODES_THE_EXPRESSION, C, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_WHOLE_TREE result_type;
        static result_type init(TYPE_WHICH_ENCODES_THE_EXPRESSION const& l, C const& c)
        {
            return result_type(...);
        }
    };
I don't understand what these TYPE_WHICH_ENCODES_THE macros are, but can't you use Boost.Typeof for this?
IIUC, the "rule" specifies the result type (it does not deduce it). I think it's not a macro at all; it would rather be something like "unspecified_type", or "you-define-your-type-here-and-ill-do-the-rest".

Regards,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

"Joel de Guzman" wrote
Andy Little wrote:
"Andreas Pokorny" wrote
    template<typename EnableIfT>
    struct rule<example_domain, bitor_tag, TYPE_WHICH_ENCODES_THE_EXPRESSION, C, EnableIfT>
      : defined, mpl::true_
    {
        typedef TYPE_WHICH_ENCODES_THE_WHOLE_TREE result_type;
        static result_type init(TYPE_WHICH_ENCODES_THE_EXPRESSION const& l, C const& c)
        {
            return result_type(...);
        }
    };
I don't understand what these TYPE_WHICH_ENCODES_THE macros are, but can't you use Boost.Typeof for this?
IIUC, the "rule" specifies the result type (it does not deduce it). I think it's not a macro at all; it would rather be something like "unspecified_type", or "you-define-your-type-here-and-ill-do-the-rest".
Well, again I am probably not understanding it fully, but from one of Andreas' posts higher in the thread:

get_result_type<add_tag, LeftT, RightT>

presumably could be implemented as:

    template<typename LeftT, typename RightT>
    struct get_result_type<add_tag, LeftT, RightT>
    {
        typedef BOOST_TYPEOF_TPL(LeftT() + RightT()) type;
    };

Anyway, apologies if I have missed the point.

regards
Andy Little

Hi, "Andy Little" wrote
"Joel de Guzman" wrote
IIUC, the "rule" specifies the result type (it does not deduce it). I think it's not a macro at all; it would rather be something like "unspecified_type", or "you-define-your-type-here-and-ill-do-the-rest".
Well, again I am probably not understanding it fully, but from one of Andreas' posts higher in the thread:
get_result_type<add_tag,LeftT,RightT>
presumably could be implemented as:
    template<typename LeftT, typename RightT>
    struct get_result_type<add_tag, LeftT, RightT>
    {
        typedef BOOST_TYPEOF_TPL(LeftT() + RightT()) type;
    };
That would be a chicken-and-egg situation. The result_type of the operator is defined by the rules, not the other way around. Like Joel said, the encoding of the expression tree into operator and function return types is up to the user.

The frontend of the DSLs built with this framework consists of overloaded operators (already provided by the framework) and small user-defined inline functions (probably created using a framework macro). These functions invoke the rule system with the operands and an operator tag which identifies the function or operator. The result type is deduced by finding a matching rule; the result (value) is returned after calling a static method inside the rule found.

So the user might even refuse to encode the expression inside a rule, but evaluate the expression directly and return the result of the evaluation.

Regards
Andreas Pokorny
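For illustration, a rule that evaluates directly rather than building a tree might look roughly like this; the mat2 type and the domain tag are hypothetical:

    // sketch: eager evaluation inside a rule, no expression tree at all
    template<typename EnableIfT>
    struct rule<small_matrix_domain_tag, add_tag, mat2, mat2, EnableIfT>
      : defined, mpl::true_
    {
        typedef mat2 result_type; // a plain value, not an expression node
        static result_type init(mat2 const& l, mat2 const& r)
        {
            // compute the sum right away and return it
            return mat2(l.a + r.a, l.b + r.b, l.c + r.c, l.d + r.d);
        }
    };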

Andreas Pokorny wrote:
Hi,
"Andy Little" wrote
"Joel de Guzman" wrote
IIUC, the "rule" specifies the result type (it does not deduce it). I think it's not a macro at all; it would rather be something like "unspecified_type", or "you-define-your-type-here-and-ill-do-the-rest".
Well, again I am probably not understanding it fully, but from one of Andreas' posts higher in the thread:

get_result_type<add_tag, LeftT, RightT>

presumably could be implemented as:

    template<typename LeftT, typename RightT>
    struct get_result_type<add_tag, LeftT, RightT>
    {
        typedef BOOST_TYPEOF_TPL(LeftT() + RightT()) type;
    };
That would be a chicken-and-egg situation. The result_type of the operator is defined by the rules, not the other way around. Like Joel said, the encoding of the expression tree into operator and function return types is up to the user.
The frontend of the DSLs built with this framework consists of overloaded operators (already provided by the framework) and small user-defined inline functions (probably created using a framework macro). These functions invoke the rule system with the operands and an operator tag which identifies the function or operator. The result type is deduced by finding a matching rule; the result (value) is returned after calling a static method inside the rule found.
So the user might even refuse to encode the expression inside a rule, but evaluate the expression directly and return the result of the evaluation.
Right. But then again, the ET framework will always benefit from typeof/auto:

auto csv = int_ >> *(',' >> int_);

or, in the interim:

BOOST_AUTO(csv, int_ >> *(',' >> int_));

Regards,
--
Joel de Guzman
http://www.boost-consulting.com
http://spirit.sf.net

"Andreas Pokorny" wrote
Hi,
"Andy Little" wrote
"Joel de Guzman" wrote
IIUC, the "rule" specifies the result type (it does not deduce it). I think it's not a macro at all; it would rather be something like "unspecified_type", or "you-define-your-type-here-and-ill-do-the-rest".
Well, again I am probably not understanding it fully, but from one of Andreas' posts higher in the thread:
get_result_type<add_tag,LeftT,RightT>
presumably could be implemented as:
    template<typename LeftT, typename RightT>
    struct get_result_type<add_tag, LeftT, RightT>
    {
        typedef BOOST_TYPEOF_TPL(LeftT() + RightT()) type;
    };
That would be a chicken-and-egg situation. The result_type of the operator is defined by the rules, not the other way around. Like Joel said, the encoding of the expression tree into operator and function return types is up to the user.
So the user must create special types solely for the purposes of the framework? Can't I use the ET engine on (say) 2 ints:

    int a, b;
    ETexpression_<
        ETexpression_<int>
      , plus_tag
      , ETexpression_<int>
    >::result_type result = ETexpression(a) + ETexpression(b);

regards
Andy Little

If I understand correctly, one can use your proposed DSL framework to implement things like dimensional analysis for physical quantities, and for the much simpler case of money currencies, as well as for any other domain specifics? Am I right? If so, I'm very interested in such a framework. I needed it yesterday ;-)

Best,
Oleg Abrosimov

On Tue, Aug 15, 2006 at 10:41:38PM +0700, Oleg Abrosimov <beholder@gorodok.net> wrote:
If I understand correctly, one can use your proposed DSL framework to implement things like dimensional analysis for physical quantities, and for the much simpler case of money currencies, as well as for any other domain specifics? Am I right? If so, I'm very interested in such a framework. I needed it yesterday ;-)
Yes, you are right. That was one of my intentions when adding the attributes to the framework. I am sorry that you won't see the framework tomorrow; it will take a few weeks until I have it in a distributable state, but I will work on it.

Best Regards,
Andreas Pokorny
participants (7)
- Andreas Pokorny
- Andreas Pokorny
- Andy Little
- Eric Niebler
- Joel de Guzman
- Larry Evans
- Oleg Abrosimov