
On 04/09/10 10:48, Mathias Gaunard wrote:
> Larry Evans wrote:
>> What is there about the XPath specification that makes any type hierarchy for modelling it less suitable than using something akin to boost.variant?
>> You see, I'm wondering because using type hierarchies and virtual functions has been touted as a great advantage of OO programming; yet, it apparently lacks something which you need. I'd like to understand what that is.
> Some could argue that the point of a base class is moot if you have to downcast it to make anything useful with it,
But boost.variant has to do the equivalent of downcasting based on a discriminant (the value returned by which()). You could argue that variant does this automatically, but then it may throw an exception if, for example, the target of the assignment is the wrong type. Of course, you could check the discriminant before the assignment, but then how is that different from what has to be done with virtual functions (using dynamic_cast)? One could also argue that the variant library's apply_visitor does all this checking for you before dispatching the correct type to your actual visitor; however, apply_visitor is little different from the elements' virtual accept functions (using the term from the visitor pattern, http://en.wikipedia.org/wiki/Visitor_pattern ).
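
Here's a minimal sketch of the parallel I have in mind (Circle and Square are just made-up types for illustration, nothing to do with XPath):

#include <boost/variant.hpp>
#include <iostream>

struct Circle { double radius; };
struct Square { double side; };

typedef boost::variant<Circle, Square> Shape;

// The "actual visitor" that apply_visitor forwards to; it plays the
// same role a concrete visitor plays in the visitor pattern.
struct Area : boost::static_visitor<double>
{
    double operator()(Circle const& c) const { return 3.14159 * c.radius * c.radius; }
    double operator()(Square const& s) const { return s.side * s.side; }
};

int main()
{
    Circle c = { 2.0 };
    Shape shape = c;

    // Checking the discriminant before extracting -- morally the same
    // test one does on the result of a dynamic_cast:
    if (shape.which() == 0)
        std::cout << boost::get<Circle>(shape).radius << '\n';
    // (boost::get<Square>(shape) here would throw boost::bad_get,
    // much as an unchecked downcast would misbehave.)

    // Letting apply_visitor dispatch on the stored type -- much as the
    // elements' virtual accept functions would:
    std::cout << boost::apply_visitor(Area(), shape) << '\n';
    return 0;
}

To me the which()/get<> pair reads like a checked downcast, and apply_visitor reads like the dispatch that accept() would otherwise do.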
> and that algebraic data types (variant-like things) are a much more elegant solution when you need to visit the different cases.
I'm still not seeing it :( I thought algebraic data types were one thing OO programming did well. For example, a stack is an ADT and the STL has a stack. AFAICT, every component type in boost.variant is like an element role in the visitor pattern.

-confusedly yours,
Larry
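
P.S. For comparison, here is roughly the class-hierarchy form of the same dispatch as I picture it (again just a made-up Circle/Square example); each concrete element here plays the role that a bounded type plays in the variant above:

#include <iostream>

struct Circle;
struct Square;

// The visitor interface; Area below is one concrete visitor.
struct ShapeVisitor
{
    virtual ~ShapeVisitor() {}
    virtual void visit(Circle const&) = 0;
    virtual void visit(Square const&) = 0;
};

// The abstract "element" role, with the virtual accept I mentioned.
struct Shape
{
    virtual ~Shape() {}
    virtual void accept(ShapeVisitor&) const = 0;
};

struct Circle : Shape
{
    double radius;
    explicit Circle(double r) : radius(r) {}
    void accept(ShapeVisitor& v) const { v.visit(*this); }
};

struct Square : Shape
{
    double side;
    explicit Square(double s) : side(s) {}
    void accept(ShapeVisitor& v) const { v.visit(*this); }
};

struct Area : ShapeVisitor
{
    double result;
    Area() : result(0.0) {}
    void visit(Circle const& c) { result = 3.14159 * c.radius * c.radius; }
    void visit(Square const& s) { result = s.side * s.side; }
};

int main()
{
    Circle c(2.0);
    Shape const& shape = c;

    Area a;
    shape.accept(a);    // double dispatch through the virtual accept
    std::cout << a.result << '\n';
    return 0;
}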