
Joel de Guzman <joel@boost-consulting.com> writes:
Consider an immediate fusion algorithm:
transform(s, f)
which returns another sequence immediately. Now consider a corresponding Phoenix lazy function:
phoenix::transform(_1, f)
phoenix::transform(_1, _2) would correspond more closely.
This version returns a curried (ahem.. partially applied) function that is lazily evaluated later, when you pass in a sequence, s.
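Here is a minimal sketch of that distinction in plain standard C++ (not the actual Fusion/Phoenix machinery): the immediate algorithm does its work at the call site, while the lazy one merely remembers f and waits for a sequence. The names immediate_transform and lazy_transform are made up for illustration.

    #include <algorithm>
    #include <utility>
    #include <vector>

    // Immediate: does the work at the call site and hands back a sequence.
    template <class Seq, class F>
    Seq immediate_transform(Seq s, F f)
    {
        std::transform(s.begin(), s.end(), s.begin(), f);
        return s;
    }

    // Lazy: merely remembers f and returns a partially applied function
    // object; nothing is computed until a sequence is supplied.
    template <class F>
    auto lazy_transform(F f)
    {
        return [f](auto s) { return immediate_transform(std::move(s), f); };
    }

    int main()
    {
        std::vector<int> v{1, 2, 3};

        auto now   = immediate_transform(v, [](int x) { return 2 * x; });
        auto later = lazy_transform([](int x) { return 2 * x; });
        auto then  = later(v);   // evaluation happens only here
        (void)now; (void)then;
    }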
Yet...
Well, it turns out that I am wrong. Such a scheme will suffer because, while the example above is as lazy as it can get, composition of algorithms such as:
transform(remove_if(_1, f1), f2)
will turn out to be suboptimal, because the immediate functions being forwarded to (the immediate versions of transform and remove_if) will both return intermediate structures, unlike current fusion, where no intermediates are ever created, only views.
Yep.
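The intermediate-structure problem can be pictured with a small sketch. Fusion's views sit over heterogeneous, compile-time sequences, but the same eager-versus-view contrast shows up with C++20 ranges over an ordinary runtime sequence, which is all this illustration uses; the function names eager and lazy_view are made up.

    #include <algorithm>
    #include <iterator>
    #include <ranges>
    #include <vector>

    // Eager composition: each algorithm materializes its result before the
    // next one runs, so an intermediate sequence is built and thrown away.
    std::vector<int> eager(std::vector<int> const& s)
    {
        std::vector<int> kept;
        std::ranges::copy_if(s, std::back_inserter(kept),
                             [](int x) { return x % 2 == 0; });

        std::vector<int> out;
        std::ranges::transform(kept, std::back_inserter(out),
                               [](int x) { return x * x; });
        return out;
    }

    // View composition: no intermediates; each element is filtered and
    // transformed only when the resulting view is traversed.  (The view
    // refers to s, so s must outlive it.)
    auto lazy_view(std::vector<int> const& s)
    {
        return s | std::views::filter([](int x) { return x % 2 == 0; })
                 | std::views::transform([](int x) { return x * x; });
    }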
Well, the use of segmented iterators and hierarchical algorithms (http://lafstern.org/matt/segmented.pdf) has some interesting implications. You can't just use iterator adaptors such as transform_iterator to implement laziness unless either:
1. You want to pay a big efficiency hit
or
2. The adaptors expose the same "extended properties" (e.g. segmented_iterator_traits) as the iterators they're adapting. That requires knowledge of all extended properties (a rough sketch of the forwarding this implies follows below).
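To make option 2 concrete, here is a rough sketch of the kind of forwarding it implies, assuming the segmented_iterator_traits interface described in segmented.pdf (is_segmented_iterator, segment_iterator, local_iterator, and segment/local/begin/end). The my_transform_iterator adaptor is hypothetical and stripped down to the members the traits need; the point is only that the adaptor's traits must re-expose the underlying iterator's segmentation and wrap its local iterators in the same adaptor.

    #include <type_traits>

    // Primary template, as in the paper: by default an iterator is not segmented.
    template <class Iterator>
    struct segmented_iterator_traits
    {
        typedef std::false_type is_segmented_iterator;
    };

    // Hypothetical transform adaptor, reduced to what the traits need; a real
    // adaptor would also provide the usual iterator operations.
    template <class Iterator, class F>
    struct my_transform_iterator
    {
        my_transform_iterator(Iterator it, F f) : it_(it), f_(f) {}
        Iterator base() const { return it_; }
        F functor() const { return f_; }
        Iterator it_;
        F f_;
    };

    // Option 2: forward the adapted iterator's segmentation, wrapping each
    // local iterator in the same adaptor so per-segment traversal still
    // applies the function.
    template <class Iterator, class F>
    struct segmented_iterator_traits< my_transform_iterator<Iterator, F> >
    {
        typedef segmented_iterator_traits<Iterator> base_traits;

        typedef typename base_traits::is_segmented_iterator is_segmented_iterator;
        typedef typename base_traits::segment_iterator       segment_iterator;
        typedef my_transform_iterator<
            typename base_traits::local_iterator, F>         local_iterator;

        static segment_iterator segment(my_transform_iterator<Iterator, F> i)
            { return base_traits::segment(i.base()); }
        static local_iterator local(my_transform_iterator<Iterator, F> i)
            { return local_iterator(base_traits::local(i.base()), i.functor()); }

        // The function object is not recoverable from a bare segment iterator,
        // so this sketch assumes F is default-constructible -- one symptom of
        // how awkward the forwarding gets.
        static local_iterator begin(segment_iterator s)
            { return local_iterator(base_traits::begin(s), F()); }
        static local_iterator end(segment_iterator s)
            { return local_iterator(base_traits::end(s), F()); }
    };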
It seems to me that in order to do the traversal optimally, more than laziness is required: you'd have to build expression templates and then somehow analyze the traversal properties of the leaf sequences in order to evaluate the expressions.
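For what that might look like, here is a tiny sketch (not from Fusion) of the expression-template idea: the algorithm calls only build a tree of nodes, and a separate evaluation step recurses to the leaf sequence and consults a traversal trait (a made-up boolean is_segmented here) before deciding how to walk it. All names are hypothetical.

    // Hypothetical expression nodes: calling transform/remove_if only records
    // the operation and its arguments; nothing is evaluated yet.
    template <class Expr, class F>
    struct transform_expr { Expr expr; F f; };

    template <class Expr, class F>
    struct remove_if_expr { Expr expr; F f; };

    template <class Expr, class F>
    transform_expr<Expr, F> transform(Expr e, F f) { return { e, f }; }

    template <class Expr, class F>
    remove_if_expr<Expr, F> remove_if(Expr e, F f) { return { e, f }; }

    // Made-up traversal trait: sequences would specialize this to advertise
    // segmentation, so evaluation can pick a hierarchical or flat walk.
    template <class Seq>
    struct traversal_traits { static const bool is_segmented = false; };

    // Evaluation recurses through the nodes to the leaf sequence and only
    // then chooses a traversal strategy; only the dispatch skeleton is shown.
    template <class Seq>
    void evaluate(Seq const& /*leaf*/)
    {
        if (traversal_traits<Seq>::is_segmented) { /* hierarchical traversal */ }
        else                                     { /* flat traversal */ }
    }

    template <class Expr, class F>
    void evaluate(transform_expr<Expr, F> const& e) { evaluate(e.expr); }

    template <class Expr, class F>
    void evaluate(remove_if_expr<Expr, F> const& e) { evaluate(e.expr); }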
After reading the doc "segmented.pdf", I see what you mean. I am not sure, but perhaps views that take advantage of the segmentation information will work as well as ETs because these views can exploit the traversal properties of the sequences they are acting on.
The real problem with views, if you care about optimal efficiency as we do in the MTL, is that in real life iterator adaptors will impose an abstraction penalty when applied to pointers, at least on most compilers :(

--
Dave Abrahams
Boost Consulting
www.boost-consulting.com