
On Tue, Feb 19, 2013 at 4:17 PM, Nathan Ridge <zeratul976@hotmail.com> wrote: ...
This was discussed upthread. To summarize, the standard library makes the design choice of not providing certain operations if they cannot be implemented with a certain asymptotic complexity (for example, subtraction of two iterators if it cannot be implemented in constant time), and Boost is trying to be consistent with this design choice.
I agree that consistency of the library design is important, but this case is not completely trivial. IMO, this design principle is largely theoretical; in practice it is often unhelpful and counterproductive. Because of it, the STL and C++ miss a big selling point. An end user (who pays for and uses a software product) is interested in the best total running time of an algorithm, which is determined not only by the costs of specific operations, but also by their frequencies. A relatively small number of inefficient operations does not affect the total computational cost of an algorithm.

This design principle also rules out advanced and/or augmented data structures that are beneficial for many complex algorithms. For example, a sequence container offering single-element insert() and erase() with O(log N) cost cannot conform to the C++ standard's requirements, whereas the significantly less efficient O(N) operations of std::vector are allowed. I think the current standard specifications for the computational complexities of operations are to some degree obsolete. They create unnecessary restrictions that will be either amended or removed completely in future versions of the STL.

Regards,
Vadim Stadnik