
On Tue, Feb 19, 2013 at 2:04 AM, Vadim Stadnik <vadimstdk@gmail.com> wrote:
> On Tue, Feb 19, 2013 at 4:17 PM, Nathan Ridge <zeratul976@hotmail.com> wrote:
>> ...
>> This was discussed upthread. To summarize, the standard library makes the design choice of not providing certain operations if they cannot be implemented with a certain asymptotic complexity (for example, subtraction of two iterators if it cannot be implemented in constant time), and Boost is trying to be consistent with this design choice.
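(To make Nathan's example concrete, here is a minimal sketch of the distinction; the containers and sizes are just stand-ins for illustration:)

    #include <cstddef>
    #include <iterator>
    #include <list>
    #include <vector>

    int main()
    {
        std::vector<int> v(1000);
        std::list<int>   l(1000);

        // Random access iterators: operator- is provided because it is O(1).
        std::ptrdiff_t dv = v.end() - v.begin();

        // Bidirectional iterators: no operator-. The cheap-looking syntax is
        // deliberately absent; std::distance walks the list node by node,
        // i.e. it is O(N).
        // std::ptrdiff_t dl = l.end() - l.begin();   // does not compile
        std::ptrdiff_t dl = std::distance(l.begin(), l.end());

        return (dv == 1000 && dl == 1000) ? 0 : 1;
    }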
> I agree that consistency of the library design is important, but this case is not completely trivial. IMO, this design principle is only a theoretical one. In practice it is often unhelpful and counterproductive: because of this principle, the STL and C++ miss a big selling point.
Okay, let's see why...
> An end user (who pays for and uses a software product) is interested in the best running time of an algorithm, which is determined not only by the costs of specific operations but also by their frequencies. A relatively small number of inefficient operations does not affect the total computational cost of an algorithm.
Sometimes, yes, I agree.

> Also, this design principle makes advanced and/or augmented data structures illegal, even though they are beneficial for many complex algorithms. For example, insert() and erase() of a single element with O(log N) cost are not allowed by the C++ standard, while the significantly less efficient O(N) operations of std::vector are allowed.
I'm not really sure what to make of this paragraph. The standard makes some guarantees for the std:: data structures, but you're free to provide your own augmented data structures with whatever complexity guarantees make sense, and I guarantee you wouldn't be violating any laws.
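To be concrete about what I mean by rolling your own, here is a toy sketch (illustration only; every name in it is made up for this post): a Fenwick tree of element counts over a bounded universe gives O(log U) insert, erase and rank queries, which is already the kind of augmented structure no standard container offers. The general-purpose route, O(log N) insert/erase by position in an arbitrary sequence, would be a balanced tree augmented with subtree sizes, and nothing stops anyone from writing and shipping that as a library.

    #include <cstddef>
    #include <iostream>
    #include <vector>

    // Toy "indexable set" over the bounded universe [0, U).
    // Not production code: no bounds checking, erase() assumes the element
    // is present, and a size-augmented balanced tree would be the general
    // solution for unbounded keys or positional insertion.
    class indexable_set
    {
    public:
        explicit indexable_set(std::size_t universe)
            : tree_(universe + 1, 0), universe_(universe) {}

        void insert(std::size_t x)              // O(log U)
        {
            for (std::size_t i = x + 1; i <= universe_; i += i & (~i + 1))
                ++tree_[i];
        }

        void erase(std::size_t x)               // O(log U)
        {
            for (std::size_t i = x + 1; i <= universe_; i += i & (~i + 1))
                --tree_[i];
        }

        // Number of stored elements strictly less than x.
        std::size_t rank(std::size_t x) const   // O(log U)
        {
            std::size_t r = 0;
            for (std::size_t i = x; i > 0; i -= i & (~i + 1))
                r += tree_[i];
            return r;
        }

    private:
        std::vector<std::size_t> tree_;
        std::size_t universe_;
    };

    int main()
    {
        indexable_set s(100);
        s.insert(5);
        s.insert(42);
        s.insert(7);
        s.erase(7);
        std::cout << s.rank(43) << '\n';   // prints 2 (the elements 5 and 42)
    }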
> I think that the current standard specifications for the computational complexities of operations are to some degree obsolete. They create unnecessary restrictions that will be either amended or removed completely in future versions of the STL.
Strongly disagree, and you haven't provided any examples of these "unnecessary restrictions" (or you aren't being precise in what you mean). Much code and many algorithms rely on the constant-time random access of std::vector for their own complexity bounds; are you suggesting the standard should drop this complexity guarantee?

- Jeff
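P.S. In case that last point sounds abstract, here is a minimal sketch (illustration only) of how a standard algorithm's cost quietly leans on the complexity guarantees of the iterators underneath it:

    #include <algorithm>
    #include <list>
    #include <vector>

    int main()
    {
        std::vector<int> v = {1, 2, 3, 5, 8, 13};
        std::list<int>   l(v.begin(), v.end());

        // O(log N) overall: lower_bound performs O(log N) comparisons, and
        // each "advance by half the range" step is O(1) on a random access
        // iterator.
        auto it_v = std::lower_bound(v.begin(), v.end(), 5);

        // Still O(log N) comparisons, but the iterator stepping is now
        // linear, so the overall cost degrades to O(N). Any user code whose
        // own bound is stated in terms of lower_bound being logarithmic
        // silently breaks if the underlying guarantee goes away.
        auto it_l = std::lower_bound(l.begin(), l.end(), 5);

        return (*it_v == 5 && *it_l == 5) ? 0 : 1;
    }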