
On Thu, Mar 17, 2011 at 3:22 PM, Ted Byers wrote:
> From: boost-users-bounces@lists.boost.org [mailto:boost-users-bounces@lists.boost.org] On Behalf Of Emil Dotchevski
> Sent: March-17-11 5:27 PM
>
>> On Thu, Mar 17, 2011 at 1:58 PM, Ted Byers wrote:
>>> I do have a little sympathy with your, and his, position when dealing with extremely tight time constraints, but not a lot. If one of the design criteria is that the code being produced must be widely portable, then I pass it through the range of platforms and compilers that we have to support, and, as far as possible, I try to treat all warnings as errors.
>> The virtual destructor warning goes directly against a conscious design decision. In my opinion it also teaches programmers a bad habit. This doesn't make the warning any less annoying, of course, so I'm doing my best to suppress it.
> I would like to understand why making destructors virtual when there are other virtual member functions might be seen as a bad habit.
OK, I'm exaggerating; obviously there are worse habits one can be taught. :) A public virtual destructor lets anyone call delete as they please, and in any non-trivial program that isn't a good thing.
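To make the hazard concrete, here is a minimal sketch; the class name is hypothetical:

    struct handler // hypothetical base with a public virtual destructor
    {
        virtual ~handler() {}
        virtual void run() = 0;
    };

    void observe( handler * h )
    {
        // Any code that merely sees a handler* can destroy the object,
        // whether or not it owns it; the ownership policy is enforced
        // only by convention.
        delete h; // compiles fine
    }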
> What, precisely, was that conscious design decision, and what is the rationale for it?
"The error_info_base class does not support deleting objects of derived types polymorphically" (and virtual is used to indicate that a particular operation is polymorphic, which would be misleading in this case.)
> It is my experience that when two experienced developers disagree about a practice, the disagreement is born of differences in the nature of the problems they have faced in the past and the information they have at their disposal. If you were to look at the applications I develop, you'd find very few objects created on the stack. Almost everything goes on the heap, managed by the most appropriate of the Boost smart pointers.
>
> In my environmental modelling software, for example, the application starts off with almost nothing on the heap, but as the user builds the model, he may end up producing hundreds or even thousands of instances of sometimes complex UDTs, and these UDTs are often drawn from complex inheritance trees (but almost never involving multiple inheritance ;-). Connections between these instances can often be quite complex, so there is, in the base class, a function that breaks all connections among the objects before any attempt is made to delete anything. Because the number of UDTs is quite large, and there is a common modelling interface exposed as pure virtual functions in the base class, all these objects are managed in a single std::vector of smart pointers to the base class.
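For readers following along, a rough sketch of the arrangement Ted describes; the names are hypothetical, and boost::shared_ptr is an assumption where he says only "the most appropriate of the Boost smart pointers":

    #include <boost/shared_ptr.hpp>
    #include <vector>

    class model_object // hypothetical common modelling interface
    {
    public:
        virtual ~model_object() {}            // public virtual destructor
        virtual void simulate() = 0;          // pure virtual in the base
        virtual void break_connections() = 0; // sever links between objects
    };

    typedef std::vector< boost::shared_ptr<model_object> > model;

    void clear_model( model & m )
    {
        // Break every inter-object connection first, so that no object is
        // destroyed while another still refers to it, then release them all.
        for( model::size_type i = 0; i < m.size(); ++i )
            m[i]->break_connections();
        m.clear();
    }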
What is the reason for not storing shared_ptrs in that std::vector? Wouldn't a protected and non-virtual destructor be more appropriate in that case?
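The point of the suggestion: shared_ptr records how to delete the object at construction time, from the most derived pointer type it is given, so the correct destructor runs even though the base destructor is not virtual. A sketch of the revised base under that assumption (names again hypothetical):

    #include <boost/shared_ptr.hpp>
    #include <vector>

    class model_object // the same hypothetical base, revised
    {
    public:
        virtual void simulate() = 0;

    protected:
        ~model_object() {} // protected and non-virtual: client code cannot
                           // delete through a model_object*
    };

    class pond : public model_object // hypothetical concrete type
    {
    public:
        void simulate() {}
    };

    int main()
    {
        std::vector< boost::shared_ptr<model_object> > m;

        // The shared_ptr constructor sees a pond*, so the deleter it
        // stores calls delete on a pond*; ~pond runs correctly without
        // any virtual dispatch.
        m.push_back( boost::shared_ptr<model_object>( new pond ) );

        m.clear(); // destruction goes through the stored deleter
    }

Emil Dotchevski
Reverge Studios, Inc.
http://www.revergestudios.com/reblog/index.php?n=ReCode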