
On Tue, Jul 8, 2008 at 9:25 AM, Phil Endecott <spam_from_boost_dev@chezphil.org> wrote:
Steven Ross wrote:
- Use new and delete rather than *alloc and free.
- Can any exceptions occur? If so, is the dynamically allocated memory leaked? Could you use a smart pointer to avoid this?
I was referring to exceptions in general between allocation and de-allocation, not just during allocation itself.
I guess the user-defined types could throw exceptions from their >> and < operators, so I'll have to make it exception-safe. Good point.
Using *alloc, the functions return NULL if allocation fails, and that
situation is handled. With new and delete I'll have to catch memory allocation exceptions.
See new(nothrow).
Right, thanks for the reminder. That plus smart pointers should make sure there are no memory leaks if an exception is thrown. I could use a try-catch, but std::sort doesn't, so I'll just let the user catch exceptions from their own operator overloads.
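Something like this is what I have in mind (just a rough, untested sketch; the function and variable names are made up, not the actual library code):

#include <cstddef>                    // std::size_t
#include <new>                        // std::nothrow
#include <boost/scoped_array.hpp>

template <class T>
bool sort_with_scratch(T *first, T *last)
{
    std::size_t count = last - first;
    // nothrow new returns NULL on allocation failure instead of throwing.
    T *raw = new(std::nothrow) T[count];
    if (raw == 0)
        return false;                 // caller can fall back to std::sort
    // scoped_array deletes the buffer even if a user-defined
    // operator>> or operator< throws further down.
    boost::scoped_array<T> scratch(raw);
    // ... bucketing work that may call user-supplied operators ...
    return true;
}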
- Does it work with uint64_t? I see some variables declared as 'long',
which often can't store a uint64_t.
There is the problem of how to identify and deal with negatives. I try to avoid forcing any particular return data type.
Why not just use Iterator::value_type (or whatever it's called; or some iterator_traits thing) everywhere?
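I.e. something along these lines everywhere a temporary element is needed (untested, names invented):

#include <iterator>

template <class RandomAccessIter>
void sort_pass(RandomAccessIter first, RandomAccessIter last)
{
    // Take the element type from the iterator itself rather than
    // declaring temporaries as 'long' or another fixed-width type.
    typedef typename std::iterator_traits<RandomAccessIter>::value_type
        value_type;
    value_type current = *first;
    // ...
}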
They are already divided
by 512, so an unsigned should fit inside a signed data type without trouble.
Dividing a 64-bit value by 512 does not make it small enough to fit in a 32-bit long. (Have I misunderstood you?)
I was suggesting using an int64_t, which can hold the divided 64-bit value, so yes, there was a misunderstanding. The problem is that I need to use the return type of the user's >> method, and with the same code supporting any size of data and both signed and unsigned integers, I need a data type that can handle all the different possibilities. Is there some way I can grab the return type of the user's >> method and use it directly? Otherwise, I think int64_t should work and should be fast until people start using 128-bit values.
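One possibility might be to have the caller pass the shift as a functor with a result_type typedef, the way the standard function objects do, and key the internal arithmetic off that typedef (rough, untested sketch; the struct and function names are invented):

#include <boost/cstdint.hpp>

struct my_struct { boost::uint64_t key; };

// Hypothetical user-supplied shift functor; its result_type typedef
// exposes the return type of the shift, so the sort can use that type
// directly instead of assuming int64_t.
struct rightshift {
    typedef boost::uint64_t result_type;
    result_type operator()(const my_struct &x, unsigned offset) const
    { return x.key >> offset; }
};

template <class RandomAccessIter, class RightShift>
void sort_core(RandomAccessIter first, RandomAccessIter last, RightShift shift)
{
    // All internal bucket arithmetic uses the type the user's shift returns.
    typedef typename RightShift::result_type key_type;
    key_type max_key = shift(*first, 0);
    // ...
}

C++0x decltype (or Boost.Typeof) could deduce the same thing without the extra typedef, but the typedef keeps it portable to current compilers.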