
On Sat, Oct 23, 2010 at 3:00 PM, Domagoj Saric <dsaritz@gmail.com> wrote:
"Daniel Walker" <daniel.j.walker@gmail.com> wrote in message news:AANLkTikXL0i5Z+NoO2a3TiB8q2RmTk4oHJ1kzOyR-7vE@mail.gmail.com...
Do you have any suggestion for how to quantify this? I've run some simple benchmarks, but in optimized object code the time overhead of boost::function is so small it's hard to measure.
Those benchmarks are obviously lacking... I've given you examples that show that the various overheads of the current implementation are quite measurable...
Sorry, I must have missed your examples. Here's a simple benchmark that we could use:

#include <cassert>
#include <climits>
#include <iostream>
#include <boost/function.hpp>
#include <boost/timer.hpp>

template<class T>
double benchmark(T f, int n)
{
    assert(f);
    boost::timer t;
    for (int i = 0; i < n; ++i)
        f(i);
    return t.elapsed();
}

int echo(int x) { return x; }

int main()
{
    int n = INT_MAX;
    boost::function<int(int)> f = &echo;
    double baseline = benchmark(&echo, n);
    double t = benchmark(f, n);
    assert(baseline < t);
    std::cout << "boost::function overhead = "
              << (t - baseline) / n << " seconds" << std::endl;
    return 0;
}

On my machine, this measures the overhead of boost::function at about 2 nanoseconds per call, though, obviously, this is a statistical inference, and quantities of time that small are hard to measure.

Does anyone have other suggestions for how to benchmark boost::function?

Daniel Walker
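
[Editor's sketch, not part of the original message: one guard against the "hard to measure in optimized code" problem is to make sure the compiler cannot discard the calls. The variant below mirrors the benchmark above but accumulates each result into a volatile sink; the sink variable and the per-call wording are assumptions added for illustration.]

#include <cassert>
#include <climits>
#include <iostream>
#include <boost/function.hpp>
#include <boost/timer.hpp>

volatile int sink = 0; // volatile sink: keeps the optimizer from eliding the calls

template<class T>
double benchmark(T f, int n)
{
    assert(f);
    boost::timer t;
    for (int i = 0; i < n; ++i)
        sink += f(i);   // each call's result must be kept
    return t.elapsed();
}

int echo(int x) { return x; }

int main()
{
    int n = INT_MAX;
    boost::function<int(int)> f = &echo;
    double baseline = benchmark(&echo, n);
    double overhead = benchmark(f, n);
    std::cout << "boost::function overhead = "
              << (overhead - baseline) / n << " seconds per call" << std::endl;
    return 0;
}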