
Marsh Ray <marsh@extendedsubset.com> wrote in news:4ED7BF7F.8050404@extendedsubset.com:
> On 11/29/2011 08:31 PM, Kenneth Porter wrote:
>> 10.0\release\asynch-exceptions-on\threading-multi\store_now_in_vector_static.output: steady_clock 636746989 nanoseconds
>> So about 650 nanoseconds using a 32-bit build on Vista 64 Ultimate.
> Some of that may reflect quantization error.
> E.g., the clock output might be truncated to microsecond precision, which introduces a 500 ns error on average, and the actual read overhead is something like 150 ns.
The profiling program reads the clock a million times, storing the results in a pre-allocated array, and times the whole operation. (An initial run that constructs a million time_points is used to factor out the loop and array member constructor time.) How would microsecond jitter affect the overall operation to that degree?