
Russell Hind <rhind@mac.com> writes:
Anthony Williams wrote:
On Windows, you could use GetTickCount to do the timing, with GetSystemTimeAsFileTime and SystemTimeToFileTime to get the limits --- e.g.

    const SYSTEMTIME suppliedEndTime = ...;
    FILETIME endTime = {0};
    SystemTimeToFileTime(&suppliedEndTime, &endTime);
    FILETIME startTime = {0};
    GetSystemTimeAsFileTime(&startTime);
    DWORD const initialTick = GetTickCount();
    ULONGLONG const diff =
        reinterpret_cast<ULARGE_INTEGER const&>(endTime).QuadPart -
        reinterpret_cast<ULARGE_INTEGER const&>(startTime).QuadPart;
    ULONGLONG const elapsedTicks = diff / 10000; // 100ns units -> ms
    ASSERT(elapsedTicks <= ULONG_MAX);
    while ((GetTickCount() - initialTick) < elapsedTicks)
    {
        doStuff();
    }
TickCount can wrap around, though (it's a 32-bit millisecond counter, so roughly every 49.7 days).
Yes, that's what the ASSERT checks. How often do you schedule a software event for 40+ days into the future?
There is also QueryPerformanceCounter for a high-resolution tick count; QueryPerformanceFrequency will give you the frequency of the counter, as it varies from system to system.
Yes. I expect that the wrap-around for that is longer, since it is 64-bit. Even if it is set to the processor speed, on a 4GHz CPU you would still have 2^32 seconds, which is a good few years. I can't imagine there are any systems where the resolution is better than the CPU frequency (what good would it do?). There doesn't seem to be any comment about the minimum resolution, but I guess it should be at least as good as GetTickCount, otherwise it wouldn't qualify as "High Performance". It's slightly more work to use due to the system variability, but not a lot. Anthony -- Anthony Williams Senior Software Engineer, Beran Instruments Ltd.