
Caleb Epstein <caleb.epstein <at> gmail.com> writes:
> As far as the appropriate subseconds type goes, we should probably pick the highest-possible resolution that makes sense, which I'd contend is probably microseconds. Some operating systems may be able to slice time (and signal events) at resolutions below milliseconds, but I doubt any can go deeper than microseconds.
I wouldn't take that bet. I know Mac OS X can measure time as finely as nanoseconds (though I have no idea how many services, e.g. sockets, actually work at nanosecond resolution). Given the way technologies advance, it doesn't seem outside the realm of possibility that within a few short years microseconds simply won't be fine enough. One of the nice things about double-as-time-unit is that it avoids resolution issues altogether.
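
For what it's worth, here is a minimal sketch of nanosecond-granularity timing on Mac OS X using the Mach interfaces (mach_absolute_time and mach_timebase_info are the real calls; everything around them is just illustration). The last step converts the raw tick count to a double of seconds, which carries whatever fraction the clock delivers without baking a tick size into the type:

    #include <mach/mach_time.h>
    #include <stdint.h>
    #include <stdio.h>

    int main() {
        mach_timebase_info_data_t tb;
        mach_timebase_info(&tb);            // ratio converting raw ticks to nanoseconds

        uint64_t t0 = mach_absolute_time(); // raw, high-resolution tick count
        uint64_t t1 = mach_absolute_time();

        // elapsed nanoseconds between two back-to-back reads
        uint64_t ns = (t1 - t0) * tb.numer / tb.denom;

        // double-as-time-unit: plain seconds, no fixed resolution in the type
        double seconds = (double)ns / 1e9;

        printf("delta: %llu ns (%.12f s)\n", (unsigned long long)ns, seconds);
        return 0;
    }

Note that the resolution of the clock and the resolution of the representation are separate questions; the double sidesteps only the latter.

Bob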