My application runs on an embedded system that has no battery-backed real-time clock, so when it starts the date is always 1-1-1970. The first time a client application connects, it synchronizes the time, and the system time jumps to 2008. This has drastic consequences for this sort of code:

    void MyClass::TimerCallback(void)
    {
        // do important periodic activity here...

        // Advance the absolute deadline by one period and re-arm the timer.
        myTimer->expires_at(myTimer->expires_at() + boost::posix_time::milliseconds(100));
        myTimer->async_wait(boost::bind(&MyClass::TimerCallback, this));
    }

Since the expires_at() value is an absolute time, the timer still thinks it is 1970, moving forward 100 ms at a time. Every new deadline is therefore already decades in the past, so the timer expires immediately and my CPU load goes to 100% as the callback loops trying to catch up. This is not a bug on the part of deadline_timer - it is doing exactly what I told it to. The question is how I should change the code to cope with time changes. I see several options:

1) The API call that sets the time on my embedded device is within my application, so I know when an outside party has changed the time. I could send a boost::signal to the rest of the application and write code in each module that uses a deadline_timer this way to reset its timer.

2) Before setting a new expires_at() value, I could calculate the interval between now() and expires_at(). If it is more than double the periodic delay, I would use expires_from_now() to resynchronize with the new absolute time.

3) I could calculate the expiry time from absolute time:

    now() - (now() % milliseconds(100)) + milliseconds(100)

so each expiry would always land on the most recent 100 ms boundary plus one period.

4) Only use expires_from_now(). This is the simplest solution, but there are parts of my application where a more accurate timer period is preferable.

These approaches all have advantages and drawbacks. I have appended rough sketches of options 2, 3 and 4 at the end of this message. Does anyone have comments or experience with this issue?

Thanks,
Jeff
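
Sketch of option 2 (untested). I am assuming the timer uses the default time_traits, so the clock it compares against is boost::posix_time::microsec_clock::universal_time(); I have also only handled the clock jumping forward, which is the case I actually hit at boot:

    // Needs <boost/asio.hpp>, <boost/bind.hpp> and
    // <boost/date_time/posix_time/posix_time.hpp>, as in the current code.
    void MyClass::TimerCallback(void)
    {
        // do important periodic activity here...

        const boost::posix_time::time_duration period =
            boost::posix_time::milliseconds(100);

        // Same clock that deadline_timer's default time_traits use.
        boost::posix_time::ptime now =
            boost::posix_time::microsec_clock::universal_time();

        if (now - myTimer->expires_at() > period + period)
        {
            // More than two periods behind the deadline that just fired:
            // the wall clock has jumped (or we fell badly behind), so
            // resynchronize relative to the new "now".
            myTimer->expires_from_now(period);
        }
        else
        {
            // Normal case: advance the absolute deadline, drift-free.
            myTimer->expires_at(myTimer->expires_at() + period);
        }

        myTimer->async_wait(boost::bind(&MyClass::TimerCallback, this));
    }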
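
Sketch of option 3, aligning every expiry to a 100 ms boundary of the current (UTC) day. Also untested; I rebuild the deadline from today's date plus a whole number of periods since midnight so that it lands exactly on a boundary no matter where the clock has jumped to:

    void MyClass::TimerCallback(void)
    {
        // do important periodic activity here...

        using namespace boost::posix_time;

        const long long period_ms = 100;
        ptime now = microsec_clock::universal_time();

        // Whole milliseconds since midnight, rounded down to the last
        // period boundary and then advanced by one period.
        long long since_midnight_ms = now.time_of_day().total_milliseconds();
        long long next_ms = (since_midnight_ms / period_ms + 1) * period_ms;

        // Rebuild the absolute deadline from today's date plus that offset;
        // at midnight rollover this simply lands on the next day's 00:00.
        myTimer->expires_at(ptime(now.date()) + milliseconds(next_ms));
        myTimer->async_wait(boost::bind(&MyClass::TimerCallback, this));
    }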
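
And option 4, for completeness. For the forward jump at boot this recovers within one tick, because every deadline is set relative to the current time, but whatever time the periodic work and handler dispatch take is added to every interval, which is the accuracy loss I mentioned:

    void MyClass::TimerCallback(void)
    {
        // do important periodic activity here...

        // Relative deadline: each period is measured from the moment this
        // handler runs, so the period stretches by however long the work
        // above took, but a forward wall-clock jump disturbs at most one tick.
        myTimer->expires_from_now(boost::posix_time::milliseconds(100));
        myTimer->async_wait(boost::bind(&MyClass::TimerCallback, this));
    }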