Yes, it is slightly more work than if there were a state_machine::is_processing() function. I'm just not sure whether your use case is sufficiently common to justify adding such a function. Maybe you could provide more details so that I understand it better?
Sure, I'll try. My state machine is coupled to a connection on which I must comply with some protocol; my layer is a middleware. Each time something interesting happens, such as data arriving or an event occurring, I call the corresponding method of a user-supplied handler from a react method of the state machine. If the user wants to close the connection, I run into the problem I described, because:

- either he decides to close the connection from a callback, which means the state machine is in the middle of processing the event that triggered the call to the handler, so I must use post_event;
- or he decides to close the connection based on an event that did not come from the connection, in which case I must use process_event.

I have simplified the problem, but that is the idea. If my state machine calls a handler over which I don't have total control, and I supply a few methods that have to go through the state machine, then I need to choose between process_event and post_event.

In the simple case I could also provide two close methods for the handler: one that must be called from callbacks and one for everywhere else. I don't think that is a clean way to solve the issue. Every method in my interface would need two versions, and robustness becomes really weak, because calling the wrong method in the wrong place results in a crash.

At this point, maybe my design is the weak part and I shouldn't call the handler's callbacks directly from the react methods. Instead I could use an event queue to talk to the handler: in the react methods I would post events to the queue and make sure it is dequeued from outside the state machine. But that involves (much) more work...

Hope my explanations are clear ;) What is your point of view on this?

Regards,
Philippe
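P.S. To make the two cases concrete, here is a stripped-down sketch. All names are invented; this is not the real library API, just a toy machine that mimics the process_event / post_event semantics I am talking about:

    #include <cassert>
    #include <deque>
    #include <functional>
    #include <iostream>
    #include <string>

    struct Event { std::string name; };

    class Machine {
    public:
        // Synchronous dispatch; must NOT be called while already processing.
        void process_event(const Event& e) {
            assert(!processing_ && "re-entrant process_event: this is the crash I mentioned");
            processing_ = true;
            react(e);                       // may call the user handler
            processing_ = false;
            drain();                        // now handle anything posted meanwhile
        }

        // Safe to call from inside a reaction: it only queues the event.
        void post_event(const Event& e) { posted_.push_back(e); }

        // User-supplied handler, invoked from inside react().
        std::function<void(Machine&)> on_data;

    private:
        void react(const Event& e) {
            std::cout << "reacting to " << e.name << "\n";
            if (e.name == "Data" && on_data)
                on_data(*this);             // the machine is mid-processing here
        }
        void drain() {
            while (!posted_.empty()) {
                Event e = posted_.front();
                posted_.pop_front();
                process_event(e);
            }
        }
        bool processing_ = false;
        std::deque<Event> posted_;
    };

    int main() {
        Machine m;

        // Case 1: the user closes from within the callback -> must use post_event.
        // If the lambda called m.process_event() instead, the assert above would fire.
        m.on_data = [](Machine& self) { self.post_event(Event{"Close"}); };
        m.process_event(Event{"Data"});

        // Case 2: the user closes outside any callback -> process_event is fine.
        m.on_data = nullptr;
        m.process_event(Event{"Close"});
    }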
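And here is roughly what I have in mind for the event-queue alternative, again with made-up names. The react methods never call the handler directly; they only queue notifications, and the middleware delivers them once the state machine has returned, so the handler always sees an idle machine and a single close() is enough:

    #include <deque>
    #include <functional>
    #include <iostream>
    #include <string>

    struct Notification { std::string what; };

    class Middleware {
    public:
        std::function<void(Middleware&, const Notification&)> handler;

        // Called by the transport when bytes arrive.
        void on_bytes() {
            process_event("Data");          // the machine runs to completion
            deliver();                      // the handler is called outside the machine
        }

        // Public request from the user; always safe, no post_event needed.
        void close() { process_event("Close"); }

    private:
        void process_event(const std::string& name) {
            std::cout << "processing " << name << "\n";
            if (name == "Data")             // the reaction only queues, never calls out
                pending_.push_back(Notification{"data ready"});
        }
        void deliver() {
            while (!pending_.empty()) {
                Notification n = pending_.front();
                pending_.pop_front();
                if (handler) handler(*this, n);   // user may call close() here safely
            }
        }
        std::deque<Notification> pending_;
    };

    int main() {
        Middleware mw;
        mw.handler = [](Middleware& self, const Notification& n) {
            std::cout << "user sees: " << n.what << " -> closing\n";
            self.close();                   // no re-entrancy: the machine is idle here
        };
        mw.on_bytes();
    }

The downside is what I said above: every notification has to be duplicated as a queued object and delivered in a second pass, which is the "(much) more work" part.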