Thanks, I just posted irecv before isend. Master node: request = irecv(); do local work; isend(message to worker nodes); wait(request). Worker node: while (still have new tasks) { recv(message); do local work; isend(result message to master) }. If there is only one task per worker, it works. But if there are 2 tasks for the workers, the master cannot get the results from a worker: it waits forever, and seems to miss results when there is more than one task for a worker to run. Any help is appreciated. Jack
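One likely cause, sketched below under the assumption that the code uses Boost.MPI: a single irecv() matches only one incoming message, so a master expecting two results must post two pending receives (or re-post the receive in a loop). The tag and variable names here are hypothetical, not from the original code:

```cpp
// Master-side sketch (Boost.MPI assumed; names are illustrative).
// One pending receive is posted per expected worker result, then
// wait_all() blocks until every result has arrived.
#include <boost/mpi.hpp>
#include <string>
#include <vector>

namespace mpi = boost::mpi;

int main(int argc, char* argv[]) {
    mpi::environment env(argc, argv);
    mpi::communicator world;

    const int result_tag = 1;  // assumed tag; must match the workers' isend tag

    if (world.rank() == 0) {   // master
        std::vector<std::string> results(world.size() - 1);
        std::vector<mpi::request> reqs;
        // Two expected results need two posted irecv()s, not one.
        for (int w = 1; w < world.size(); ++w)
            reqs.push_back(world.irecv(w, result_tag, results[w - 1]));

        // ... do local work, isend() tasks to the workers, etc. ...

        mpi::wait_all(reqs.begin(), reqs.end());  // completes once ALL results arrive
    }
    return 0;
}
```

Requires compiling against Boost.MPI and launching with mpirun; it is a pattern sketch rather than a drop-in fix.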
Date: Mon, 28 Jun 2010 16:46:56 +0200
From: riccardo.murri@gmail.com
To: boost-users@lists.boost.org
Subject: Re: [Boost-users] boostMPI asychronous communication
Hi Jack,
On Mon, Jun 28, 2010 at 4:00 PM, Jack Bryan wrote:
> MPI_irecv(); do other work; MPI_wait(). But my message receiver is much slower than the sender. While the receiver is doing its local work, the senders have already sent their messages; at that point the receiver is too busy with its local work to post MPI_irecv to get the messages from the senders.
If you know what messages the receiver is going to receive, you can post your irecv() *before* starting the compute-intensive loop.
If you can't post the irecv() before the busy loop, MPI will buffer messages for you, up to some implementation-defined limit: "Send of all modes [...] can be started whether a matching receive has been posted or not [...] If the call causes some system resources to be exhausted, then it will fail and return an error code." (MPI 2.1 spec, sec 3.7, page 48 of the printed edition)
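Riccardo's first suggestion can be sketched as follows, again assuming Boost.MPI; the tag and the do_local_work() placeholder are hypothetical:

```cpp
// Receiver-side sketch (Boost.MPI assumed): post the irecv() BEFORE the
// compute-intensive loop, so MPI can deliver the message into the buffer
// while the receiver is busy.
#include <boost/mpi.hpp>
#include <string>

namespace mpi = boost::mpi;

// Hypothetical stand-in for the receiver's compute-intensive work.
static void do_local_work() { /* ... long computation ... */ }

int main(int argc, char* argv[]) {
    mpi::environment env(argc, argv);
    mpi::communicator world;

    const int tag = 0;     // assumed tag; must match the sender's
    std::string incoming;  // buffer the message will land in

    // Post the receive first ...
    mpi::request req = world.irecv(mpi::any_source, tag, incoming);

    // ... then run the busy loop; the message can arrive in the meantime.
    do_local_work();

    // Usually completes immediately if the send already happened.
    req.wait();
    return 0;
}
```

As with any MPI program, this needs to be launched under mpirun with at least one sending rank; it illustrates the ordering, not a complete application.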
You might be able to get better help on an MPI-specific forum.
Best regards,
Riccardo

_______________________________________________
Boost-users mailing list
Boost-users@lists.boost.org
http://lists.boost.org/mailman/listinfo.cgi/boost-users