Dear Colleague,

A bug has been discovered in the parallel version of the Dissipative Particle Dynamics code in DL_MESO 2.4.

For systems involving hard surfaces, no boundary halo data is required next to the surfaces: these boundaries are identified for each subdomain using the Boolean array srflgc. The export of particle data therefore does not need to take place across boundaries containing hard surfaces. The export routines in domain_module.f90, however, rely on each processor sending data to and receiving data from two different processors. Omitting the boundary halo exchange across hard surfaces, as the current implementation does, therefore causes the program to hang due to orphaned (unmatched) MPI calls.

To solve this problem, the exportdata, exportvelocitydata and exportdensitydata subroutines in domain_module.f90 need to be modified so that the export routines are called regardless of whether a surface exists, while still ensuring that no particle data is sent across solid boundaries. This can be achieved by modifying the position range for the relevant direction whenever a solid boundary exists, so that no particle can be added to the boundary halo message buffer. The following lines should be inserted immediately before the corresponding CALL statements in the three subroutines:

   IF (srflgc(1)) final = -rhalo
   IF (srflgc(2)) begin = sidex + rhalo
   IF (srflgc(3)) final = -rhalo
   IF (srflgc(4)) begin = sidey + rhalo
   IF (srflgc(5)) final = -rhalo
   IF (srflgc(6)) begin = sidez + rhalo

With these in place, the IF statements inside the export routines will never find a particle that qualifies for copying into a boundary halo. The IF statements should also be removed from the lines calling the export routines. For instance, in exportdata, the following line (1554)

   IF (.NOT. srflgc(1)) CALL export (nlimit, 1, map (1), begin, final, sidex)

should be changed to

   IF (srflgc(1)) final = -rhalo
   CALL export (nlimit, 1, map (1), begin, final, sidex)

It should be noted that the serial version of the module, domain_module_ser.f90, does not need to be modified, as it does not rely on synchronized MPI calls.

This bug fix has been applied to the current DL_MESO release. Registered users of version 2.4 may download the corrected version without re-registering and decrypt it using the same password.

Michael Seaton
2011-06-07
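
P.S. For illustration only, the self-contained Fortran/MPI sketch below is not taken from the DL_MESO source; the two-rank set-up, the particle positions and the begin/final/rhalo/sidex values are assumptions made for the example. It shows why both sides of an exchange must reach the communication call, and how shrinking the position window empties the message buffer without unmatching the MPI calls:

   PROGRAM halo_pair_sketch
   ! Minimal sketch of the pairing issue described above: both ranks must
   ! reach the exchange call, even when a hard surface means one of them
   ! has nothing to send.  Shrinking the position window (final = -rhalo)
   ! empties the send buffer while keeping the MPI calls matched.
   USE mpi
   IMPLICIT NONE
   INTEGER :: ierr, rank, nprocs, partner, nsend, nrecv, i
   INTEGER :: stat (MPI_STATUS_SIZE)
   REAL (KIND=8) :: begin, final, rhalo, sidex
   REAL (KIND=8) :: x (4), sendbuf (4), recvbuf (4)
   LOGICAL :: surface

   CALL MPI_INIT (ierr)
   CALL MPI_COMM_RANK (MPI_COMM_WORLD, rank, ierr)
   CALL MPI_COMM_SIZE (MPI_COMM_WORLD, nprocs, ierr)
   IF (nprocs /= 2) THEN
     IF (rank == 0) WRITE (*,*) 'run this sketch with exactly 2 MPI ranks'
     CALL MPI_FINALIZE (ierr)
     STOP
   END IF
   partner = 1 - rank

   rhalo = 0.5d0                            ! assumed halo width
   sidex = 10.0d0                           ! assumed subdomain length in x
   x = (/ 0.2d0, 3.0d0, 9.7d0, 9.9d0 /)     ! assumed particle positions
   begin = sidex - rhalo                    ! normal window: particles near the face
   final = sidex
   surface = (rank == 0)                    ! pretend rank 0 sits against a hard surface

   ! The fix in words: do NOT skip the exchange when a surface is present;
   ! make the position window impossible to satisfy instead.
   IF (surface) final = -rhalo

   nsend = 0
   DO i = 1, SIZE (x)
     IF (x (i) >= begin .AND. x (i) <= final) THEN
       nsend = nsend + 1
       sendbuf (nsend) = x (i)
     END IF
   END DO

   ! Both ranks always reach this call, so the send and receive stay
   ! matched even when nsend is zero on the surface-bounded side.
   CALL MPI_SENDRECV (sendbuf, nsend, MPI_DOUBLE_PRECISION, partner, 0, &
                      recvbuf, SIZE (recvbuf), MPI_DOUBLE_PRECISION,    &
                      partner, 0, MPI_COMM_WORLD, stat, ierr)
   CALL MPI_GET_COUNT (stat, MPI_DOUBLE_PRECISION, nrecv, ierr)
   WRITE (*,*) 'rank', rank, 'sent', nsend, 'received', nrecv

   CALL MPI_FINALIZE (ierr)
   END PROGRAM halo_pair_sketch

Run with two MPI ranks, rank 0 (which pretends to sit against a hard surface) reports zero particles sent while the exchange still completes; skipping the MPI_SENDRECV call on rank 0 instead would leave rank 1 waiting indefinitely, which is the hang described above.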