
One-sided communication: How does it work?


In this article we will point out the differences between one-sided communication and two-sided communication. At the same time, you will learn some new terms used in one-sided communication.

Let’s take a brief glance at two-sided communication before we look at one-sided communication:

In two-sided communication there is a sender and a receiver, and both processes take an active part in the communication. Each side has to call a communication routine: the sender calls MPI_Send and the receiving process calls MPI_Recv. Each process can act as both sender and receiver.
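To make this concrete, here is a minimal two-sided sketch in C. The ranks, tag and buffer value are chosen purely for illustration; it assumes MPI has already been initialized and at least two processes are running.

    #include <mpi.h>

    /* Minimal two-sided sketch: rank 0 sends one integer to rank 1. */
    void two_sided_example(int rank)
    {
        int data = 42;   /* illustrative payload */

        if (rank == 0) {
            /* The sender is active: it calls MPI_Send. */
            MPI_Send(&data, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* The receiver is also active: it must call a matching MPI_Recv. */
            MPI_Recv(&data, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        }
    }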

Figure 1: Sender and receiver

In one-sided communication only one process is active: the so-called origin process. In the diagram below, process 0 acts as the origin process. With MPI_Put, process 0 sends data to the so-called target process, process 1, which receives this data without calling any receive routine. The execution of a put operation is therefore similar to a send executed by the origin process together with a matching receive executed by the target process, with the difference that all arguments are provided by a single call executed by the origin process.
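As a rough sketch, the put in Figure 2 might look as follows in C. The buffer name, counts and the window handle win are illustrative assumptions; a window is a term explained further below, and in real code the call would also have to sit inside an RMA synchronization epoch (for example one opened by MPI_Win_fence), which the course covers later.

    #include <mpi.h>

    /* Sketch of the put in Figure 2, executed by the origin (process 0).
       The target (process 1) calls nothing. Assumes a window "win" exists
       and the call is inside a synchronization epoch (covered later). */
    void put_to_target(int *origin_buf, int count, MPI_Win win)
    {
        MPI_Put(origin_buf,       /* local buffer holding the data to send    */
                count, MPI_INT,   /* element count and datatype on the origin */
                1,                /* rank of the target process               */
                0,                /* displacement into the target's window    */
                count, MPI_INT,   /* element count and datatype on the target */
                win);             /* the window exposing the target's memory  */
    }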

Figure 2: Origin and target process

If the origin process wants to get data from the target process, it calls MPI_Get. It works similarly to MPI_Put, except that the direction of data transfer is reversed. MPI_Get is therefore equivalent to the execution of a send by the target process and a corresponding receive by the origin process.
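The corresponding get, under the same illustrative assumptions as the put sketch above, looks like this:

    #include <mpi.h>

    /* Sketch of the matching get: the same argument list, but the data
       flows from the target's window into the origin's local buffer. */
    void get_from_target(int *origin_buf, int count, MPI_Win win)
    {
        MPI_Get(origin_buf,       /* local buffer that receives the data      */
                count, MPI_INT,   /* element count and datatype on the origin */
                1,                /* rank of the target process               */
                0,                /* displacement into the target's window    */
                count, MPI_INT,   /* element count and datatype on the target */
                win);
    }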

MPI_Put and MPI_Get are called Remote Memory Access (RMA) operations.

In the diagram above you can also see the new term window. A window is a region of a process's memory that can be seen and accessed by other processes. Later in the course we will explain how to create and allocate windows within the communicator.

 

Typically all processes are both origin and target

 

Each process can act as both an origin and a target process. Their interaction is shown in the following sequence of figures, to give you a first impression of how one-sided operations are embedded into MPI processes:

Figure 3: Four MPI processes

As indicated previously, all processes may be both origin and target. Here we see an example with four MPI processes.

The memory of each MPI process is fully protected against access from outside, that is, from other MPI processes.

Figure 4: Two-sided communication

Here is an example of two-sided communication (normal MPI_Send and MPI_Recv). One process calls MPI_Send, which reads out the data from the local send buffer. The corresponding MPI_Recv in the other process then stores the data in the local receive buffer. It is the job of the MPI library to transport the data from the sending process to the receiving process.

Figure 5: One-sided communication: window creation

Now we will look at one-sided communication:

Each process has to specify a portion of its memory that should be accessible from the outside. With a collective call to MPI_Win_create, all processes in the communicator make their windows accessible, as in the sketch below.
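A minimal sketch of such a collective window creation, with an illustrative buffer and size that are not taken from the article, might look like this:

    #include <mpi.h>

    #define WIN_SIZE 16   /* illustrative number of integers to expose */

    /* Every process exposes a local buffer of WIN_SIZE integers as its
       window. MPI_Win_create is collective over the communicator, so all
       processes must call it. */
    void create_window(int *win_buf, MPI_Win *win)
    {
        MPI_Win_create(win_buf,                  /* local memory to expose  */
                       WIN_SIZE * sizeof(int),   /* window size in bytes    */
                       sizeof(int),              /* displacement unit       */
                       MPI_INFO_NULL, MPI_COMM_WORLD, win);
    }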

Figure 6: One-sided communication: MPI_Put and MPI_Get

In this example we can then use MPI_Put to store data from a local send buffer into the window of a remote process, or MPI_Get to fetch data from a remote window and store it in a local buffer.
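Putting the pieces together, here is a small, self-contained sketch of Figure 6, with illustrative buffer names and values. It assumes at least two processes; rank 0 acts as origin and rank 1 as target. The MPI_Win_fence calls bracket the RMA operations so that they are guaranteed to complete; this synchronization is explained later in the course.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, win_buf = 0, send_buf = 42, recv_buf = -1;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Collective: every process exposes one integer as its window. */
        MPI_Win_create(&win_buf, sizeof(int), sizeof(int),
                       MPI_INFO_NULL, MPI_COMM_WORLD, &win);

        MPI_Win_fence(0, win);             /* open the access epoch     */
        if (rank == 0)                     /* only the origin calls RMA */
            MPI_Put(&send_buf, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);             /* the put is now complete   */
        if (rank == 0)
            MPI_Get(&recv_buf, 1, MPI_INT, 1, 0, 1, MPI_INT, win);
        MPI_Win_fence(0, win);             /* the get is now complete   */

        if (rank == 0)
            printf("fetched %d back from the window of rank 1\n", recv_buf);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }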

You can see that there is no Remote Memory Access call on the target window process.

The origin side is the only process that must call the RMA routines (MPI_Put and MPI_Get).

Figure 7: Windows are peepholes into process memory

In this sense, windows are peepholes into the memory of the MPI processes.

With the RMA routines, an origin process can put data into remote windows or get data out of them.

Congratulations: you have now learned some new terms, namely origin process, target process and window.

© HLRS