Hands-on: Using C-functions

In this exercise you can practice using C-functions in Cython modules. The code for this exercise is located under cython/c-functions. Fibonacci numbers are a sequence of integers defined by the …
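
As a rough sketch of the kind of module this exercise is after (the file and function names below are only illustrative, not the exercise's actual code), a C-level Fibonacci function in Cython could look like this:

    # fib.pyx -- illustrative sketch only; the exercise's file and
    # function names may differ.

    cdef int fibonacci_c(int n):
        # Plain C function: callable from Cython code, but not from Python.
        if n < 2:
            return n
        return fibonacci_c(n - 1) + fibonacci_c(n - 2)

    def fibonacci(int n):
        # Python-visible wrapper around the C-level function.
        return fibonacci_c(n)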

Hands-on: Static typing in a simple extension

In this exercise you can practice using static type declarations in Cython modules. The code for this exercise is located under cython/static-typing. Continue with the simple Cython module for subtracting …
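
For illustration, a statically typed version of a small subtraction function might look like the sketch below (the names are assumptions, not the exercise's actual code):

    # subtract.pyx -- illustrative sketch only.

    def subtract(int x, int y):
        # Statically typed arguments and a typed local variable avoid
        # Python object overhead inside the function.
        cdef int result
        result = x - y
        return result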

Bonus hands-on: Parallel heat equation solver

If you would like to further practice your parallel programming skills, the final parallel programming hands-on exercise is the parallelization of the heat equation solver with MPI. Source code …
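
One typical ingredient of such a parallelization is exchanging boundary (ghost) rows between neighbouring MPI tasks. A minimal sketch with mpi4py, using illustrative array sizes and variable names rather than the exercise's actual code, could be:

    # halo_exchange.py -- illustrative sketch of exchanging ghost rows
    # between neighbouring tasks in a 1D domain decomposition.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Local part of the temperature field, with one ghost row above and below.
    field = np.zeros((10 + 2, 20), dtype=float)
    up = rank - 1 if rank > 0 else MPI.PROC_NULL
    down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # Send the first real row up, receive the lower ghost row from below.
    comm.Sendrecv(field[1, :], dest=up, recvbuf=field[-1, :], source=down)
    # Send the last real row down, receive the upper ghost row from above.
    comm.Sendrecv(field[-2, :], dest=down, recvbuf=field[0, :], source=up)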

Week 4 summary

We hope that you have enjoyed the fourth, and final, week of Python in High Performance Computing! This week, we have looked into parallel computing using MPI for Python. With …

Hands-on: Collective operations

In this exercise we test different routines for collective communication. Source code for this exercise is located in mpi/collectives/. First, write a program where rank 0 sends an array containing …
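
The exact task is cut off above, but as an example of a collective routine, a broadcast of a NumPy array from rank 0 with mpi4py might look like the following sketch:

    # bcast_example.py -- sketch of a collective operation; the array
    # contents in the actual exercise may differ.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Rank 0 fills the buffer; all other ranks receive it in the broadcast.
    data = np.arange(8, dtype=int) if rank == 0 else np.zeros(8, dtype=int)
    comm.Bcast(data, root=0)
    print("Rank", rank, "has", data)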

Communicators

In the MPI context, a communicator is a special object representing a group of processes that participate in communication. When an MPI routine is called, the communication will involve some or …
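
In mpi4py, the default communicator containing all started processes is MPI.COMM_WORLD, and its basic properties can be queried as follows:

    # communicator_basics.py -- querying the default communicator.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD      # communicator containing all started processes
    size = comm.Get_size()     # total number of processes in the communicator
    rank = comm.Get_rank()     # identifier of this process, 0 ... size-1
    print("I am rank", rank, "of", size)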

Hands-on: Non-blocking communication

In this exercise we explore non-blocking communication. Source code for this exercise is located in mpi/non-blocking/. Go back to the Message chain exercise and implement it using non-blocking communication.
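
A sketch of a non-blocking version of the chain (illustrative only; the actual exercise code may differ) could look like:

    # chain_nonblocking.py -- sketch of the message chain with
    # non-blocking routines.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    data = np.full(10, rank, dtype=int)      # message to pass on
    buffer = np.zeros(10, dtype=int)         # space for the incoming message
    dest = rank + 1 if rank < size - 1 else MPI.PROC_NULL
    source = rank - 1 if rank > 0 else MPI.PROC_NULL

    # Start both operations, then wait for them to complete.
    req_send = comm.Isend(data, dest=dest)
    req_recv = comm.Irecv(buffer, source=source)
    MPI.Request.Waitall([req_send, req_recv])
    print("Rank", rank, "received", buffer)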

What is non-blocking communication?

When communication routines are blocking, the program is stuck waiting for as long as the communication is taking place. Blocking routines return only once it is safe to access …
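
With mpi4py, a non-blocking transfer of a Python object could be sketched like this (the message contents are just an example, and two processes are assumed):

    # nonblocking_basics.py -- sketch of a non-blocking send/receive of
    # a Python object.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        req = comm.isend({"message": "hello"}, dest=1)
        # ... other work could be done here while the send is in flight ...
        req.wait()                 # block only when completion is required
    elif rank == 1:
        req = comm.irecv(source=0)
        data = req.wait()          # wait() returns the received object
        print("Rank 1 received", data)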

Hands-on: Message chain

In this exercise we explore a typical communication pattern, a one-dimensional acyclic chain. Source code for this exercise is located in mpi/message-chain/. Write a simple program where every MPI task sends …
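
One possible approach, shown here only as a sketch with illustrative message contents, is to use the combined Sendrecv routine:

    # chain.py -- sketch of a one-dimensional acyclic message chain.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    data = np.full(10, rank, dtype=int)                      # message to pass on
    buffer = np.zeros(10, dtype=int)                         # incoming message
    dest = rank + 1 if rank < size - 1 else MPI.PROC_NULL    # last task sends nowhere
    source = rank - 1 if rank > 0 else MPI.PROC_NULL         # first task receives nothing

    # Combined send/receive avoids the deadlock that two blocking calls
    # issued in the wrong order could cause.
    comm.Sendrecv(data, dest=dest, recvbuf=buffer, source=source)
    print("Rank", rank, "received", buffer)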

Hands-on: Message exchange

In this exercise we study sending and receiving data between two MPI processes. Source code for this exercise is located in mpi/message-exchange/. Communicating general Python objects: write a simple program …
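
A sketch of such an exchange of general Python objects (illustrative contents, assuming exactly two processes) could be:

    # exchange.py -- sketch of exchanging Python objects between two tasks.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        comm.send({"rank": 0, "data": list(range(5))}, dest=1)
        incoming = comm.recv(source=1)
    elif rank == 1:
        # Receive first, then send, matching the order used by rank 0.
        incoming = comm.recv(source=0)
        comm.send({"rank": 1, "data": list(range(5, 10))}, dest=0)

    if rank < 2:
        print("Rank", rank, "received", incoming)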

Fast communication of large arrays

MPI for Python offers very convenient and flexible routines for sending and receiving general Python objects. Unfortunately, this flexibility comes at a cost in performance. In practice, what happens under …
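
For NumPy arrays, mpi4py also provides the uppercase Send/Recv routines, which communicate the array buffer directly instead of pickling a Python object. A sketch, assuming two processes and an illustrative array size:

    # array_send.py -- sketch of buffer-based communication of a large array.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        data = np.arange(1_000_000, dtype=float)
        comm.Send(data, dest=1)       # uppercase: sends the raw buffer, no pickling
    elif rank == 1:
        data = np.empty(1_000_000, dtype=float)
        comm.Recv(data, source=0)     # receives directly into the preallocated array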

What is point-to-point communication in MPI?

Since MPI processes are independent, in order to coordinate work, they need to communicate by explicitly sending and receiving messages. There are two types of communication in MPI: point-to-point communication …
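
For example, a minimal point-to-point exchange with mpi4py, assuming at least two processes, looks like this:

    # point_to_point.py -- minimal sketch of point-to-point communication.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        comm.send("hello from rank 0", dest=1, tag=0)   # one sender ...
    elif rank == 1:
        message = comm.recv(source=0, tag=0)            # ... one receiver
        print("Rank 1 received:", message)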