

Week 4 summary

We hope that you have enjoyed the fourth and final week of Python in High Performance Computing!

This week, we have looked into parallel computing using MPI for Python. With MPI communication, one can distribute work across multiple CPU cores and handle the necessary data synchronisation while executing a parallel algorithm.

By now, you should know how to send and receive MPI messages, how to use collective communication, and how to create your own custom communicators. You should also be familiar with the key concepts of parallel computing and understand the execution and data model of MPI.

To learn more about MPI or parallel programming in general, consider the other training courses offered by PRACE: http://www.training.prace-ri.eu/

© CC-BY-NC-SA 4.0 by CSC - IT Center for Science Ltd.
This article is from the free online course Python in High Performance Computing.
