Python Parallel Programming Cookbook - Second Edition

By: Giancarlo Zaccone
Overview of this book

Nowadays, it has become extremely important for programmers to understand the link between their software and the parallel nature of their hardware so that their programs run efficiently on modern computer architectures. Applications based on parallel programming are fast, robust, and easily scalable.

This updated edition features cutting-edge techniques for building effective concurrent applications in Python 3.7. The book introduces parallel programming architectures and covers the fundamental recipes for thread-based and process-based parallelism. You'll learn about mutexes, semaphores, locks, and queues, exploiting the threading and multiprocessing modules, all of which are basic tools for building parallel applications. Recipes on MPI programming will help you synchronize processes using the fundamental message-passing techniques with mpi4py. Furthermore, you'll get to grips with asynchronous programming and how to harness the power of the GPU with the PyCUDA and PyOpenCL frameworks. Finally, you'll explore how to design distributed computing systems with Celery and architect Python apps on the cloud using PythonAnywhere, Docker, and serverless applications.

By the end of this book, you will be confident in building concurrent and high-performing applications in Python.

Collective communication using a broadcast

During the development of parallel code, we often find ourselves in situations where we must share the value of a certain variable at runtime between multiple processes, or perform certain operations on variables that each process provides (presumably with different values).

To resolve these types of situations, communication trees are used (for example, process 0 sends data to processes 1 and 2, which, in turn, take care of sending it to processes 3, 4, 5, 6, and so on).

Instead, MPI libraries provide functions that are ideal for the exchange of information among multiple processes and that are optimized for the machine on which they run:

Broadcasting data from process 0 to processes 1, 2, 3, and 4
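As an illustration of the figure above, here is a minimal sketch of a one-to-all broadcast using mpi4py. The variable names and the mpiexec launch command are illustrative assumptions, not taken from this recipe:

```python
# Minimal broadcast sketch (assumption: run with e.g. `mpiexec -n 5 python bcast_demo.py`).
from mpi4py import MPI

comm = MPI.COMM_WORLD        # default communicator containing all processes
rank = comm.Get_rank()       # rank of the current process

if rank == 0:
    # Only the root process defines the data to be shared.
    variable_to_share = 100  # illustrative value, not from the book's recipe
else:
    variable_to_share = None

# bcast sends the root's value to every process in the communicator;
# after the call, all ranks hold the same value.
variable_to_share = comm.bcast(variable_to_share, root=0)
print(f"process {rank} received {variable_to_share}")
```

Running this with five processes would print the same received value on every rank, which is exactly the pattern the figure depicts.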

A communication method that involves all the processes that belong to a communicator is called...