
Unit-II Part-I

Message Passing Paradigm


Introduction to Message Passing: Definition and Importance
• Message passing facilitates communication between processes in parallel systems.
• Enables explicit control of data exchange and synchronization.
Introduction to Message Passing: Historical Background
• Developed in the early days of parallel computing.
• Adopted widely due to its simplicity and hardware independence.
Principles of Message Passing Programming: Key Attributes
• Partitioned address space: Each process operates on its own data.
• Explicit parallelization: The programmer defines the parallel tasks.
Principles of Message Passing Programming: Advantages and Challenges
• Advantage: Encourages locality of data access, which improves performance.
• Challenge: Dynamic and unstructured interactions are complex to express.
Structure of Message Passing Programs: Programming Paradigms
• Asynchronous: Tasks execute independently, allowing full flexibility.
• Loosely synchronous: Tasks synchronize periodically for interactions.
Structure of Message Passing Programs: Single Program Multiple Data (SPMD) Model
• All processes execute identical code, with minor differences in behavior driven by process identity.
• Commonly used for scalable parallel programming, as in the sketch below.
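
A minimal SPMD sketch in C (using the MPI calls introduced later in this unit): every process launches the same binary, and the rank returned by MPI_Comm_rank drives the minor differences in behavior.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank, size;
    MPI_Init(&argc, &argv);                  /* start the MPI environment */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);    /* this process's identity */
    MPI_Comm_size(MPI_COMM_WORLD, &size);    /* total number of processes */
    if (rank == 0)
        printf("Rank 0 of %d: coordinator\n", size);  /* minor behavioral difference */
    else
        printf("Rank %d of %d: worker\n", rank, size);
    MPI_Finalize();                          /* clean up the environment */
    return 0;
}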
Send and Receive Operations: Basic Operations
• Send: Transmit data from one process to another.
• Receive: Accept data transmitted by another process.
Send and Receive Operations: Example Code Snippet
• P0: send(&a, 1, P1);
• P1: receive(&a, 1, P0);
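
The snippet above is generic pseudocode. One plausible translation into MPI, assuming a is a single int and tag 0 is an arbitrary choice:

int a;
/* On P0: send one int starting at &a to rank 1 */
MPI_Send(&a, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
/* On P1: receive one int from rank 0 into &a */
MPI_Recv(&a, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);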
Blocking and Non-Blocking Operations: Blocking Communication
• Operations block until the data transfer is complete.
• Ensures the communication buffers are safe to reuse once a call returns.
Blocking and Non-Blocking Operations: Non-Blocking Communication
• Allows overlap of communication with computation.
• Requires explicit completion management to ensure correctness, as in the sketch below.
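
A sketch of the overlap idea using MPI's non-blocking primitives: MPI_Isend/MPI_Irecv return immediately with a request handle, independent computation can proceed, and MPI_Wait later guarantees completion before the buffers are reused (the program assumes at least two processes).

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank, a = 42, b = 0;
    MPI_Request req;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0)
        MPI_Isend(&a, 1, MPI_INT, 1, 0, MPI_COMM_WORLD, &req);  /* returns immediately */
    else if (rank == 1)
        MPI_Irecv(&b, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &req);
    /* ... independent computation can run here, overlapped with the transfer ... */
    if (rank <= 1) {
        MPI_Wait(&req, MPI_STATUS_IGNORE);   /* do not reuse a or read b before this */
        if (rank == 1)
            printf("Rank 1 received %d\n", b);
    }
    MPI_Finalize();
    return 0;
}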
MPI: Message Passing Interface: Overview
• A portable, standardized message passing system.
• Widely supported by academia and industry.
MPI: Message Passing Interface: Core Functions
• MPI_Init and MPI_Finalize: Initialize and clean up the environment.
• MPI_Send and MPI_Recv: Basic communication primitives (combined in the example below).
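
Putting the four core functions together: a minimal complete program in which rank 0 sends one integer to rank 1, assuming it is launched with exactly two processes (e.g. mpirun -np 2 after compiling with mpicc).

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank, value;
    MPI_Init(&argc, &argv);                  /* must precede any other MPI call */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) {
        value = 99;
        /* arguments: buffer, count, datatype, destination rank, tag, communicator */
        MPI_Send(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* arguments: buffer, count, datatype, source rank, tag, communicator, status */
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("Rank 1 received %d\n", value);
    }
    MPI_Finalize();                          /* no MPI calls allowed after this */
    return 0;
}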
Collective Communication Operations: Common Operations
• MPI_Barrier: Synchronize all processes.
• MPI_Bcast: Broadcast data from one process to all others.
• MPI_Reduce: Aggregate data using operations like sum or max.
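
A sketch combining two of these operations: rank 0 broadcasts a parameter to all processes, each rank computes a partial value, and MPI_Reduce sums the partials back at rank 0 (sum is just one example reduction; the partial computation is a placeholder).

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    int rank, n = 0, partial, total = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0)
        n = 10;                                    /* parameter known only at the root */
    MPI_Bcast(&n, 1, MPI_INT, 0, MPI_COMM_WORLD);  /* now every rank has n */
    partial = rank * n;                            /* placeholder for a real partial result */
    MPI_Reduce(&partial, &total, 1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("Sum of partials: %d\n", total);
    MPI_Finalize();
    return 0;
}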
Collective Communication Operations: Example Applications
• Data sharing in parallel programs.
• Synchronization points in simulations, as sketched below.
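
As one illustration of a synchronization point, a fragment (inside the usual MPI_Init/MPI_Finalize skeleton shown earlier) where MPI_Barrier forces every process to finish a timestep before any process begins the next; compute_step is a hypothetical per-process work function.

for (int step = 0; step < 100; step++) {
    compute_step(step, rank);        /* hypothetical per-process work for this timestep */
    MPI_Barrier(MPI_COMM_WORLD);     /* no rank proceeds until all ranks arrive */
}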
