Lecture 14: Basic Communication Operations
• Recursive Doubling
The source process sends the message to another process.
In the next communication phase, both processes can propagate the message simultaneously, so the number of processes holding the message doubles in every phase.
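Below is a minimal C/MPI sketch of this idea (my illustration, not code from the lecture), assuming p is a power of two and process 0 is the source: in step i, every process that already holds the message forwards it to the partner whose rank differs in bit i, so the set of informed processes doubles each step.

#include <mpi.h>

/* One-to-all broadcast by recursive doubling.
 * Assumptions (mine, for illustration): p is a power of two and
 * process 0 is the source that initially holds buf. */
void recursive_doubling_bcast(char *buf, int m, MPI_Comm comm)
{
    int rank, p;
    MPI_Comm_rank(comm, &rank);
    MPI_Comm_size(comm, &p);

    for (int step = 1; step < p; step <<= 1) {
        int partner = rank ^ step;        /* rank that differs in this bit */
        if (rank < step) {                /* already holds the message     */
            MPI_Send(buf, m, MPI_CHAR, partner, 0, comm);
        } else if (rank < 2 * step) {     /* receives it in this phase     */
            MPI_Recv(buf, m, MPI_CHAR, partner, 0, comm, MPI_STATUS_IGNORE);
        }
        /* ranks >= 2*step stay idle until a later phase */
    }
}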
One-to-All Broadcast and All-to-One Reduction
Linear Array or Ring (One-Dimensional Structure)
• Recursive Doubling Broadcast
Select the destinations in each phase so that no congestion occurs on the links.
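One congestion-free choice of destinations on a ring is sketched below in plain C (my illustration, assuming p is a power of two and node 0 is the source): the source first sends half-way around the ring, and the sending distance is halved in every later phase, so no two messages traverse the same link in the same phase.

#include <stdio.h>

int main(void)
{
    int p = 8;                       /* ring size (assumed power of two) */
    for (int dist = p / 2; dist >= 1; dist /= 2) {
        printf("phase with distance %d:\n", dist);
        /* every node whose label is a multiple of 2*dist already has
         * the message and forwards it 'dist' positions along the ring */
        for (int src = 0; src < p; src += 2 * dist)
            printf("  node %d -> node %d\n", src, (src + dist) % p);
    }
    return 0;
}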
One-to-All Broadcast and All-to-One Reduction
Linear Array or Ring (One-Dimensional Structure)
• Recursive Doubling Reduction
Odd-numbered nodes send their messages to even-numbered nodes, and an associative operator combines the partial results at each step.
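A matching C/MPI sketch of the reduction (again my illustration, assuming p is a power of two, summation as the associative operator, and process 0 as the destination): in each phase the "odd" half of the still-active processes sends its partial result to an "even" partner, which folds it in.

#include <mpi.h>

/* All-to-one reduction by recursive halving (the reverse of recursive
 * doubling).  Assumptions (mine): p is a power of two, the operator is
 * integer addition, and process 0 is the destination. */
int recursive_halving_reduce(int my_value, MPI_Comm comm)
{
    int rank, p;
    MPI_Comm_rank(comm, &rank);
    MPI_Comm_size(comm, &p);

    for (int step = 1; step < p; step <<= 1) {
        if (rank & step) {                    /* "odd" at this step: send    */
            MPI_Send(&my_value, 1, MPI_INT, rank - step, 0, comm);
            break;                            /* this process is now done    */
        } else if (rank + step < p) {         /* "even": receive and combine */
            int other;
            MPI_Recv(&other, 1, MPI_INT, rank + step, 0, comm,
                     MPI_STATUS_IGNORE);
            my_value += other;                /* the associative operator    */
        }
    }
    return my_value;                          /* full sum only on process 0  */
}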
Matrix-Vector Multiplication (An Application)
Mesh Broadcast and Reduction
• We can regard each row and each column of a square mesh of p nodes as a linear array of √p nodes
• Communication algorithms on the mesh are therefore simple extensions of their linear array counterparts
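A short C/MPI sketch of this two-phase mesh broadcast (my illustration, using MPI sub-communicators instead of the linear-array routines above, and assuming p is a perfect square with the source at node 0): the message is first broadcast along the source's row and then down every column, i.e. two linear-array broadcasts back to back.

#include <mpi.h>
#include <math.h>

void mesh_bcast(char *buf, int m, MPI_Comm comm)
{
    int rank, p;
    MPI_Comm_rank(comm, &rank);
    MPI_Comm_size(comm, &p);

    int side = (int)lround(sqrt((double)p));   /* assume p is a perfect square */
    int row = rank / side, col = rank % side;

    MPI_Comm row_comm, col_comm;
    MPI_Comm_split(comm, row, col, &row_comm); /* one communicator per row    */
    MPI_Comm_split(comm, col, row, &col_comm); /* one communicator per column */

    if (row == 0)                              /* phase 1: along source's row */
        MPI_Bcast(buf, m, MPI_CHAR, 0, row_comm);
    MPI_Bcast(buf, m, MPI_CHAR, 0, col_comm);  /* phase 2: down every column  */

    MPI_Comm_free(&row_comm);
    MPI_Comm_free(&col_comm);
}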
All-to-All Reduction
• Dual of all-to-all broadcast
• Each node is the destination of one all-to-one reduction, out of p such reductions in total
• Applications in several matrix operations, including matrix multiplication and matrix-vector multiplication
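In MPI terms (my mapping, assuming summation as the operator), this corresponds to MPI_Reduce_scatter_block: every process contributes p blocks of m words, and block i, reduced across all processes, ends up only at process i, i.e. p simultaneous all-to-one reductions, one per destination.

#include <mpi.h>

/* All-to-all reduction expressed with a single collective call.
 * Assumption (mine): integer addition as the associative operator.
 * contrib holds p*m ints; block i is this process's contribution to
 * the reduction whose destination is process i.  result receives the
 * m-word block destined for this process. */
void all_to_all_reduction(const int *contrib, int *result, int m, MPI_Comm comm)
{
    MPI_Reduce_scatter_block((void *)contrib, result, m, MPI_INT,
                             MPI_SUM, comm);
}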
All-to-All Broadcast and All-to-All Reduction (Cont.)
Summary
All-to-One Reduction - The m-word data from all processes are combined through an associative operator and accumulated at a single destination process into one buffer of size m (a one-call MPI sketch follows the checklist below)
✔ Linear Array or Ring
✔ Matrix-Vector Multiplication (An Application)
✔ Mesh Broadcast and Reduction
✔ Balanced Binary Tree
✔ Hypercube Broadcast (3-D Structure)
✔ Cost Estimation
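The all-to-one reduction summarized above corresponds to a single MPI_Reduce call; the sketch below is my illustration, with summation as the operator and process 0 as the destination.

#include <mpi.h>

/* All-to-one reduction of an m-word message from every process into a
 * single buffer at the destination (rank 0).  Assumption (mine):
 * integer addition as the operator; recvbuf is significant only at
 * rank 0. */
void all_to_one_sum(const int *sendbuf, int *recvbuf, int m, MPI_Comm comm)
{
    MPI_Reduce((void *)sendbuf, recvbuf, m, MPI_INT, MPI_SUM, 0, comm);
}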
Summary (Cont.)
• All-to-All Broadcast and All-to-All Reduction
All-to-All Broadcast - Every process broadcasts its own m-word message (a one-call MPI sketch follows this list)
• Exercise
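In MPI terms (my mapping), the all-to-all broadcast just summarized, where every process contributes an m-word message and ends up with the messages of all p processes, corresponds to MPI_Allgather.

#include <mpi.h>

/* All-to-all broadcast: every process sends its own m-word message and
 * gathers the messages of all p processes.  Assumption (mine): integer
 * data; 'all' must have room for p*m ints. */
void all_to_all_bcast(const int *mine, int *all, int m, MPI_Comm comm)
{
    MPI_Allgather((void *)mine, m, MPI_INT, all, m, MPI_INT, comm);
}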
Additional Resources
• Introduction to Parallel Computing by Ananth Grama and Anshul Gupta