PA Analysis
Amdahl's and Gustafson's laws

Jan Zapletal (VŠB-TUO)
November 23, 2009
Contents

1. Introduction
2. Amdahl's law
3. Gustafson's law
4. Equivalence of laws
5. References
Performance analysis

How does parallelization improve the performance of our program?
Metrics

Execution time
- The time elapsed from when the first processor starts the execution to when the last processor completes it.
- On a parallel system it consists of computation time, communication time, and idle time.

Speedup
- Defined as

  S = \frac{T_1}{T_p},

  where T_1 is the execution time for the sequential system and T_p for the parallel system.
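As a minimal sketch of this metric (the timings below are invented for illustration), speedup is just the ratio of two measured wall-clock times:

```python
def speedup(t_seq, t_par):
    """Speedup S = T1 / Tp of a parallel run relative to the sequential run."""
    return t_seq / t_par

# Hypothetical measurements: 120 s sequentially, 20 s on 8 processors
print(speedup(120.0, 20.0))  # 6.0, below the ideal speedup of 8
```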
Amdahl's law

Gene Myron Amdahl (born November 16, 1922)
- worked for IBM,
- best known for formulating Amdahl's law, uncovering the limits of parallel computing.

Let T_1 denote the computation time on a sequential system. We can split the total time as follows:

  T_1 = t_s + t_p,

where
- t_s - computation time needed for the sequential part,
- t_p - computation time needed for the parallel part.

Clearly, if we parallelize the problem, only t_p can be reduced. Assuming ideal parallelization we get

  T_p = t_s + \frac{t_p}{N},

where
- N - number of processors.
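A minimal sketch of this time model (the 10 s / 90 s split below is a made-up example, not data from the talk):

```python
def parallel_time(t_s, t_p, n):
    """Amdahl's time model: the serial part t_s is untouched,
    the parallelizable part t_p is divided ideally among n processors."""
    return t_s + t_p / n

# Hypothetical workload: 10 s serial, 90 s parallelizable
for n in (1, 2, 4, 8, 16):
    print(n, parallel_time(10.0, 90.0, n))
```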
Amdahl's law

Thus we get a speedup of

  S = \frac{T_1}{T_p} = \frac{t_s + t_p}{t_s + \frac{t_p}{N}}.

- Notice that Amdahl assumes the problem size does not change with the number of CPUs.
- The goal is to solve a fixed-size problem as quickly as possible.
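Continuing the sketch above (same invented 10 s / 90 s split), Amdahl's formula makes the limit on speedup explicit:

```python
def amdahl_speedup(t_s, t_p, n):
    """Speedup predicted by Amdahl's law for a fixed-size problem."""
    return (t_s + t_p) / (t_s + t_p / n)

# Hypothetical split: 10 s serial, 90 s parallelizable
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(10.0, 90.0, n), 2))
# The output approaches (t_s + t_p) / t_s = 10: no number of processors
# can push the speedup past the reciprocal of the serial share.
```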
Gustafson's law

John L. Gustafson (born January 19, 1955)
- American computer scientist and businessman,
- found that practical problems show much better speedup than Amdahl predicted.

Gustafson's law
- The computation time is constant (instead of the problem size),
- increasing the number of CPUs ⇒ solve a bigger problem and get better results in the same time.

Let T_p denote the computation time on a parallel system. We can split the total time as follows:

  T_p = t_s^* + t_p^*,

where
- t_s^* - computation time needed for the sequential part,
- t_p^* - computation time needed for the parallel part.
Gustafson's law

On a sequential system we would get

  T_1 = t_s^* + N \cdot t_p^*,

and hence a scaled speedup of

  S = \frac{T_1}{T_p} = \frac{t_s^* + N \cdot t_p^*}{t_s^* + t_p^*}.
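A companion sketch for the scaled-speedup view, under the assumption that each processor keeps its 90 s of parallel work while the serial 10 s stays fixed (again, invented numbers):

```python
def gustafson_speedup(t_s_star, t_p_star, n):
    """Scaled speedup: the same run would need t_s_star + n * t_p_star sequentially."""
    return (t_s_star + n * t_p_star) / (t_s_star + t_p_star)

# Hypothetical parallel run: 10 s serial, 90 s of parallel work per processor
for n in (1, 10, 100, 1000):
    print(n, round(gustafson_speedup(10.0, 90.0, n), 1))
# Unlike the fixed-size case, the speedup keeps growing roughly linearly with n.
```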
What the hell?!

- The bigger the problem, the smaller the serial fraction f = t_s / (t_s + t_p) - the serial part usually remains the same,
- and f ≠ f^*.

Amdahl's law says:

  S = \frac{t_s + t_p}{t_s + \frac{t_p}{N}}.

Let now f^* denote the sequential portion of time spent in the parallel computation, i.e.

  f^* = \frac{t_s}{t_s + \frac{t_p}{N}} \quad \text{and} \quad (1 - f^*) = \frac{t_p / N}{t_s + \frac{t_p}{N}}.

Hence

  t_s = f^* \cdot \left( t_s + \frac{t_p}{N} \right) \quad \text{and} \quad t_p = N \cdot (1 - f^*) \cdot \left( t_s + \frac{t_p}{N} \right).
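A quick numerical check of these identities, using an invented split of t_s = 10, t_p = 90 and N = 8:

```python
t_s, t_p, n = 10.0, 90.0, 8      # hypothetical serial/parallel split and CPU count

t_par = t_s + t_p / n            # parallel execution time t_s + t_p / N
f_star = t_s / t_par             # sequential fraction of the parallel run

# The identities above recover t_s and t_p from f_star:
assert abs(t_s - f_star * t_par) < 1e-12
assert abs(t_p - n * (1 - f_star) * t_par) < 1e-12
print(round(f_star, 3))          # 0.471: almost half of the parallel run is serial
```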
I see!

- After substituting t_s and t_p into Amdahl's formula one gets

  S = \frac{t_s + t_p}{t_s + \frac{t_p}{N}} = f^* + N \cdot (1 - f^*),

  which is exactly Gustafson's law.
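The equivalence can also be sanity-checked numerically; the sketch below evaluates both sides of the identity for an arbitrary, made-up choice of f^* and N:

```python
f_star, n = 0.05, 64                      # arbitrary sequential fraction and CPU count

t_par = 1.0                               # normalize the parallel execution time
t_s = f_star * t_par                      # serial part of the parallel run
t_p = n * (1 - f_star) * t_par            # total parallel work, as derived above

amdahl = (t_s + t_p) / (t_s + t_p / n)    # left-hand side: Amdahl's formula
gustafson = f_star + n * (1 - f_star)     # right-hand side: Gustafson's formula

print(amdahl, gustafson)                  # both ≈ 60.85
assert abs(amdahl - gustafson) < 1e-9
```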
References

QUINN, Michael Jay. Parallel Programming in C with MPI and OpenMP. New York: McGraw-Hill, 2004. 507 p.

Amdahl's law [online]. Available at: <http://en.wikipedia.org/wiki/Amdahl's_law>.

Gustafson's law [online]. Available at: <http://en.wikipedia.org/wiki/Gustafson's_law>.