
CPNM Lecture 14 - Solutions to Linear Simultaneous Equations

Mridul Sankar Barik

Jadavpur University

2023

Linear Systems I

A linear equation in the variables x1, x2, ..., xn is an equation of the form
\[
a_1 x_1 + a_2 x_2 + \dots + a_n x_n = b
\]
where a1, a2, ..., an and b are constant real numbers. The constant ai is called the coefficient of xi, and b is called the constant term of the equation.
A system of linear equations (or linear system) is a finite collection of linear equations in the same set of variables. For instance, a linear system of n equations in n variables x1, x2, ..., xn can be written as
\[
\begin{cases}
a_{11} x_1 + a_{12} x_2 + \dots + a_{1n} x_n = b_1\\
a_{21} x_1 + a_{22} x_2 + \dots + a_{2n} x_n = b_2\\
\qquad \vdots\\
a_{n1} x_1 + a_{n2} x_2 + \dots + a_{nn} x_n = b_n
\end{cases}
\tag{1}
\]

Linear Systems II
The linear system (1) can be written in matrix form as
\[
AX = B \tag{2}
\]
where
\[
A = \begin{bmatrix}
a_{11} & a_{12} & a_{13} & \dots & a_{1n}\\
a_{21} & a_{22} & a_{23} & \dots & a_{2n}\\
\vdots & \vdots & \vdots & \ddots & \vdots\\
a_{n1} & a_{n2} & a_{n3} & \dots & a_{nn}
\end{bmatrix},
\quad
X = \begin{bmatrix} x_1\\ x_2\\ \vdots\\ x_n \end{bmatrix},
\quad
B = \begin{bmatrix} b_1\\ b_2\\ \vdots\\ b_n \end{bmatrix}
\]

A solution of a linear system is a tuple (s1, s2, ..., sn) of numbers that makes each equation a true statement when the values s1, s2, ..., sn are substituted for x1, x2, ..., xn respectively.
The set of all solutions of a linear system is called the solution set of the system.
Solution of Linear Systems - Direct Methods I

Yield the exact solution in a finite number of arithmetic operations, in the absence of round-off errors
In practice, we have a finite number of significant digits, so direct methods cannot lead to exact solutions
Round-off errors may sometimes lead to poor or even useless solutions
Examples: Naive Gauss Elimination, Gauss-Jordan Elimination

Naive Gaussian Elimination I

Reduces the system of equations to an equivalent upper triangular system, which is then solved by back substitution
The augmented matrix of the general linear system (1) is
\[
\begin{bmatrix}
a_{11} & a_{12} & \dots & a_{1n} & b_1\\
a_{21} & a_{22} & \dots & a_{2n} & b_2\\
\vdots &        &       &        & \vdots\\
a_{n1} & a_{n2} & \dots & a_{nn} & b_n
\end{bmatrix}
\tag{3}
\]
The coefficient matrix of (1) is
\[
\begin{bmatrix}
a_{11} & a_{12} & \dots & a_{1n}\\
a_{21} & a_{22} & \dots & a_{2n}\\
\vdots &        &       & \vdots\\
a_{n1} & a_{n2} & \dots & a_{nn}
\end{bmatrix}
\tag{4}
\]
Naive Gaussian Elimination II

Forward Elimination of Unknowns
Reduce the set of equations to an upper triangular system
Eliminate the first unknown, x1, from the second through the nth equations:
  Multiply the first row by a21/a11 and subtract it from the second row
  Multiply the first row by a31/a11 and subtract it from the third row
  ...
We get the following:
\[
\begin{bmatrix}
a_{11} & a_{12}  & \dots & a_{1n}  & b_1\\
       & a'_{22} & \dots & a'_{2n} & b'_2\\
       & \vdots  &       & \vdots  & \vdots\\
       & a'_{n2} & \dots & a'_{nn} & b'_n
\end{bmatrix}
\]
a11 is called the pivot element
Repeat the above to eliminate the second unknown, x2, from the third row onwards

Naive Gaussian Elimination III

After n − 1 iterations we arrive at the upper triangular system
\[
\begin{bmatrix}
a_{11} & a_{12}       & \dots  & a_{1n}         & b_1\\
       & a^{(1)}_{22} & \dots  & a^{(1)}_{2n}   & b^{(1)}_2\\
       &              & \ddots & \vdots         & \vdots\\
       &              &        & a^{(n-1)}_{nn} & b^{(n-1)}_n
\end{bmatrix}
\tag{5}
\]
Back Substitution
The last row can be solved as
\[
x_n = \frac{b^{(n-1)}_n}{a^{(n-1)}_{nn}}
\]
The result can be back-substituted into the (n − 1)th row to solve for
\[
x_{n-1} = \frac{b^{(n-2)}_{n-1} - a^{(n-2)}_{n-1,n}\, x_n}{a^{(n-2)}_{n-1,n-1}}
\]
...
\[
x_1 = \frac{b_1 - \sum_{j=2}^{n} a_{1j}\, x_j}{a_{11}}
\]

Naive Gaussian Elimination IV

General formula for obtaining the x's:
\[
x_i = \frac{b^{(i-1)}_i - \sum_{j=i+1}^{n} a^{(i-1)}_{ij}\, x_j}{a^{(i-1)}_{ii}}
\quad \text{for } i = n-1,\, n-2,\, \dots,\, 1
\tag{6}
\]

Naive Gaussian Elimination V

Drawbacks
Division by Zero
  During both the elimination and the back-substitution phases, division by zero may occur
  Pivoting techniques partially avoid this problem
Round-Off Errors
  Occur due to the limited number of significant digits
Ill-Conditioned Systems
  Small changes in the coefficients result in large changes in the solution
  Implication ⇒ a wide range of answers can approximately satisfy the equations
Singular Systems
  The determinant of a singular system is zero
  After the elimination stage the algorithm must check whether a zero diagonal element has been created; if so, abort (a minimal check is sketched below)
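As a rough illustration of the singular-system check mentioned in the last bullet, a small C helper that scans the diagonal after elimination; the function name and the tolerance EPS are assumptions of this sketch.

```c
#include <math.h>
#include <stdbool.h>

#define EPS 1e-12   /* assumed tolerance for treating a pivot as zero */

/* Returns true if any diagonal element of the n x (n+1) augmented
   matrix is numerically zero after forward elimination, i.e. the
   system is singular (or nearly so) and the solver should abort. */
bool has_zero_pivot(int n, double a[n][n + 1]) {
    for (int i = 0; i < n; i++)
        if (fabs(a[i][i]) < EPS)
            return true;
    return false;
}
```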

Example of Gaussian Elimination I
Use Gaussian Elimination to solve

2x + y + z = 10
3x + 2y + 3z = 18
x + 4y + 9z = 16

The corresponding augmented matrix is
\[
\begin{bmatrix}
2 & 1 & 1 & 10\\
3 & 2 & 3 & 18\\
1 & 4 & 9 & 16
\end{bmatrix}
\]
Eliminating the first variable x from equations 2 and 3 by performing the transformations [R2 ← R2 − (3/2)R1] and [R3 ← R3 − (1/2)R1]:
\[
\begin{bmatrix}
2 & 1 & 1 & 10\\
0 & \tfrac{1}{2} & \tfrac{3}{2} & 3\\
0 & \tfrac{7}{2} & \tfrac{17}{2} & 11
\end{bmatrix}
\]
Example of Gaussian Elimination II

Eliminating the second variable y from equation 3 by performing the transformation [R3 ← R3 − ((7/2)/(1/2))R2], i.e. [R3 ← R3 − 7R2], we get the upper triangular form
\[
\begin{bmatrix}
2 & 1 & 1 & 10\\
0 & \tfrac{1}{2} & \tfrac{3}{2} & 3\\
0 & 0 & -2 & -10
\end{bmatrix}
\]
By backward substitution we get z = 5, y = −9 and x = 7

Gaussian Elimination Algorithm I

Algorithm 1: Triangularization of n Equations in n Unknowns (Forward Elimination)
1: for k = 1 to n − 1 in steps of 1 do
2: for j = k + 1 to n in steps of 1 do
3: u = a[j][k]/a[k][k]
4: for i = k to n + 1 in steps of 1 do
5: a[j][i] = a[j][i] − u ∗ a[k][i]
6: end for
7: end for
8: end for
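A minimal C rendering of Algorithm 1, using 0-based indexing on an n × (n+1) augmented matrix; the function name and the variable-length-array parameter are assumptions of this sketch, and, being naive elimination, it performs no pivoting and no zero-pivot check.

```c
/* Forward elimination (naive, no pivoting): reduce the augmented
   matrix a (n rows, n+1 columns, last column = b) to upper
   triangular form in place. */
void forward_elimination(int n, double a[n][n + 1]) {
    for (int k = 0; k < n - 1; k++) {        /* current pivot row */
        for (int j = k + 1; j < n; j++) {    /* rows below the pivot */
            double u = a[j][k] / a[k][k];    /* elimination multiplier */
            for (int i = k; i <= n; i++)     /* include the b column */
                a[j][i] -= u * a[k][i];
        }
    }
}
```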

Gaussian Elimination Algorithm II

Algorithm 2: Backward Substitution
1: x[n] = a[n][n + 1]/a[n][n]
2: for i = n − 1 to 1 in steps of -1 do
3: sum = 0
4: for j = i + 1 to n in steps of 1 do
5: sum = sum + a[i][j] ∗ x[j]
6: end for
7: x[i] = (a[i][n + 1] − sum)/a[i][i]
8: end for
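A matching C sketch of Algorithm 2, under the same 0-based, augmented-matrix conventions as above:

```c
/* Back substitution: a is the upper triangular augmented matrix
   produced by forward elimination; the solution is written to x. */
void back_substitution(int n, double a[n][n + 1], double x[n]) {
    x[n - 1] = a[n - 1][n] / a[n - 1][n - 1];     /* last equation */
    for (int i = n - 2; i >= 0; i--) {
        double sum = 0.0;
        for (int j = i + 1; j < n; j++)
            sum += a[i][j] * x[j];
        x[i] = (a[i][n] - sum) / a[i][i];
    }
}
```

Calling forward_elimination and then back_substitution on the augmented matrix of the worked example above should give x[0] = 7, x[1] = −9, x[2] = 5, up to rounding.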

Gauss-Jordan Elimination I

A variant of Gauss elimination
When an unknown is eliminated, it is eliminated from all other equations, rather than just the subsequent ones
All rows are normalized by dividing them by their pivot elements
The elimination step results in an identity matrix rather than a triangular matrix ⇒ back substitution is not necessary
All pitfalls and improvements of Gauss elimination also apply to the Gauss-Jordan method
Row Echelon Form: A matrix A is said to be in row echelon form if the following conditions hold:
  1. All rows containing nonzero entries sit above any rows whose entries are all zero
  2. The first nonzero entry of any row, called the leading entry of that row, is positioned to the right of the leading entry of the row above it

Gauss-Jordan Elimination II

Reduced Row Echelon Form: A matrix A is said to be in reduced row echelon form if it is in row echelon form and additionally satisfies the following two properties:
  1. In any nonzero row, the leading entry is equal to 1
  2. The leading entries are the only nonzero entries in their columns
An augmented matrix in reduced row echelon form directly exhibits the solution of the corresponding linear system

Gauss-Jordan Elimination III
Algorithm 3: Gauss-Jordan Method
1: for i = 1 to n in steps of 1 do
2: j =i
3: while a[i][i] == 0 & j ≤ n do
4: Interchange i and (j + 1)th row of matrix a
5: j =j +1
6: end while
7: f = a[i][i]
8: for k = i to n + 1 in steps of 1 do
9: a[i][k] = a[i][k]/f
10: end for
11: for k = 1 to n in steps of 1 do
12: if k ̸= i then
13: f = a[k][i]/a[i][i]
14: for p = i to n + 1 in steps of 1 do
15: a[k][p] = a[k][p] − f ∗ a[i][p]
16: end for
17: end if
18: end for
19: end for
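A compact C sketch along the lines of Algorithm 3; the row-interchange handling is simplified (it searches downward for a nonzero entry in the pivot column) and the return-code convention is an assumption of this sketch, not part of the pseudocode.

```c
/* Gauss-Jordan reduction of the n x (n+1) augmented matrix a.
   On success the left n x n block becomes the identity and the last
   column holds the solution.  Returns 0 on success, -1 if no nonzero
   pivot can be found in some column. */
int gauss_jordan(int n, double a[n][n + 1]) {
    for (int i = 0; i < n; i++) {
        /* If the pivot is zero, swap with a lower row that has a
           nonzero entry in this column. */
        int r = i;
        while (r < n && a[r][i] == 0.0)
            r++;
        if (r == n)
            return -1;                       /* singular or needs pivoting */
        if (r != i)
            for (int c = 0; c <= n; c++) {
                double t = a[i][c]; a[i][c] = a[r][c]; a[r][c] = t;
            }

        double f = a[i][i];                  /* normalize the pivot row */
        for (int c = i; c <= n; c++)
            a[i][c] /= f;

        for (int k = 0; k < n; k++) {        /* clear column i elsewhere */
            if (k == i) continue;
            double m = a[k][i];
            for (int c = i; c <= n; c++)
                a[k][c] -= m * a[i][c];
        }
    }
    return 0;
}
```

On the worked example that follows, the last column ends up as (3, 4, −2).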

Example of Gauss-Jordan Elimination I
Use Gauss-Jordan Elimination to solve

x +y +z =5
2x + 3y + 5z = 8
4x + 5z = 2

The corresponding augmented matrix is
\[
\begin{bmatrix}
1 & 1 & 1 & 5\\
2 & 3 & 5 & 8\\
4 & 0 & 5 & 2
\end{bmatrix}
\]
Dividing R1 by its pivot element a11 = 1, i.e. [R1 ← R1/1]:
\[
\begin{bmatrix}
1 & 1 & 1 & 5\\
2 & 3 & 5 & 8\\
4 & 0 & 5 & 2
\end{bmatrix}
\]
Example of Gauss-Jordan Elimination II

Eliminating the first variable x from equations 2 and 3 by performing the transformations [R2 ← R2 − 2R1] and [R3 ← R3 − 4R1]:
\[
\begin{bmatrix}
1 & 1 & 1 & 5\\
0 & 1 & 3 & -2\\
0 & -4 & 1 & -18
\end{bmatrix}
\]
Dividing R2 by its pivot element a22 = 1, i.e. [R2 ← R2/1]:
\[
\begin{bmatrix}
1 & 1 & 1 & 5\\
0 & 1 & 3 & -2\\
0 & -4 & 1 & -18
\end{bmatrix}
\]
Eliminating the second variable y from equations 1 and 3 by performing the transformations [R1 ← R1 − R2] and [R3 ← R3 − (−4)R2]:

Example of Gauss-Jordan Elimination III

 
\[
\begin{bmatrix}
1 & 0 & -2 & 7\\
0 & 1 & 3 & -2\\
0 & 0 & 13 & -26
\end{bmatrix}
\]
Dividing R3 by its pivot element a33 = 13, i.e. [R3 ← R3/13]:
\[
\begin{bmatrix}
1 & 0 & -2 & 7\\
0 & 1 & 3 & -2\\
0 & 0 & 1 & -2
\end{bmatrix}
\]
Eliminating the third variable z from equations 1 and 2 by performing the transformations [R1 ← R1 − (−2)R3] and [R2 ← R2 − 3R3]:
\[
\begin{bmatrix}
1 & 0 & 0 & 3\\
0 & 1 & 0 & 4\\
0 & 0 & 1 & -2
\end{bmatrix}
\]

Example of Gauss-Jordan Elimination IV

Now, we directly get the solution as x = 3, y = 4 and z = −2

Matrix Inversion Using Gauss-Jordan Elimination I

Let A be an invertible n × n matrix
Suppose that a sequence of elementary row operations reduces A to the identity matrix
Then the same sequence of elementary row operations, when applied to the identity matrix, yields A⁻¹
Apply the Gauss-Jordan method to the matrix [A | In]
Suppose the row reduced echelon form of [A | In] is [B | C]
If B = In, then A⁻¹ = C; otherwise A is not invertible
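A hedged C sketch of the [A | In] construction described above, done as a Gauss-Jordan reduction of an n × 2n working matrix; the function name and the behaviour on a zero pivot (it simply reports failure rather than doing full pivoting) are assumptions of this sketch.

```c
/* Invert an n x n matrix A by Gauss-Jordan reduction of [A | I].
   On success ainv holds the inverse.  Returns 0 on success, -1 if a
   usable (nonzero) pivot cannot be found, in which case A is treated
   as non-invertible by this simple sketch. */
int invert(int n, double A[n][n], double ainv[n][n]) {
    double w[n][2 * n];                       /* working matrix [A | I] */

    for (int i = 0; i < n; i++)
        for (int j = 0; j < 2 * n; j++)
            w[i][j] = (j < n) ? A[i][j] : (j - n == i ? 1.0 : 0.0);

    for (int i = 0; i < n; i++) {
        int r = i;                            /* find a nonzero pivot */
        while (r < n && w[r][i] == 0.0)
            r++;
        if (r == n)
            return -1;
        if (r != i)
            for (int c = 0; c < 2 * n; c++) {
                double t = w[i][c]; w[i][c] = w[r][c]; w[r][c] = t;
            }

        double f = w[i][i];                   /* normalize the pivot row */
        for (int c = 0; c < 2 * n; c++)
            w[i][c] /= f;

        for (int k = 0; k < n; k++) {         /* clear column i elsewhere */
            if (k == i) continue;
            double m = w[k][i];
            for (int c = 0; c < 2 * n; c++)
                w[k][c] -= m * w[i][c];
        }
    }

    for (int i = 0; i < n; i++)               /* right half is now A^-1 */
        for (int j = 0; j < n; j++)
            ainv[i][j] = w[i][n + j];
    return 0;
}
```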

Solution of Linear Systems - Iterative Method I

Iterative methods start with an approximation to the true solution and, if convergent, derive a sequence of successively closer approximations until the required accuracy is obtained
The amount of computation depends on the accuracy required
Let the system of linear equations be given by
\[
\begin{cases}
a_{11} x_1 + a_{12} x_2 + a_{13} x_3 + \dots + a_{1n} x_n = b_1\\
a_{21} x_1 + a_{22} x_2 + a_{23} x_3 + \dots + a_{2n} x_n = b_2\\
\qquad \vdots\\
a_{n1} x_1 + a_{n2} x_2 + a_{n3} x_3 + \dots + a_{nn} x_n = b_n
\end{cases}
\tag{7}
\]

We assume the diagonal elements aii to be nonzero
If not, the equations should be rearranged so that this holds

Solution of Linear Systems - Iterative Method II
We can rewrite the equations as
\[
\begin{cases}
x_1 = \dfrac{b_1}{a_{11}} - \dfrac{a_{12}}{a_{11}} x_2 - \dfrac{a_{13}}{a_{11}} x_3 - \dots - \dfrac{a_{1n}}{a_{11}} x_n\\[2mm]
x_2 = \dfrac{b_2}{a_{22}} - \dfrac{a_{21}}{a_{22}} x_1 - \dfrac{a_{23}}{a_{22}} x_3 - \dots - \dfrac{a_{2n}}{a_{22}} x_n\\[1mm]
\qquad \vdots\\[1mm]
x_n = \dfrac{b_n}{a_{nn}} - \dfrac{a_{n1}}{a_{nn}} x_1 - \dfrac{a_{n2}}{a_{nn}} x_2 - \dots - \dfrac{a_{n,n-1}}{a_{nn}} x_{n-1}
\end{cases}
\tag{8}
\]
Suppose the vector X^(1) = [x1^(1), x2^(1), x3^(1), ..., xn^(1)] is a first approximation to the unknowns x1, x2, x3, ..., xn
Then the second approximation is obtained as
\[
\begin{cases}
x_1^{(2)} = \dfrac{b_1}{a_{11}} - \dfrac{a_{12}}{a_{11}} x_2^{(1)} - \dfrac{a_{13}}{a_{11}} x_3^{(1)} - \dots - \dfrac{a_{1n}}{a_{11}} x_n^{(1)}\\[2mm]
x_2^{(2)} = \dfrac{b_2}{a_{22}} - \dfrac{a_{21}}{a_{22}} x_1^{(1)} - \dfrac{a_{23}}{a_{22}} x_3^{(1)} - \dots - \dfrac{a_{2n}}{a_{22}} x_n^{(1)}\\[1mm]
\qquad \vdots\\[1mm]
x_n^{(2)} = \dfrac{b_n}{a_{nn}} - \dfrac{a_{n1}}{a_{nn}} x_1^{(1)} - \dfrac{a_{n2}}{a_{nn}} x_2^{(1)} - \dots - \dfrac{a_{n,n-1}}{a_{nn}} x_{n-1}^{(1)}
\end{cases}
\tag{9}
\]
Solution of Linear Systems - Iterative Method III

If we write equation 9 in the matrix form X = BX + C, then the iteration formula may be written as X^(r+1) = B X^(r) + C
In actual computation, the solution vector X^(r+1) is obtained element-wise

Jacobi’s Method I

The iterative formula for the computation of the solution by Jacobi's method is
\[
x_i^{(r+1)} = \left( -\sum_{j=1,\, j \ne i}^{n} a_{ij}\, x_j^{(r)} + b_i \right) \Big/ a_{ii}
\quad \text{for } i = 1, 2, 3, \dots, n
\tag{10}
\]
provided aii ≠ 0
Also known as the method of simultaneous displacements

Jacobi’s Method II

Algorithm 4: Jacobi’s Method

input a → augmented matrix of order n × (n + 1), e → allowed relative error in the result,
maxit → the maximum number of iterations
output x → solution vector
1: for i = 1 to n in steps of 1 do
2: x[i] = 0
3: end for
4: for iter = 1 to maxit in steps of 1 do
5: big = 0
6: for i = 1 to n in steps of 1 do
7: sum = 0
8: for j = 1 to n in steps of 1 do
9: if j ̸= i then
10: sum = sum + a[i][j] ∗ x[j]
11: end if
12: end for
13: temp = (a[i][n + 1] − sum)/a[i][i]

Jacobi’s Method III
14: relerror =| (x[i] − temp)/temp |
15: if relerror > big then
16: big = relerror
17: end if
18: x ′ [i] = temp
19: end for
20: for i = 1 to n in steps of 1 do
21: x[i] = x ′ [i]
22: end for
23: if big ≤ e then
24: Write ”Converges to a solution”
25: Stop
26: end if
27: end for
28: Write ”Does not converge in maxit number of iterations”
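A C sketch following Algorithm 4, under the same augmented-matrix convention as the earlier sketches; the signature and return convention are assumptions, and, like the pseudocode, the relative-error test assumes each newly computed component is nonzero.

```c
#include <math.h>

/* Jacobi iteration on the n x (n+1) augmented matrix a.
   eps   : allowed relative error
   maxit : maximum number of iterations
   Returns the iteration count on convergence, or -1 otherwise. */
int jacobi(int n, double a[n][n + 1], double x[n], double eps, int maxit) {
    double xnew[n];

    for (int i = 0; i < n; i++)
        x[i] = 0.0;                           /* initial guess */

    for (int iter = 1; iter <= maxit; iter++) {
        double big = 0.0;                     /* largest relative change */
        for (int i = 0; i < n; i++) {
            double sum = 0.0;
            for (int j = 0; j < n; j++)
                if (j != i)
                    sum += a[i][j] * x[j];    /* only old values used */
            xnew[i] = (a[i][n] - sum) / a[i][i];
            double relerr = fabs((x[i] - xnew[i]) / xnew[i]);
            if (relerr > big)
                big = relerr;
        }
        for (int i = 0; i < n; i++)
            x[i] = xnew[i];                   /* simultaneous update */
        if (big <= eps)
            return iter;
    }
    return -1;                                /* did not converge */
}
```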

Jacobi’s Method IV

The Jacobi iterative method works fine with well-conditioned linear systems
If the linear system is ill-conditioned, the Jacobi method will most probably fail to converge
The Jacobi method can generally be used for solving linear systems in which the coefficient matrix is diagonally dominant:
for each row, the absolute value of the diagonal term is greater than the sum of the absolute values of the other terms
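A small C check for the strict diagonal dominance condition just stated; the function name and the augmented-matrix argument are assumptions of this sketch.

```c
#include <math.h>
#include <stdbool.h>

/* Returns true if, for every row of the n x n coefficient block,
   |a[i][i]| is strictly greater than the sum of the absolute values
   of the off-diagonal terms in that row. */
bool is_diagonally_dominant(int n, double a[n][n + 1]) {
    for (int i = 0; i < n; i++) {
        double off = 0.0;
        for (int j = 0; j < n; j++)
            if (j != i)
                off += fabs(a[i][j]);
        if (fabs(a[i][i]) <= off)
            return false;
    }
    return true;
}
```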

Gauss-Seidel Method I

Improves Jacobi's method (faster convergence) by a simple modification
Uses an improved component as soon as it is available
Also known as the method of successive displacements
The iterative formula for the computation of the solution by the Gauss-Seidel method is
\[
x_i^{(r+1)} = \left( -\sum_{j=1}^{i-1} a_{ij}\, x_j^{(r+1)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(r)} + b_i \right) \Big/ a_{ii}
\quad \text{for } i = 1, 2, 3, \dots, n
\tag{11}
\]
provided aii ≠ 0

Gauss-Seidel Method II

Algorithm 5: Gauss-Seidel Method

input a → augmented matrix of order n × (n + 1), e → allowed relative error in the result,
maxit → the maximum number of iterations
output x → solution vector
1: for i = 1 to n in steps of 1 do
2: x[i] = 0
3: end for
4: for iter = 1 to maxit in steps of 1 do
5: big = 0
6: for i = 1 to n in steps of 1 do
7: sum = 0
8: for j = 1 to n in steps of 1 do
9: if j ̸= i then
10: sum = sum + a[i][j] ∗ x[j]
11: end if
12: end for
13: temp = (a[i][n + 1] − sum)/a[i][i]

Gauss-Seidel Method III

14: relerror =| (x[i] − temp)/temp |


15: if relerror > big then
16: big = relerror
17: end if
18: x[i] = temp
19: end for
20: if big ≤ e then
21: Write ”Converges to a solution”
22: Stop
23: end if
24: end for
25: Write ”Does not converge in maxit number of iterations”
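The matching Gauss-Seidel sketch differs from the Jacobi one only in updating x[i] in place, so each newly computed component is used immediately; the same hedged conventions and caveats apply.

```c
#include <math.h>

/* Gauss-Seidel iteration on the n x (n+1) augmented matrix a.
   Returns the iteration count on convergence, or -1 otherwise. */
int gauss_seidel(int n, double a[n][n + 1], double x[n], double eps, int maxit) {
    for (int i = 0; i < n; i++)
        x[i] = 0.0;                           /* initial guess */

    for (int iter = 1; iter <= maxit; iter++) {
        double big = 0.0;                     /* largest relative change */
        for (int i = 0; i < n; i++) {
            double sum = 0.0;
            for (int j = 0; j < n; j++)
                if (j != i)
                    sum += a[i][j] * x[j];    /* mixes new and old values */
            double temp = (a[i][n] - sum) / a[i][i];
            double relerr = fabs((x[i] - temp) / temp);
            if (relerr > big)
                big = relerr;
            x[i] = temp;                      /* use immediately */
        }
        if (big <= eps)
            return iter;
    }
    return -1;
}
```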

Example of Iterative Method I
Use Jacobi’s / Gauss-Seidel Method to solve

10x1 − 2x2 − x3 − x4 = 3
−2x1 + 10x2 − x3 − x4 = 15
−x1 − x2 + 10x3 − 2x4 = 27
−x1 − x2 − 2x3 + 10x4 = −9

We rewrite the equations as

x1 = 0.3 + 0.2x2 + 0.1x3 + 0.1x4


x2 = 1.5 + 0.2x1 + 0.1x3 + 0.1x4
x3 = 2.7 + 0.1x1 + 0.1x2 + 0.2x4
x4 = −0.9 + 0.1x1 + 0.1x2 + 0.2x3

Initial solution vector x = [0, 0, 0, 0]
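For illustration, a small driver showing how the Gauss-Seidel sketch given earlier might be run on this system; the 1e-6 tolerance and the iteration limit are assumptions of this sketch.

```c
#include <stdio.h>

/* Prototype of the Gauss-Seidel sketch given earlier. */
int gauss_seidel(int n, double a[n][n + 1], double x[n], double eps, int maxit);

int main(void) {
    double a[4][5] = {
        {10, -2, -1, -1,  3},
        {-2, 10, -1, -1, 15},
        {-1, -1, 10, -2, 27},
        {-1, -1, -2, 10, -9},
    };
    double x[4];

    int it = gauss_seidel(4, a, x, 1e-6, 100);
    /* The iterates approach (1, 2, 3, 0), in line with Table 2; note
       that the relative-error test taken from the pseudocode behaves
       poorly when a component tends to zero, as x[3] does here. */
    printf("%d iterations: %f %f %f %f\n", it, x[0], x[1], x[2], x[3]);
    return 0;
}
```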


Example of Iterative Method II
n x1 x2 x3 x4
1 0.300000 1.500000 2.700000 -0.900000
2 0.780000 1.740000 2.700000 -0.180000
3 0.900000 1.908000 2.916000 -0.108000
4 0.962400 1.960800 2.959200 -0.036000
5 0.984480 1.984800 2.985120 -0.015840
6 0.993888 1.993824 2.993760 -0.006048
7 0.997536 1.997549 2.997562 -0.002477
8 0.999018 1.999016 2.999013 -0.000979
9 0.999607 1.999607 2.999608 -0.000394
10 0.999843 1.999843 2.999843 -0.000157
11 0.999937 1.999937 2.999937 -0.000063
12 0.999975 1.999975 2.999975 -0.000025
13 0.999990 1.999990 2.999990 -0.000010
14 0.999996 1.999996 2.999996 -0.000004
15 0.999998 1.999998 2.999998 -0.000002
16 0.999999 1.999999 2.999999 -0.000001
17 1.000000 2.000000 3.000000 -0.000000

Table 1: Jacobi’s Method

Example of Iterative Method III

n x1 x2 x3 x4
1 0.300000 1.560000 2.886000 -0.136800
2 0.886920 1.952304 2.956562 -0.024765
3 0.983641 1.989908 2.992402 -0.004165
4 0.996805 1.998185 2.998666 -0.000768
5 0.999427 1.999675 2.999757 -0.000138
6 0.999897 1.999941 2.999956 -0.000025
7 0.999981 1.999989 2.999992 -0.000005
8 0.999997 1.999998 2.999999 -0.000001
9 0.999999 2.000000 3.000000 -0.000000
10 1.000000 2.000000 3.000000 -0.000000

Table 2: Gauss-Seidel Method

Clearly, the Gauss-Seidel method converges faster than Jacobi's method

