
Chapter Four

Interpolation and Curve Fitting


4.1 Introduction

4.2 Finite Differences

Example:-

• Backward difference operator (∇)

Example:-

• Divided Differences

Table of Divided Differences

Example:- Construct the divided difference table for the following data
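
As an illustration of how such a table can be built (the data of the original example is not reproduced here, so the vectors x and y below are sample values of my own), a short MATLAB sketch is:

% Divided difference table (sketch; x and y are illustrative sample data).
x = [0 1 2 4];                 % distinct nodes
y = [1 1 2 5];                 % function values at the nodes
n = numel(x);
D = zeros(n, n);               % D(i,k) holds the (k-1)-th order difference f[x_i,...,x_(i+k-1)]
D(:,1) = y(:);                 % zeroth-order differences are just the y values
for k = 2:n
    for i = 1:n-k+1
        D(i,k) = (D(i+1,k-1) - D(i,k-1)) / (x(i+k-1) - x(i));
    end
end
disp(D)                        % first row: f[x_1], f[x_1,x_2], f[x_1,x_2,x_3], ...

The first row of D contains exactly the coefficients needed later in Newton's divided difference interpolation formula (Section 4.4.2).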

4.3 Interpolation for Equally Spaced Data

4.3.1. Newton’s Forward Difference Interpolation

Example

Exercises (From Lecture)

1. Consider the following set of points: (10,46), (20,66), (30,81), (40,93) and (50,101). If there is a function f passing through these points,
   i) Construct the difference table.
   ii) Using (i) above, determine f(15). (Ans: 56.8672, correct to 4 decimal places)
2. Values of x (in degrees) and sin x are given in the following table.

   Determine the value of sin 38° (Hint: Construct the difference table and use Newton's backward difference interpolation formula). Ans: 0.6156614

Exercises (Computer Lab)

Write a Matlab program for Newton's forward difference formula and Newton's backward
difference formula and solve the above problems using it.
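
A minimal MATLAB sketch of the forward-difference program is given below (the function name newton_forward and its interface are my own choices, not prescribed by the lecture). The backward-difference program is analogous: build the table from the last entries and use p = (xq - x(end))/h.

function yq = newton_forward(x, y, xq)
% Newton's forward difference interpolation for equally spaced nodes.
% x : equally spaced nodes, y : values f(x), xq : point to interpolate at.
n = numel(x);
h = x(2) - x(1);                       % common spacing
D = zeros(n, n);
D(:,1) = y(:);
for k = 2:n                            % forward difference table
    D(1:n-k+1, k) = diff(D(1:n-k+2, k-1));
end
p = (xq - x(1)) / h;                   % normalized distance from the first node
yq = D(1,1);
term = 1;
for k = 1:n-1
    term = term * (p - k + 1) / k;     % p(p-1)...(p-k+1)/k!
    yq = yq + term * D(1, k+1);
end
end

For instance, newton_forward([10 20 30 40 50], [46 66 81 93 101], 15) returns 56.8672 (to four decimal places), the answer quoted in Exercise 1 above.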

4.4 Interpolation with Unequally Spaced data
4.4.1 Lagrange interpolation

Example:-

4.4.2 Newton’s Divided difference interpolation


To apply this method, recall from Section 4.2 how to form the divided difference table.

Derivation of Newton’s divided difference formula

Example

Exercise

1. Determine a quadratic polynomial through the points
   (x_0, y_0) = (-2, 4), (x_1, y_1) = (0, 2) and (x_2, y_2) = (2, 8) by using Lagrange interpolation.
   (Ans: f(x) = x^2 + x + 2)
2. Find the interpolating polynomial by Newton's divided difference formula for the
   following table and calculate f(2.1).

      x      0   1   2   4
      f(x)   1   1   2   5

   (Ans: f(x) = -(1/12)x^3 + (3/4)x^2 - (2/3)x + 1, f(2.1) ≈ 2.136)
3. Write a program for both Lagrange and Newton's divided difference formulas using
   Matlab and solve the above problems.
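
Possible MATLAB sketches for Exercise 3 are given below; the function names lagrange_interp and newton_divided are my own, and each function would be saved in its own .m file.

function yq = lagrange_interp(x, y, xq)
% Evaluate the Lagrange interpolating polynomial through (x, y) at xq.
n = numel(x);
yq = 0;
for i = 1:n
    Li = 1;                            % i-th Lagrange basis polynomial evaluated at xq
    for j = [1:i-1, i+1:n]
        Li = Li * (xq - x(j)) / (x(i) - x(j));
    end
    yq = yq + y(i) * Li;
end
end

function yq = newton_divided(x, y, xq)
% Newton's divided difference interpolation through (x, y), evaluated at xq.
n = numel(x);
D = zeros(n, n);
D(:,1) = y(:);
for k = 2:n                            % divided difference table (as in Section 4.2)
    for i = 1:n-k+1
        D(i,k) = (D(i+1,k-1) - D(i,k-1)) / (x(i+k-1) - x(i));
    end
end
yq = D(1,n);                           % nested (Horner-like) evaluation of the Newton form
for k = n-1:-1:1
    yq = yq * (xq - x(k)) + D(1,k);
end
end

For example, lagrange_interp([-2 0 2], [4 2 8], 1) gives 4, in agreement with f(x) = x^2 + x + 2 above, and newton_divided([0 1 2 4], [1 1 2 5], 2.1) gives approximately 2.136.
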
4.5 Curve fitting by the method of Least squares
4.5.1 Fitting a straight line
Suppose we are given a set of data points. Several equations of different types can be obtained
to express the given data approximately, but the problem is to find the equation of the curve of
'best fit', the one most suitable for predicting unknown values. This process of finding such an
equation of 'best fit' is called curve fitting.
The most common approach to 'best fit' approximation is to minimize the sum of the squares of
the differences between the data values and the values of the approximating function. This is the
method of least squares. We first investigate linear least squares approximation, also known as
linear regression, and then quadratic least squares functions (the approach extends to
higher-order polynomials as well).
Suppose we are given a set of points (x_i, y_i). Our aim is to find a function
y = f(x; c_0, c_1, c_2, ..., c_n) for which the errors e_i = y_i - f(x_i) are as small as
possible. We approximate the data by a function of the form
f(x) = c_0 + c_1 x + c_2 x^2 + c_3 x^3 + ... + c_n x^n. Let (x_i, y_i) denote the i-th data
point and let N be the number of points to which we wish to fit the data. The error at the
i-th point is the difference between the actual value y_i and the estimated value f(x_i). Thus

    e_i = y_i - f(x_i) = y_i - (c_0 + c_1 x_i + c_2 x_i^2 + ... + c_n x_i^n)
and the sum of the squares of the errors is therefore

    S = Σ e_i^2 = Σ (y_i - c_0 - c_1 x_i - c_2 x_i^2 - ... - c_n x_i^n)^2,

where the sums run over i = 1, ..., N and we require N ≥ n + 1.

From the calculus of functions of several variables, S will be a minimum when its partial
derivatives with respect to the c's are all equal to zero, that is

    ∂S/∂c_0 = Σ -2 (y_i - c_0 - c_1 x_i - c_2 x_i^2 - ...) (1)     = 0
    ∂S/∂c_1 = Σ -2 (y_i - c_0 - c_1 x_i - c_2 x_i^2 - ...) (x_i)   = 0
    ∂S/∂c_2 = Σ -2 (y_i - c_0 - c_1 x_i - c_2 x_i^2 - ...) (x_i^2) = 0
        ...
    ∂S/∂c_n = Σ -2 (y_i - c_0 - c_1 x_i - c_2 x_i^2 - ...) (x_i^n) = 0

Dividing each equation by -2 and rearranging, these reduce to

    c_0 N        + c_1 Σ x_i       + ... + c_n Σ x_i^n      = Σ y_i
    c_0 Σ x_i    + c_1 Σ x_i^2     + ... + c_n Σ x_i^(n+1)  = Σ x_i y_i
    c_0 Σ x_i^2  + c_1 Σ x_i^3     + ... + c_n Σ x_i^(n+2)  = Σ x_i^2 y_i
        ...
    c_0 Σ x_i^n  + c_1 Σ x_i^(n+1) + ... + c_n Σ x_i^(2n)   = Σ x_i^n y_i        (*)
The system (*) consists of n + 1 linear equations in the n + 1 unknowns c_0, c_1, ..., c_n;
it is called the system of normal equations.
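
As a hedged sketch of how (*) can be solved numerically (the function name lsq_poly is my own), the coefficient matrix and right-hand side can be assembled directly and the system solved with MATLAB's backslash operator:

function c = lsq_poly(x, y, n)
% Least squares polynomial fit of degree n via the normal equations (*).
% Returns c = [c_0; c_1; ...; c_n] for f(x) = c_0 + c_1*x + ... + c_n*x^n.
x = x(:);  y = y(:);
A = zeros(n+1, n+1);
b = zeros(n+1, 1);
for r = 0:n                            % r-th normal equation
    for s = 0:n
        A(r+1, s+1) = sum(x.^(r+s));   % coefficient multiplying c_s
    end
    b(r+1) = sum(x.^r .* y);           % right-hand side
end
c = A \ b;                             % solve the (n+1)-by-(n+1) linear system
end
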
Example:- Consider the data (1, 2.1), (2, 2.9), (5, 6.1) and (7, 8.3). Determine the straight
line which best fits these data points by using the least squares method.
Solution.
Here we need to find a function f(x) = c_0 + c_1 x which best fits the above data points, i.e.,
we need to determine c_0 and c_1. So let us construct the following table (here N = 4, meaning
we have 4 data points).

    x_i     y_i     x_i^2     x_i*y_i
    1       2.1     1         2.1
    2       2.9     4         5.8
    5       6.1     25        30.5
    7       8.3     49        58.1
    Σx_i = 15    Σy_i = 19.4    Σx_i^2 = 79    Σx_i*y_i = 96.5

Now using the normal equations (*) with n = 1:

    c_0 N     + c_1 Σ x_i   = Σ y_i       =>   4 c_0 + 15 c_1 = 19.4
    c_0 Σ x_i + c_1 Σ x_i^2 = Σ x_i y_i   =>   15 c_0 + 79 c_1 = 96.5

Solving these simultaneous equations gives c_0 = 0.9352 and c_1 = 1.0440, so the required
line is f(x) = c_0 + c_1 x = 1.0440 x + 0.9352.
To fit a polynomial of degree n we would, in the same way, determine c_0, c_1, c_2, ..., c_n
from the normal equations. Using Matlab, we can plot the fitted line together with the data
points, as in the figure and the sketch below.

[Figure: Best fit linear function y = 1.0440x + 0.9352 with the corresponding data points]
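
The example above can be reproduced and plotted with a short MATLAB script (a sketch; the variable names are mine):

% Straight-line least squares fit for the data of the example above.
x = [1 2 5 7];  y = [2.1 2.9 6.1 8.3];
N = numel(x);
A = [N      sum(x);                    % normal equations for f(x) = c_0 + c_1*x
     sum(x) sum(x.^2)];
b = [sum(y); sum(x.*y)];
c = A \ b;                             % c(1) is about 0.9352, c(2) about 1.0440
xx = linspace(min(x), max(x), 100);
plot(x, y, 'o', xx, c(1) + c(2)*xx, '-')
xlabel('x'); ylabel('y');
title(sprintf('y = %.4fx + %.4f', c(2), c(1)))

The same coefficients are obtained from lsq_poly(x, y, 1) in the sketch given after the normal equations.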
