
4 Curve Fitting Least Square Regression and Interpolation

Curve fitting is the process of constructing a mathematical function that best fits a series of data points. There are two main types of curve fitting: interpolation, which fits the data exactly, and smoothing, which approximates the data to minimize noise. Least squares regression finds the best-fit linear or nonlinear function by minimizing the sum of the squared residuals between the observed and predicted values. It is commonly used to model relationships in data and design control systems.


4. Curve Fitting
Compiled by: Habtamu G.
Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints.

There are many cases where curve fitting can prove useful:
• quantifying the general trend of measured data, e.g. for data visualization or for estimating values where no data are available
• removing noise from a function
• extracting meaningful parameters from a learning curve
• summarizing the relationships among two or more variables
Curve fitting can involve either interpolation, where an exact fit to the data is required, or smoothing, in which a best-fit function is constructed that approximately fits the data.

1. Best Fit
The measured data points are assumed to contain noise or disturbances, so we do not want a curve that passes through every point. Our goal is to learn a function that minimizes some predefined error on the given data points.
Example: linear regression

2. Exact Fit
The samples are assumed to be noise-free, and we want to learn a curve that passes through each point.
Example: polynomial interpolation
Curve Fitting / Least-Squares Regression
- Linear Regression
- Polynomial Regression
- Multiple Linear Regression

Curve Fitting / Interpolation
- Newton's Divided-Difference Interpolating Polynomials
- Lagrange Interpolating Polynomial
Curve Fitting / Least-Squares Regression
I. Least-Squares Regression
A statistical procedure to find the best fit for a set of data points by minimizing the sum of the squares of the offsets, or residuals (the differences between the observed values and the fitted values).

When we square each of those errors and add them all up, the total is as small as possible; that is why it is called "least squares".
Least-Squares Regression
Example: polynomial curves fitted to points generated with a sine function. The black dotted line is the "true" data, the red line is a first-degree polynomial, the green line is second degree, the orange line is third degree, and the blue line is fourth degree.

Least-squares regression in engineering:
1. Process control: to model the relationship between process parameters and product quality.
2. Control systems: to model the behavior of systems and design controllers. E.g., in an aircraft, to model the relationship between pilot input and aircraft response, and to design a controller that regulates the aircraft's performance.
3. Structural analysis: to model the behavior of structures and to predict their response to external loads.
4. Data science: artificial … (finance, …)
a. Linear Regression
The goal is to derive a straight line that represents the general trend of the data:

y = a0 + a1x + e

where y is the true (measured) value, a0 + a1x is the approximate (model) value, e is the error or residual, a0 is the intercept, and a1 is the slope.

The sum of the squares of the residuals (Sr) is

Sr = Σ ei² = Σ (yi,measured − yi,model)² = Σ (yi − a0 − a1xi)²,  i = 1, …, n

where n is the total number of points.

(Figure: the sample data plotted with the fitted line y = 0.071429 + 0.83929x.)
To determine the values of a0 and a1, set the partial derivatives of Sr to zero:

∂Sr/∂a0 = −2 Σ (yi − a0 − a1xi) = 0    →    0 = Σ yi − Σ a0 − a1 Σ xi
∂Sr/∂a1 = −2 Σ (yi − a0 − a1xi) xi = 0  →    0 = Σ xiyi − a0 Σ xi − a1 Σ xi²

Note: Σ_{i=1..n} a0 = n·a0

These give the normal equations

n·a0 + a1 Σ xi = Σ yi
a0 Σ xi + a1 Σ xi² = Σ xiyi

or, in matrix form,

[ n     Σ xi  ] [a0]   [ Σ yi   ]
[ Σ xi  Σ xi² ] [a1] = [ Σ xiyi ]

whose solution gives the fitted line y = a0 + a1x.
Example 1: Fit a straight line to the given data

xi | 1   2   3   4   5   6   7
yi | 0.5 2.5 2.0 4.0 3.5 6.0 5.5

Solution:

xi | yi  | xiyi | xi²
1  | 0.5 | 0.5  | 1
2  | 2.5 | 5.0  | 4
3  | 2.0 | 6.0  | 9
4  | 4.0 | 16.0 | 16
5  | 3.5 | 17.5 | 25
6  | 6.0 | 36.0 | 36
7  | 5.5 | 38.5 | 49
Σxi = 28, Σyi = 24, Σxiyi = 119.5, Σxi² = 140

Solving the normal equations: a1 = 0.8392857, a0 = 0.07142857

y = a0 + a1x = 0.07142857 + 0.8392857x
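The normal equations can be coded directly. Below is a short Python sketch (not part of the original slides) that reproduces the coefficients of Example 1:

```python
# Least-squares straight-line fit y = a0 + a1*x via the normal equations.
def linear_fit(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    # Solve: n*a0 + sx*a1 = sy ;  sx*a0 + sxx*a1 = sxy
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = (sy - a1 * sx) / n
    return a0, a1

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
a0, a1 = linear_fit(x, y)
print(a0, a1)  # about 0.0714286 and 0.8392857, matching the example
```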
Exercise: Dereje recorded how many hours of sunshine vs. how many ice creams were sold at the shop from Monday to Friday. Draw a straight-line fit to the given data (y = mx + b) and compute the error.
Solution methodology:
Linearization of Nonlinear Relations
(1) Exponential Function

y = a1·e^(b1x)
ln(y) = ln(a1·e^(b1x)) = ln(a1) + ln(e^(b1x)) = ln(a1) + b1x

So ln(y) = ln(a1) + b1x: plotting ln(y) against x gives a straight line with slope b1 and intercept ln(a1).

Compare with: Y = a0 + a1X
Linearization of Nonlinear Relations
(2) Power Equation

y = a2·x^(b2)
Log(y) = Log(a2) + b2·Log(x)

Plotting Log(y) against Log(x) gives a straight line with slope b2 and intercept Log(a2).

Compare with: Y = a0 + a1X
Linearization of Nonlinear Relations
(3) Saturation-Growth-Rate Equation

y = a3·x / (b3 + x)
1/y = 1/a3 + (b3/a3)·(1/x)

Plotting 1/y against 1/x gives a straight line with slope b3/a3 and intercept 1/a3.

Compare with: Y = a0 + a1X
More examples …
Example: Fit the power equation y = a2·x^(b2) to the following data.

x | y   | Log(x) | Log(y) | Log(x)·Log(y) | [Log(x)]²
1 | 0.5 | 0      | −0.301 | 0             | 0
2 | 1.7 | 0.301  | 0.226  | 0.068         | 0.091
3 | 3.4 | 0.477  | 0.534  | 0.255         | 0.228
4 | 5.7 | 0.602  | 0.753  | 0.453         | 0.362
5 | 8.4 | 0.699  | 0.922  | 0.644         | 0.489
Σ:        2.079    2.134    1.42            1.17

The normal equations are

[ 5      2.079 ] [Log(a2)]   [ 2.134 ]
[ 2.079  1.17  ] [  b2   ] = [ 1.42  ]

giving Log(a2) = −0.3 and b2 = 1.75, so

Log(y) = Log(a2) + b2·Log(x)  →  Log(y) = 1.75·Log(x) − 0.3
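The log–log fit above can be checked with a small Python sketch (not from the slides); it linearizes the data with log10 and reuses the straight-line normal equations:

```python
import math

# Fit y = a2 * x**b2 by linearizing: log10(y) = log10(a2) + b2*log10(x),
# then running an ordinary straight-line least-squares fit on the logs.
def power_fit(x, y):
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    sx, sy = sum(lx), sum(ly)
    sxy = sum(a * b for a, b in zip(lx, ly))
    sxx = sum(a * a for a in lx)
    b2 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    log_a2 = (sy - b2 * sx) / n
    return 10 ** log_a2, b2

x = [1, 2, 3, 4, 5]
y = [0.5, 1.7, 3.4, 5.7, 8.4]
a2, b2 = power_fit(x, y)
print(a2, b2)  # roughly a2 ≈ 0.5 and b2 ≈ 1.75, i.e. Log(y) ≈ 1.75*Log(x) − 0.3
```

The slide's table uses 3-decimal rounded logs, so its −0.3 and 1.75 agree with this full-precision fit to about three figures.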
b. Polynomial Regression
The goal is to derive a curve that represents the general trend of the data.
Example: a 2nd-order polynomial ("quadratic" or "parabola"):

y = a0 + a1x + a2x² + e

Sum of the squares of the residuals: Sr = Σ (yi − a0 − a1xi − a2xi²)²

To determine the values of a0, a1 and a2, set ∂Sr/∂a0 = ∂Sr/∂a1 = ∂Sr/∂a2 = 0, which yields three normal equations in the three unknowns.
Example: Fit a 2nd-order polynomial to the given data

xi | 0   1   2    3    4    5
yi | 2.1 7.7 13.6 27.2 40.9 61.1

Solution:

xi | yi   | xi² | xi³ | xi⁴ | xiyi  | xi²yi
0  | 2.1  | 0   | 0   | 0   | 0     | 0
1  | 7.7  | 1   | 1   | 1   | 7.7   | 7.7
2  | 13.6 | 4   | 8   | 16  | 27.2  | 54.4
3  | 27.2 | 9   | 27  | 81  | 81.6  | 244.8
4  | 40.9 | 16  | 64  | 256 | 163.6 | 654.4
5  | 61.1 | 25  | 125 | 625 | 305.5 | 1527.5
Σxi = 15, Σyi = 152.6, Σxi² = 55, Σxi³ = 225, Σxi⁴ = 979, Σxiyi = 585.6, Σxi²yi = 2488.8
y = a0 + a1x + a2x²

The normal equations are

[ 6   15   55  ] [a0]   [ 152.6  ]
[ 15  55   225 ] [a1] = [ 585.6  ]
[ 55  225  979 ] [a2]   [ 2488.8 ]

Use Cramer's rule or Gauss elimination:
a0 = 2.47857, a1 = 2.35929, a2 = 1.86071

y = 2.47857 + 2.35929x + 1.86071x²
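As a check, here is a Python sketch (not from the slides) that assembles the 3×3 normal equations and solves them with naive Gaussian elimination, as the slide suggests:

```python
# Fit y = a0 + a1*x + a2*x**2 by building the 3x3 normal equations
# and solving them with Gaussian elimination.
def gauss_solve(A, b):
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for k in range(n):
        # partial pivoting, for numerical safety only
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def quad_fit(x, y):
    n = len(x)
    S = lambda p: sum(v ** p for v in x)           # sums of powers of x
    Sy = lambda p: sum((v ** p) * w for v, w in zip(x, y))  # mixed sums
    A = [[n, S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    return gauss_solve(A, [Sy(0), Sy(1), Sy(2)])

a0, a1, a2 = quad_fit([0, 1, 2, 3, 4, 5], [2.1, 7.7, 13.6, 27.2, 40.9, 61.1])
print(a0, a1, a2)  # about 2.47857, 2.35929, 1.86071
```

Cramer's rule would give the same coefficients; elimination is simply cheaper to code for larger systems.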


c. Multiple Linear Regression
Estimates the relationship between a quantitative dependent variable and two or more independent variables using a linear function:

y = f(x1, x2, x3, x4, …, xn)
Multiple Linear Regression …
Example: with two independent variables, the regression "line" becomes a plane:

y = a0 + a1x1 + a2x2 + e

Sum of the squares of the residuals: Sr = Σ (yi − a0 − a1x1i − a2x2i)²

To determine the values of a0, a1 and a2:

∂Sr/∂a0 = −2 Σ (yi − a0 − a1x1i − a2x2i) = 0      (1)
∂Sr/∂a1 = −2 Σ x1i (yi − a0 − a1x1i − a2x2i) = 0  (2)
∂Sr/∂a2 = −2 Σ x2i (yi − a0 − a1x1i − a2x2i) = 0  (3)

Note: Σ_{i=1..n} a0 = n·a0

These give three normal equations whose solution yields the plane y = a0 + a1x1 + a2x2.
Example: Use multiple linear regression to fit the following data.

Solution:

y = a0 + a1x1 + a2x2

Use Cramer's rule or Gauss elimination: a0 = 5, a1 = 4, a2 = −3

y = 5 + 4x1 − 3x2
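The example's data table is not shown above, so the Python sketch below (not from the slides) uses hypothetical noise-free points generated from the answer plane y = 5 + 4x1 − 3x2 and checks that the normal-equation fit recovers a0 = 5, a1 = 4, a2 = −3:

```python
# Multiple linear regression y = a0 + a1*x1 + a2*x2 via the 3x3 normal
# equations, solved with Gaussian elimination.
def gauss_solve(A, b):
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def mlr_fit(x1, x2, y):
    n = len(y)
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * c for a, c in zip(x1, y))
    s2y = sum(a * c for a, c in zip(x2, y))
    A = [[n, sum(x1), sum(x2)], [sum(x1), s11, s12], [sum(x2), s12, s22]]
    return gauss_solve(A, [sum(y), s1y, s2y])

x1 = [0, 2, 2.5, 1, 4, 7]                      # hypothetical inputs
x2 = [0, 1, 2, 3, 6, 2]
y = [5 + 4 * a - 3 * b for a, b in zip(x1, x2)]  # exact points on the plane
a0, a1, a2 = mlr_fit(x1, x2, y)
print(a0, a1, a2)  # recovers 5, 4, -3
```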
Exercise:
Determine a function of two variables

f(x, t) = a + b·x + c·t

that best fits the data below with the least sum of the squares of the errors.

t | 0   1   2   3
x | 0.1 0.4 0.2 0.2
y | 3   2   1   2
MORE…
Power equation: y = b0·x1^(b1)·x2^(b2)·…·xm^(bm)

Log(y) = Log(b0) + b1·Log(x1) + b2·Log(x2) + … + bm·Log(xm)

Compare with: Y = a0 + a1X1 + a2X2 + …
Curve Fitting / Interpolation
II. Interpolation
Interpolation fits straight lines or curves that pass directly through each of the data points.
Remember that least-squares regression derives a single straight line or curve to represent the general trend of the data; we make no effort to intersect every point.

(Figures: linear interpolation and curvilinear interpolation, f(x) versus x.)

Question: what is the value of y at x = 2.5?

x | 1 2 3  4
y | 6 8 11 15
Polynomial Interpolation
The most common method is polynomial interpolation:

f(x) = a0 + a1x + a2x² + … + anxⁿ    (an nth-order polynomial through n+1 data points)

1st order (linear): connecting 2 points
2nd order (quadratic or parabolic): connecting 3 points
3rd order (cubic): connecting 4 points
Forms of the Interpolating Polynomial
(1) Newton's Divided Difference
Divided differences are calculated using divided differences of a smaller number of terms:
- Linear interpolation (1st order)
- Quadratic interpolation (2nd order)
- General-form polynomials

(2) Lagrange
a. Newton's Divided-Difference Interpolation
i. Linear Interpolation (1st-order polynomial)

f1(x) = b0 + b1(x − x0)    (1)

Linear interpolation connects 2 data points with a straight line. Using similar triangles:

(f1(x) − f(x0)) / (x − x0) = (f(x1) − f(x0)) / (x1 − x0)

f1(x) = f(x0) + (f(x1) − f(x0))/(x1 − x0) · (x − x0)    — the linear interpolating formula

Comparing with (1): b0 = f(x0), b1 = (f(x1) − f(x0))/(x1 − x0)
Example: Estimate ln(2) using linear interpolation on the interval [1, 4].
Solution:

f1(x) = f(x0) + (f(x1) − f(x0))/(x1 − x0) · (x − x0)
f1(x) = ln(1) + (ln(4) − ln(1))/(4 − 1) · (x − 1) = 0.462098(x − 1)

At x = 2: f1(2) = 0.462098
True value: ln(2) = 0.693147

εt = |0.693147 − 0.462098| / 0.693147 × 100% = 33.3%
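This calculation is easy to verify with a short Python sketch (not from the slides):

```python
import math

# Reproduce the ln(2) linear-interpolation estimate on [1, 4] and its
# true percent relative error.
def lin_interp(x, x0, x1, f):
    return f(x0) + (f(x1) - f(x0)) / (x1 - x0) * (x - x0)

approx = lin_interp(2, 1, 4, math.log)
true = math.log(2)
err = abs(true - approx) / true * 100
print(approx, err)  # about 0.462098 and 33.33 %
```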
Example: Estimate f(4.5) using linear interpolation from the given table.

Question: what will x be at f(x) = 22?
Exercise: The upward velocity of a rocket is given as a function of time in the table. Find the velocity at t = 16 seconds using the Newton divided-difference method for linear interpolation.

Table. Velocity as a function of time
t (s) | v(t) (m/s)
0     | 0
10    | 227.04
15    | 362.78
20    | 517.35
22.5  | 602.97
30    | 901.67

(Figure: velocity vs. time data for the rocket example.)

Answer:
v(t) = b0 + b1(t − t0) = 362.78 + 30.914(t − 15)
v(16) = 362.78 + 30.914(16 − 15) = 393.69 m/s
ii. Quadratic Interpolation (2nd-order polynomial, "parabolic")
A 2nd-order (quadratic or parabolic) polynomial connects 3 points:

f2(x) = b0 + b1(x − x0) + b2(x − x0)(x − x1)

b0 = f(x0)
b1 = (f(x1) − f(x0))/(x1 − x0)
b2 = [ (f(x2) − f(x1))/(x2 − x1) − (f(x1) − f(x0))/(x1 − x0) ] / (x2 − x0)

Example: Estimate ln(2) using a 2nd-order polynomial through the 3 points [1, 4, 6].
Solution:
b0 = f(1) = ln(1) = 0
b1 = (f(4) − f(1))/(4 − 1) = (ln(4) − ln(1))/(4 − 1) = 0.4620981
b2 = [ (ln(6) − ln(4))/(6 − 4) − 0.4620981 ] / (6 − 1) = (0.2027326 − 0.4620981)/5 = −0.0518731

f2(x) = 0 + 0.4620981(x − 1) − 0.0518731(x − 1)(x − 4)
At x = 2: f2(2) = 0.4620981(1) − 0.0518731(1)(−2) = 0.5658444
Exercise: The upward velocity of a rocket is given as a function of time in the table. Find the velocity at t = 16 seconds using the Newton divided-difference method for quadratic interpolation.

t (s) | v(t) (m/s)
0     | 0
10    | 227.04
15    | 362.78
20    | 517.35
22.5  | 602.97
30    | 901.67

v(t) = 227.04 + 27.148(t − 10) + 0.37660(t − 10)(t − 15),  10 ≤ t ≤ 20
v(16) = 392.19 m/s

The absolute relative approximate error |εa| between the first-order and second-order results is

|εa| = |(392.19 − 393.69)/392.19| × 100% ≈ 0.38%
iii. General Form of the Interpolating Polynomial

fn(x) = b0 + b1(x − x0) + b2(x − x0)(x − x1) + b3(x − x0)(x − x1)(x − x2) + … + bn(x − x0)…(x − xn−1)    (1)

b0 = f(x0)
b1 = f[x1, x0]
b2 = f[x2, x1, x0]
b3 = f[x3, x2, x1, x0]
⋮
bn = f[xn, xn−1, …, x1, x0]

where the bracketed function evaluations are FINITE DIVIDED DIFFERENCES.
Examples:
The first divided difference is

f[x1, x0] = (f(x1) − f(x0)) / (x1 − x0)

the second divided difference is

f[x2, x1, x0] = (f[x2, x1] − f[x1, x0]) / (x2 − x0)

and the nth finite divided difference is represented generally (recursively) as

f[xn, xn−1, …, x1, x0] = (f[xn, …, x1] − f[xn−1, …, x0]) / (xn − x0)
Substituting the divided differences into (1) gives

fn(x) = f(x0) + (x − x0)f[x1, x0] + (x − x0)(x − x1)f[x2, x1, x0] + (x − x0)(x − x1)(x − x2)f[x3, x2, x1, x0] + … + (x − x0)(x − x1)…(x − xn−1)f[xn, xn−1, …, x0]

This is Newton's divided-difference interpolating polynomial.
Example (continuing the ln(x) example with a 3rd-order polynomial):

f3(x) = 0.4620981(x − 1) − 0.0518731(x − 1)(x − 4) + 0.0078656(x − 1)(x − 4)(x − 6)

At x = 2: f3(2) = 0.6287686
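A general divided-difference routine reproduces these coefficients. The nodes are not listed on the slide; the printed coefficients match the running ln(x) example with nodes x = 1, 4, 6, 5, which this Python sketch (not from the slides) assumes:

```python
import math

# General Newton divided-difference interpolation: build the coefficients
# b0..bn column by column, then evaluate the nested polynomial at a point.
def divided_differences(xs, ys):
    n = len(xs)
    table = list(ys)           # table[i] is overwritten column by column
    coeffs = [table[0]]
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            table[i] = (table[i] - table[i - 1]) / (xs[i] - xs[i - j])
        coeffs.append(table[j])
    return coeffs              # [b0, b1, ..., bn]

def newton_eval(xs, coeffs, x):
    # Horner-style evaluation of b0 + b1(x-x0) + b2(x-x0)(x-x1) + ...
    result = coeffs[-1]
    for b, xi in zip(reversed(coeffs[:-1]), reversed(xs[:len(coeffs) - 1])):
        result = result * (x - xi) + b
    return result

xs = [1, 4, 6, 5]              # assumed nodes of the ln(x) example
ys = [math.log(v) for v in xs]
b = divided_differences(xs, ys)
print(b)                       # b1 ≈ 0.4620981, b2 ≈ -0.0518731, b3 ≈ 0.0078656
print(newton_eval(xs, b, 2))   # ≈ 0.6287686
```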
Exercise: Given the following data (the same data set as the Lagrange exercise later), calculate f(4) using Newton's interpolating polynomials of order 1 through 4.

x    | 3  5  2 7   1
f(x) | 19 99 6 291 3

f1(4) = b0 + b1(x − x0) = 19 + 40(4 − 3) = 59

f2(4) = f1(4) + b2(x − x0)(x − x1) = 59 + 9(4 − 3)(4 − 5) = 50

f3(4) = f2(4) + b3(x − x0)(x − x1)(x − x2) = 50 + 1(4 − 3)(4 − 5)(4 − 2) = 48

f4(4) = f3(4) + b4(x − x0)(x − x1)(x − x2)(x − x3) = 48 + 0(4 − 3)(4 − 5)(4 − 2)(4 − 7) = 48
Exercise: Use a quadratic Newton's divided-difference interpolating polynomial to find the x at which f(x) = 0.3 for the following data (this is inverse interpolation):

x    | 2   3      4
f(x) | 0.5 0.3333 0.25

Ans: x = 3.295842
Ex.: Construct the Newton divided-difference table for generating the Newton interpolation polynomial with the following data set:

i     | 0 1 2 3  4
xi    | 0 1 2 3  4
f(xi) | 0 1 8 27 64

Note: when the order of the polynomial is not given, you need to use all the data. For this exercise there are n = 5 data points, so you need to find the curve of order n − 1, i.e. order 4.
b. Lagrange Interpolating Polynomials
These can be derived from Newton's polynomials. (Π denotes a product.)

Examples:
(1) 1st-order Lagrange polynomial, n = 1 (linear version):

f1(x) = Σ_{i=0..1} Li(x)·f(xi) = L0(x)f(x0) + L1(x)f(x1)

L0(x) = (x − x1)/(x0 − x1),  L1(x) = (x − x0)/(x1 − x0)
(2) 2nd-order Lagrange polynomial, n = 2:

f2(x) = Σ_{i=0..2} Li(x)·f(xi) = L0(x)f(x0) + L1(x)f(x1) + L2(x)f(x2)

L0(x) = [(x − x1)(x − x2)] / [(x0 − x1)(x0 − x2)]
L1(x) = [(x − x0)(x − x2)] / [(x1 − x0)(x1 − x2)]
L2(x) = [(x − x0)(x − x1)] / [(x2 − x0)(x2 − x1)]

so that

f2(x) = [(x − x1)(x − x2)]/[(x0 − x1)(x0 − x2)] · f(x0) + [(x − x0)(x − x2)]/[(x1 − x0)(x1 − x2)] · f(x1) + [(x − x0)(x − x1)]/[(x2 − x0)(x2 − x1)] · f(x2)
(3) 3rd-order Lagrange polynomial, n = 3
Example: Use a 1st-order Lagrange interpolating polynomial to evaluate ln(2) from the two points [1, 4].
Solution: x0 = 1, x1 = 4

f1(x) = (x − 4)/(1 − 4) · ln(1) + (x − 1)/(4 − 1) · ln(4)

At x = 2:
f1(2) = (2 − 4)/(1 − 4) · ln(1) + (2 − 1)/(4 − 1) · ln(4) = 0.4620981
Example: Use a 2nd-order Lagrange interpolating polynomial to evaluate ln(2) from the three points [1, 4, 6].
Solution: x0 = 1, x1 = 4, x2 = 6

f2(x) = [(x − 4)(x − 6)]/[(1 − 4)(1 − 6)] · ln(1) + [(x − 1)(x − 6)]/[(4 − 1)(4 − 6)] · ln(4) + [(x − 1)(x − 4)]/[(6 − 1)(6 − 4)] · ln(6)

At x = 2:
f2(2) = [(2 − 4)(2 − 6)]/[(1 − 4)(1 − 6)] · ln(1) + [(2 − 1)(2 − 6)]/[(4 − 1)(4 − 6)] · ln(4) + [(2 − 1)(2 − 4)]/[(6 − 1)(6 − 4)] · ln(6) = 0.5658444
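Both ln(2) examples can be reproduced with one general Lagrange routine; this is a Python sketch (not from the slides):

```python
import math

# Lagrange interpolation: f(x) = sum_i L_i(x) * f(x_i), where each basis
# polynomial L_i(x) is a product over all the other nodes.
def lagrange(xs, ys, x):
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        L = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                L *= (x - xj) / (xi - xj)
        total += L * yi
    return total

# 1st order through [1, 4] and 2nd order through [1, 4, 6] for ln(2):
f1 = lagrange([1, 4], [math.log(1), math.log(4)], 2)
f2 = lagrange([1, 4, 6], [math.log(v) for v in [1, 4, 6]], 2)
print(f1, f2)  # ≈ 0.4620981 and 0.5658444, matching both examples
```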
Exercise: Evaluate f(4) using Lagrange interpolating polynomials of order 1 through 3.

x    | 3  5  2 7   1
f(x) | 19 99 6 291 3

Solution: x0 = 3, x1 = 5, x2 = 2, x3 = 7, x4 = 1

1st order:
f1(x) = (x − x1)/(x0 − x1) · f(x0) + (x − x0)/(x1 − x0) · f(x1)
f1(4) = (4 − 5)/(3 − 5) · 19 + (4 − 3)/(5 − 3) · 99 = 59
Questions for You!
1. Find the value of y at x = 0 given the points (−2, 5), (1, 7), (3, 11), (7, 34). (Ans: y at x = 0 is 1087/180)

2. Find f(301) using Lagrange's interpolation formula.
x | 300    304    305    307
y | 2.4771 2.4829 2.4843 2.4871
Ans: f(301) = 2.4786

3. Evaluate f(4) using Lagrange interpolating polynomials of order 1 through 3.
x    | −2  1  3  5  6 10  13 20
f(x) | 1.5 10 19 99 6 291 3  32
Additional online examples
1. https://www.brainkart.com/article/Lagrange-s-interpolation-
formula_38963/
2. https://engcourses-uofa.ca/books/numericalanalysis/polynomial-
interpolation/newton-interpolating-polynomials/
3. https://www.cuemath.com/lagrange-interpolation-formula/
4. https://techindetail.com/lagrange-interpolation-formula/
