CE206 Curve-fitting and Interpolation 4
CE 206: Engg. Computation Sessional, Dr. Tanvir Ahmed

1. Curve-fitting and interpolation techniques are used to estimate values at points where no data were sampled, based on discrete sampled data. Least-squares regression is commonly used for trend analysis and hypothesis testing; it minimizes the sum of the squared residuals between observed and predicted values.
2. Linear regression finds the straight-line relationship between two variables by minimizing the sum of squares of the residuals between observed and model-predicted values. This generalizes to polynomial and multiple linear regression.
3. The general linear least-squares model relates the response to a set of basis functions through unknown coefficients. This formulation allows linear, polynomial, and multiple regression models to be solved using matrix operations.


Curve-fitting and Interpolation

Curve-fitting and engineering practice


- Sampled data are acquired at discrete points, but values at points where no measurement was taken are needed in many engineering applications.

Interpolation
- Example: density of water at intermediate temperatures.

Least-squares Regression
- Purpose: trend analysis or hypothesis testing.



Trend of experimental data
Experimental data for force (N) and velocity (m/s) from a
wind tunnel experiment

v, m/s 10 20 30 40 50 60 70 80
F, N 25 70 380 550 610 1220 830 1450

[Plot: F (N) vs. v (m/s) for the wind tunnel data]

Is the relationship
- linear? If so, of which order? (linear or polynomial regression)
- nonlinear? (nonlinear regression)



Fitting the data with a straight line
Minimizing the sum of squares of the estimated residuals:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}

a_0 = \bar{y} - a_1 \bar{x}

Testing the goodness of fit:

r^2 = \frac{S_t - S_r}{S_t}

where
S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2
S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

(See CE 205 lecture slides for details; a short MATLAB check of these formulas is sketched below.)
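As a quick illustration, the slope, intercept and r^2 can be computed directly from these formulas. This is a minimal sketch using the wind tunnel data from the earlier slide; the linregr function on the next slide computes the same quantities in a more general way.

% Minimal sketch: slope, intercept and r2 straight from the formulas above.
v = [10 20 30 40 50 60 70 80]';          % velocity data (m/s)
F = [25 70 380 550 610 1220 830 1450]';  % force data (N)
n  = length(v);
a1 = (n*sum(v.*F) - sum(v)*sum(F)) / (n*sum(v.^2) - sum(v)^2);  % slope
a0 = mean(F) - a1*mean(v);                                      % intercept
St = sum((F - mean(F)).^2);        % total sum of squares
Sr = sum((F - a0 - a1*v).^2);      % sum of squared residuals
r2 = (St - Sr)/St                  % coefficient of determination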
A MATLAB code for linear regression
function [a, r2] = linregr(x,y)
% input:
% x = independent variable
% y = dependent variable
% output:
% a = vector of slope, a(1), and intercept, a(2)
% r2 = coefficient of determination
n = length(x);
if length(y)~=n, error('x and y must be same length'); end
x = x(:); y = y(:); % convert to column vectors
sx = sum(x); sy = sum(y);
sx2 = sum(x.*x); sxy = sum(x.*y); sy2 = sum(y.*y);
a(1) = (n*sxy-sx*sy)/(n*sx2-sx^2);
a(2) = sy/n-a(1)*sx/n;
r2 = ((n*sxy-sx*sy)/sqrt(n*sx2-sx^2)/sqrt(n*sy2-sy^2))^2;
% create plot of data and best fit line
xp = linspace(min(x),max(x),2);
yp = a(1)*xp+a(2);
plot(x,y,'o',xp,yp)
grid on
Fitting a straight line
>> [a, r2] = linregr(v, F)
a =
   19.4702 -234.2857
r2 =
    0.8805

Best-fit line: F = -234.2857 + 19.4702 v

[Plot: wind tunnel data with the fitted straight line, F (N) vs. v (m/s)]

Log-transformed data:
>> [a, r2] = linregr(log10(v), log10(F))
a =
    1.9842   -0.5620
r2 =
    0.9481

[Plot: log10(F) vs. log10(v) with the fitted straight line]



Variety of linear regressions
To fit a second-order polynomial:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2

Higher-order polynomials:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m)^2

Multiple linear regression:

y = a_0 + a_1 x_1 + a_2 x_2 + \cdots + a_m x_m

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_{1,i} - a_2 x_{2,i} - \cdots - a_m x_{m,i})^2



Generalizing least squares
Linear, polynomial, and multiple linear regression all belong to the general linear least-squares model:

y = a_0 z_0 + a_1 z_1 + a_2 z_2 + \cdots + a_m z_m + e

where z_0, z_1, ..., z_m are a set of m+1 basis functions and e is the error of the fit.

In matrix form:

\{y\} = [Z]\{a\} + \{e\}

[Z] = \begin{bmatrix}
z_{01} & z_{11} & \cdots & z_{m1} \\
z_{02} & z_{12} & \cdots & z_{m2} \\
\vdots & \vdots & \ddots & \vdots \\
z_{0n} & z_{1n} & \cdots & z_{mn}
\end{bmatrix}

where z_{ji} is the value of the jth basis function evaluated at the ith data point.

Generalizing least squares
\{y\} = [Z]\{a\} + \{e\}

Applying the condition of least squares:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2

we obtain the normal equations:

[Z]^T [Z] \{a\} = [Z]^T \{y\}

We can solve for the coefficients {a} using matrix manipulations.

To calculate the residuals: \{e\} = \{y\} - [Z]\{a\}
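As a rough illustration of how this could be coded for an arbitrary set of basis functions, here is a sketch; the function name genlinreg and the cell-array interface for the basis functions are assumptions made for this example, not part of the slides.

function [a, e] = genlinreg(x, y, bases)
% Sketch of general linear least squares (illustrative, not from the slides).
% x, y  : vectors of data
% bases : cell array of basis function handles,
%         e.g. {@(x) ones(size(x)), @(x) x, @(x) x.^2}
% a     : fitted coefficients;  e : residuals
x = x(:); y = y(:);
Z = zeros(length(x), length(bases));
for j = 1:length(bases)
    Z(:, j) = bases{j}(x);       % value of the jth basis function at each point
end
a = (Z'*Z) \ (Z'*y);             % solve the normal equations [Z]'[Z]{a} = [Z]'{y}
e = y - Z*a;                     % residuals
end

Calling genlinreg(v, F, {@(x) ones(size(x)), @(x) x, @(x) x.^2}) should reproduce the second-order polynomial fit worked through on the next slide.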



Example: general least squares
Fit the wind tunnel data to a 2nd order polynomial
v, m/s 10 20 30 40 50 60 70 80
F, N 25 70 380 550 610 1220 830 1450

Create the Z matrix:


>>v=v(:);
>>Z=[ones(size(v)) v v.^2]
Z =
1 10 100
1 20 400
1 30 900
1 40 1600
1 50 2500
1 60 3600
1 70 4900
1 80 6400



Example: general least squares
Form the normal equations and solve for the coefficients:

>> Z'*Z
ans =
           8         360       20400
         360       20400     1296000
       20400     1296000    87720000

>> F = F(:);
>> Z'*F
ans =
        5135
      312850
    20516500

>> a = (Z'*Z)\(Z'*F)
a =
  -178.4821
    16.1220
     0.0372



polyfit and polyval functions

MATLAB has a built-in function polyfit that fits a least-squares nth-order polynomial to data:

p = polyfit(x, y, n)

x: independent data
y: dependent data
n: order of polynomial to fit
p: coefficients of the polynomial
   f(x) = p_1 x^n + p_2 x^{n-1} + ... + p_n x + p_{n+1}

MATLAB's polyval command can be used to evaluate the polynomial at given points using these coefficients:

y = polyval(p, x)

Exercise: Modify your linregr.m function to fit polynomials of any order using these built-in functions (one possible approach is sketched below).
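One possible approach to the exercise; the function name polyregr and its interface are made up for this sketch, not prescribed by the slides.

function [p, r2] = polyregr(x, y, n)
% Sketch of a possible solution to the exercise above (not from the slides):
% least-squares polynomial fit of order n using polyfit/polyval.
x = x(:); y = y(:);
p  = polyfit(x, y, n);               % polynomial coefficients
yp = polyval(p, x);                  % model predictions at the data points
St = sum((y - mean(y)).^2);          % total sum of squares
Sr = sum((y - yp).^2);               % residual sum of squares
r2 = (St - Sr)/St;                   % coefficient of determination
xp = linspace(min(x), max(x));
plot(x, y, 'o', xp, polyval(p, xp), '-')
grid on
end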
Nonlinear regression
Not all fits are linear combinations of coefficients and basis functions.

Problems with transformation:
- Not all equations can be transformed easily, or at all.
- The best-fit line represents the best fit for the transformed variables, not the original variables.

Alternate approach: nonlinear regression.

Methodology:
- Write an M-file function that computes the sum of the squared residuals for the chosen model.
- Use MATLAB's fminsearch function to find the coefficient values that minimize it.



Example: nonlinear regression
For the wind tunnel data, determine the coefficients for the fit:

F = a_1 v^{a_2}

The m-file function (sum of squared residuals):

function f = fSSR(a, xm, ym)
yp = a(1)*xm.^a(2);
f = sum((ym-yp).^2);

Then, use fminsearch to obtain the values of a that minimize fSSR:

a = fminsearch(@fSSR, [1, 1], [], v, F)

Here [1, 1] is the initial guess for a1 and a2, [] is a placeholder for the options argument, and v, F are the data passed through to fSSR.

a =
    2.5384    1.4359
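For reference, the same minimization can be written without a separate M-file by using an anonymous function. This variant is an alternative sketch, not shown on the original slides; v and F are the wind tunnel data vectors.

% Same minimization with an anonymous objective function (alternative sketch).
fSSR = @(a) sum((F - a(1)*v.^a(2)).^2);   % sum of squared residuals
a = fminsearch(fSSR, [1, 1])              % should give a close to [2.5384 1.4359]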
Comparison between regressions
Power-law model: F = a_1 v^{a_2}

Transformed (log-log) fit:      F = 0.2741 v^{1.9842}
Untransformed (nonlinear) fit:  F = 2.5384 v^{1.4359}

[Plot: F (N) vs. v (m/s), wind tunnel data with both fitted curves]
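A comparison plot like the one described above could be generated along these lines; the plotting details are an illustrative sketch, not from the slides.

% Illustrative sketch: plot the data and both power-law fits.
v = [10 20 30 40 50 60 70 80];
F = [25 70 380 550 610 1220 830 1450];
vv = linspace(10, 80);
plot(v, F, 'o', vv, 0.2741*vv.^1.9842, '--', vv, 2.5384*vv.^1.4359, '-')
legend('data', 'transformed fit', 'untransformed fit', 'Location', 'northwest')
xlabel('v (m/s)'), ylabel('F (N)'), grid on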
Polynomial interpolation
(1) Use MATLAB's polyfit and polyval functions
    - the number of data points should equal the number of coefficients

(2) Newton's interpolating polynomial

f_{n-1}(x) = b_1 + b_2 (x - x_1) + \cdots + b_n (x - x_1)(x - x_2)\cdots(x - x_{n-1})

with the coefficients given by divided differences:

b_1 = f(x_1)
b_2 = f[x_2, x_1]
b_3 = f[x_3, x_2, x_1]
...
b_n = f[x_n, x_{n-1}, \ldots, x_2, x_1]

(3) Lagrange's interpolating polynomial

A rough sketch of a divided-difference implementation of (2) is given below.
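A rough sketch of how Newton's divided-difference interpolation could be implemented in MATLAB; the function name newtint and its interface are assumptions for this sketch, not part of the slides.

function yint = newtint(x, y, xx)
% Sketch of Newton's interpolating polynomial via a divided-difference table.
% x, y : vectors of n data points;  xx : value(s) at which to interpolate
n = length(x);
b = zeros(n, n);
b(:, 1) = y(:);                     % zeroth-order differences f(x_i)
for j = 2:n
    for i = 1:n-j+1
        b(i, j) = (b(i+1, j-1) - b(i, j-1)) / (x(i+j-1) - x(i));
    end
end
% evaluate f_{n-1}(xx) = b1 + b2 (xx-x1) + ... + bn (xx-x1)...(xx-x_{n-1})
xt = ones(size(xx));
yint = b(1, 1) * xt;
for j = 2:n
    xt = xt .* (xx - x(j-1));
    yint = yint + b(1, j) * xt;
end
end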




Polynomial interpolation
Use the polyfit and polyval functions to predict the population in 2000 using a 7th-order polynomial fitted to the first 8 points:

Year              1920    1930    1940    1950    1960    1970    1980    1990    2000
Population (mil)  106.46  123.08  132.12  152.27  180.67  205.05  227.23  249.46  281.42

Also try:
(a) Newton's interpolating polynomial
(b) a 1st-order polynomial using the 1980 and 1990 data
(c) a 2nd-order polynomial using the 1970, 1980 and 1990 data

(A possible polyfit/polyval setup is sketched after this list.)
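One way the polyfit/polyval part of this exercise might be set up. Note that the years are shifted and scaled before fitting, since raw year values raised to the 7th power make the fit badly conditioned; the centering and scaling values are arbitrary choices for this sketch, not part of the slides.

% Illustrative sketch: 7th-order polynomial through the first 8 census points,
% then extrapolated to the year 2000.
year = 1920:10:1990;
pop  = [106.46 123.08 132.12 152.27 180.67 205.05 227.23 249.46];
t = (year - 1955)/35;                % center and scale years to improve conditioning
p = polyfit(t, pop, 7);              % 8 points, 8 coefficients: exact interpolation
pop2000 = polyval(p, (2000 - 1955)/35)   % extrapolated estimate for 2000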



Piecewise interpolation in MATLAB
The spline command performs cubic spline interpolation, by default using the not-a-knot end conditions:
yy = spline(x, y, xx)

Example: interpolate Runge's function f(x) = 1/(1+25x^2) at 9 equidistant points between -1 and +1 using the spline command.

x = linspace(-1, 1, 9);
y = 1./(1+25*x.^2);
xx = linspace(-1, 1);
yy = spline(x, y, xx);
yr = 1./(1+25*xx.^2);
plot(x, y, 'o', xx, yy, '-', xx, yr, '--')



MATLAB's interp1 function

MATLAB's interp1 function can perform several different kinds of interpolation:

yi = interp1(x, y, xi, method)

x and y contain the original data
xi contains the points at which to interpolate
method is a string containing the desired method:
  'nearest' - nearest-neighbor interpolation
  'linear'  - connects the points with straight lines (the default)
  'spline'  - not-a-knot cubic spline interpolation
  'pchip' or 'cubic' - piecewise cubic Hermite interpolation



Example: interp1 function
Time series of spot measurements of velocity:
t 0 20 40 56 68 80 84 96 104
v 0 20 20 38 80 80 100 100 125

>> tt = linspace(0, 110);
>> v1 = interp1(t, v, tt);
>> plot(t, v, 'o', tt, v1)

[Plot: measured velocities with the default (linear) interp1 interpolation]
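To see how the different method strings behave on this data set, one might try something like the following; this is an illustrative sketch, not from the original slides.

% Illustrative sketch: compare interp1 methods on the velocity time series.
t = [0 20 40 56 68 80 84 96 104];
v = [0 20 20 38 80 80 100 100 125];
tt = linspace(0, 104);
vl = interp1(t, v, tt, 'linear');
vs = interp1(t, v, tt, 'spline');
vp = interp1(t, v, tt, 'pchip');
plot(t, v, 'o', tt, vl, '-', tt, vs, '--', tt, vp, ':')
legend('data', 'linear', 'spline', 'pchip', 'Location', 'northwest')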

