
Polynomial Interpolation

The polynomial interpolation problem can be stated as the following:

๐บ๐‘–๐‘ฃ๐‘’๐‘› ๐‘› + 1 ๐‘‘๐‘Ž๐‘ก๐‘Ž ๐‘๐‘œ๐‘–๐‘›๐‘ก๐‘ , (๐‘ฅ ๐‘–

, ๐‘ฆ ๐‘–

-) ๐‘“๐‘œ๐‘Ÿ ๐‘– = 0,1, … ๐‘› ๐‘“๐‘–๐‘›๐‘‘ ๐‘Ž ๐‘(๐‘ฅ) ∈ ๐’ซ ๐‘›

๐‘ ๐‘ข๐‘โ„Ž ๐‘กโ„Ž๐‘Ž๐‘ก ๐‘(๐‘ฅ ๐‘–

) = ๐‘ฆ ๐‘–

๐‘“๐‘œ๐‘Ÿ ๐‘Ž๐‘™๐‘™ ๐‘– = 0,1, … ๐‘› ๐‘คโ„Ž๐‘’๐‘Ÿ๐‘’ ๐’ซ ๐‘›

๐‘‘๐‘’๐‘›๐‘œ๐‘ก๐‘’๐‘  ๐‘กโ„Ž๐‘’ ๐‘ ๐‘’๐‘ก ๐‘œ๐‘“ ๐‘Ž๐‘™๐‘™ ๐‘๐‘œ๐‘™๐‘ฆ๐‘›๐‘œ๐‘š๐‘–๐‘Ž๐‘™๐‘  ๐‘œ๐‘“ ๐‘‘๐‘’๐‘”๐‘Ÿ๐‘’๐‘’ ๐‘Ž๐‘ก ๐‘š๐‘œ๐‘ ๐‘ก ๐‘›

Example

The blue circles represent the data points, and the red curve represents the interpolating polynomial.

The black line represents a least squares regression line fitted to the same data.

X:   0   10   20   30   40   50   60
Y: -10    3  -30    6   10   -2   15

Interpolation vs. Regression

Interpolation models must take on the exact values of the known data points, whereas regression models try to minimize the distance between the prediction and each actual known value.

Given n+1 data points with distinct x-values, the "best fit" polynomials of degree < n form regression models, where the predictions may not land exactly on the data points. However, the "best fit" polynomial of degree n is the interpolating polynomial, which passes exactly through every data point. Notice that this means the sum of squared errors for the interpolating polynomial is always zero, as the sketch below illustrates.
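To make the contrast concrete, here is a minimal Matlab sketch using the example data from the table above; the degree-2 fit leaves a positive sum of squared errors, while the degree-n fit reproduces the data up to rounding error.

% Regression vs. interpolation with polyfit, using the example data.
x = [0 10 20 30 40 50 60];
y = [-10 3 -30 6 10 -2 15];
n = length(x) - 1;                     % degree of the interpolating polynomial

p2 = polyfit(x, y, 2);                 % degree-2 regression model
pn = polyfit(x, y, n);                 % degree-n interpolating polynomial

sse2 = sum((polyval(p2, x) - y).^2)    % positive
ssen = sum((polyval(pn, x) - y).^2)    % zero up to rounding error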

Interpolation vs. Regression Examples

(Figure: example plots of an interpolating polynomial and a regression fit on the same data.)

Polynomial interpolation can be used when you believe your measured data to be exact.

Regression models, on the other hand, assume that measurements have noise. The model is usually $y = f(x) + \varepsilon$, where $\varepsilon \sim N(0, \sigma^2)$.

The point of a regression model is to estimate f(x), and this estimate can be used for forecasting future and past data values as well as predicting data values between known data points.

However, interpolation models are suitable only for estimating data values between data points.
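As an illustration of that model, here is a hedged Matlab sketch; the underlying function f and the noise level sigma are hypothetical, chosen only for demonstration.

% Generate noisy measurements y = f(x) + eps and estimate f by regression.
% The "true" f and sigma below are hypothetical illustration values.
f = @(x) 0.05*x.^2 - 2*x + 3;
sigma = 5;

x = 0:5:60;
y = f(x) + sigma*randn(size(x));       % noisy measurements

p = polyfit(x, y, 2);                  % low-degree regression estimate of f
xx = 0:0.5:60;
plot(x, y, 'bo', xx, f(xx), 'b', xx, polyval(p, xx), 'r')
legend('noisy data', 'true f(x)', 'regression estimate')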

Runge Phenomenon example

The function being sampled is:

$f(x) = \dfrac{1}{1 + 16x^2}$

The blue curve is $f(x)$, the red curve is the interpolating polynomial, and the green curve is a spline.

(Figure: four panels showing the fits for 5, 9, 17, and 21 nodes.)
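A panel along these lines can be reproduced with the following Matlab sketch; equally spaced nodes on [-1, 1] are assumed, which is the classical setting for the Runge phenomenon (polyfit may warn about ill-conditioning at the higher node counts, which is itself a symptom of the problem).

% Reproduce one Runge-phenomenon panel; equally spaced nodes assumed.
f = @(x) 1 ./ (1 + 16*x.^2);
nodes = 17;                            % try 5, 9, 17, 21

x = linspace(-1, 1, nodes);
y = f(x);
xx = linspace(-1, 1, 500);

p  = polyfit(x, y, nodes - 1);         % interpolating polynomial
ys = spline(x, y, xx);                 % cubic spline through the same nodes

plot(xx, f(xx), 'b', xx, polyval(p, xx), 'r', xx, ys, 'g', x, y, 'bo')
legend('f(x)', 'interpolating polynomial', 'spline')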

Solving the Polynomial Interpolation Problem

๐‘ˆ๐‘ ๐‘–๐‘›๐‘” {1, ๐‘ฅ, ๐‘ฅ 2 , ๐‘ฅ 3 , … ๐‘ฅ ๐‘› } ๐‘Ž๐‘  ๐‘Ž ๐‘๐‘Ž๐‘ ๐‘–๐‘  ๐‘“๐‘œ๐‘Ÿ ๐’ซ ๐‘› ๐‘Ž๐‘›๐‘ฆ ๐‘(๐‘ฅ) ∈ ๐’ซ ๐‘›

๐‘๐‘Ž๐‘› ๐‘๐‘’ ๐‘ค๐‘Ÿ๐‘–๐‘ก๐‘ก๐‘’๐‘› ๐‘–๐‘› ๐‘กโ„Ž๐‘’ ๐‘“๐‘œ๐‘Ÿ๐‘š: ๐‘(๐‘ฅ) = ๐›ผ

0

+ ๐›ผ

1 ๐‘ฅ + ๐›ผ

2 ๐‘ฅ 2 + โ‹ฏ + ๐›ผ ๐‘› ๐‘ฅ ๐‘› ๐‘“๐‘œ๐‘Ÿ ๐›ผ ๐‘–

∈ โ„

๐‘‚๐‘ข๐‘Ÿ ๐‘๐‘Ÿ๐‘œ๐‘๐‘™๐‘’๐‘š ๐‘๐‘’๐‘๐‘œ๐‘š๐‘’๐‘  ๐‘“๐‘–๐‘›๐‘‘๐‘–๐‘›๐‘” ๐‘กโ„Ž๐‘’ ๐›ผ ๐‘–

๐‘ ๐‘ข๐‘โ„Ž ๐‘กโ„Ž๐‘Ž๐‘ก ๐‘”๐‘–๐‘ฃ๐‘’๐‘› ๐‘› + 1 ๐‘‘๐‘Ž๐‘ก๐‘Ž ๐‘๐‘œ๐‘–๐‘›๐‘ก๐‘ , (๐‘ฅ ๐‘–

, ๐‘ฆ ๐‘–

-) ๐‘“๐‘œ๐‘Ÿ ๐‘– = 0,1, … ๐‘› ๐›ผ

0

+ ๐›ผ

1 ๐‘ฅ ๐‘–

+ ๐›ผ

2 ๐‘ฅ ๐‘–

2 + โ‹ฏ + ๐›ผ ๐‘› ๐‘ฅ ๐‘– ๐‘› = ๐‘ฆ ๐‘–

This is simply a system of $n + 1$ linear equations in the $n + 1$ unknowns $\alpha_i$. We can set up the problem as

$$\begin{bmatrix}
1 & x_0 & x_0^2 & \cdots & x_0^n \\
1 & x_1 & x_1^2 & \cdots & x_1^n \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & x_n & x_n^2 & \cdots & x_n^n
\end{bmatrix}
\begin{bmatrix} \alpha_0 \\ \alpha_1 \\ \vdots \\ \alpha_n \end{bmatrix}
=
\begin{bmatrix} y_0 \\ y_1 \\ \vdots \\ y_n \end{bmatrix}$$
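Incidentally, Matlab can build this coefficient matrix directly: the built-in vander function returns the same matrix with its columns in decreasing-power order, so flipping it left to right recovers the layout above.

% Build the Vandermonde matrix with Matlab's built-in vander;
% its columns come in decreasing powers, so flip them.
x = [0; 10; 20];          % any sample nodes
V = fliplr(vander(x));    % columns are now 1, x, x.^2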

We can use any standard method for solving systems of linear equations to find the coefficients. The solution will be the interpolating polynomial $p(x) = \alpha_0 + \alpha_1 x + \alpha_2 x^2 + \cdots + \alpha_n x^n$.

Matlab Implementation

function result = poly_interp(x, y)
% x and y are column vectors with the x and y values of the data points

% there are n+1 data points
n = length(x) - 1;

% construct the Vandermonde matrix
V = zeros(n+1, n+1);
for i = 1:n+1
    for j = 1:n+1
        V(i,j) = x(i).^(j-1);
    end %for
end %for

% solve the system of equations
alpha = V\y;

% reverse the alpha vector to match Matlab standards
% for polynomial coefficient vectors
result = fliplr(alpha');

% plot the solution
xx = x(1):0.01:x(n+1);
yp = polyval(result, xx);
figure(100)
plot(x, y, 'bo')
hold on
plot(xx, yp, 'r')
end %function
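A usage sketch with the example data from the table above (output not shown here):

% Call poly_interp on the example data; x and y must be column vectors.
x = [0; 10; 20; 30; 40; 50; 60];
y = [-10; 3; -30; 6; 10; -2; 15];
c = poly_interp(x, y);    % degree-6 coefficient vector
polyval(c, 25)            % evaluate the interpolant between data points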

Or you can just use the polyfit(x, y, n) function in Matlab. This function finds the polynomial of degree n that best fits the data x and y in the least squares sense. However, since the sum of squared errors for the interpolating polynomial is always zero, calling this function with n = (number of data points) - 1 gives you the interpolating polynomial, rather than a regression polynomial, as a result.
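That equivalence can be checked with a short sketch; the two coefficient vectors should agree up to rounding error (polyfit may warn about ill-conditioning here, since these nodes are neither centered nor scaled).

% Check that polyfit with degree n reproduces poly_interp's result.
x = [0; 10; 20; 30; 40; 50; 60];
y = [-10; 3; -30; 6; 10; -2; 15];
n = length(x) - 1;
c1 = poly_interp(x, y);
c2 = polyfit(x, y, n);
max(abs(c1 - c2))         % should be small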
