ECMM171 Programming for Engineers
Evangelos Papatheou
[email protected] Lecture 5 1 using MATLAB Partially based on notes from Dr E Cross (University of Sheffield) and Danilo Scepanovic (MIT) Overview • Recap • Linear algebra with MATLAB • Polynomials • Curve fitting – Linear regression - Interpolation • Toolboxes 2 Example – Fibonacci sequence Recap 3 clear; % decide how many terms you want n_terms=input(‘How many terms do you want? ‘); % initializing fb to make loop faster (memory preallocation) fb=zeros(n_terms,1); % using a loop to calculate the terms %count_n=0; for n=1:n_terms if n==1 || n==2 % make sure you initialize the first two terms fb(n)=1; else fb(n)=fb(n-1)+fb(n-2); % standard Fibonacci sequence end if fb(n)<3000 %count_n=count_n+1; n_ind(n,1)=n; fb_3000(n,1)=fb(n); end end ind=n_ind(end); % get the index of the last term less than 3000 fb_n_max=fb_3000(end,1); Linear algebra 4 Given a system of linear equations: 1x+2y-4z=-2 -2x+y+6z=7 3x-2y-5z=3 Linear algebra 5 Also check function “lsqr” Then simply solve: >> x=A\b; This will work for non square matrices as well, it will give a least square solution Given a system of linear equations: 1x+2y-4z=-2 -2x+y+6z=7 3x-2y-5z=3 You can solve it using Matlab Write it in a form Ax=b >> A=[1 2 -4; -2 1 6; 3 -2 -5]; >> b=[-2; 7; 3]; More Linear algebra 6 Number of linearly independent rows or columns If A is square then A\b is the same as inv(A)*b You can simply calculate the determinant of a matrix >> d=det(A); The rank of a matrix The inverse of a matrix >> r=rank(A); >> Ainv=inv(A); Find the eigenvalues and eigenvectors of a matrix: [v,d]=eig(A); Polynomials 7 Evaluate a polynomial at a point ‘x’ Find the roots of a polynomial You can represent a polynomial in Matlab with a vector of its coefficients P=[2 3 1];ଶ P(1) P(2) P(3) P(1) P(2) P(3) P=[4 3 0 3]; ଷ ଶ >> Y=polyval(P,x); >> r=roots(P); x can be a vector and Y will have the same size as x Curve fitting We often use mathematical models to attempt to make sense of data. 
• Least squares regression
• Linear interpolation
• Curvilinear interpolation

Regression analysis

This is a linear fit between two variables:

[Figure: Variable 2 plotted against Variable 1, with a straight-line fit through the data]

By eye we can see that the straight line appears to fit the data well; we would say that the two variables are linearly related.

Problem: you have some data and you want to fit a linear function. What is the best line of fit for the data? How do we ensure that we draw the best line we can to fit the data? How can we quantify how well our line (model) fits the data?

Least squares regression

The approach of least squares regression is to minimise the vertical distances between each data point and the line we use for the fit. We will actually minimise the sum of the squared vertical distances between each data point and the line (the model error).

[Figure: data points with a fitted line; the vertical distance from a point to the line is labelled "model error"]

When we minimise the sum of the squared model errors, we arrive at equations that calculate the intercept a0 and slope a1 of the best-fit line y = a0 + a1*x:

    a1 = sum((x_i - xbar)*(y_i - ybar)) / sum((x_i - xbar)^2)
    a0 = ybar - a1*xbar

where xbar is the mean of the x values and ybar is the mean of the y values.

How can we assess how good our fit is?
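The slope and intercept formulas above can be coded directly; a minimal sketch (the data values here are made up for illustration), which agrees with polyfit at order 1:

```matlab
% Least-squares straight line from the closed-form equations
x = [1 2 3 4 5];
y = [10 19 32 41 48];        % made-up example data
xbar = mean(x);  ybar = mean(y);
a1 = sum((x-xbar).*(y-ybar)) / sum((x-xbar).^2);  % slope
a0 = ybar - a1*xbar;                              % intercept
% polyfit with order 1 computes the same least-squares line:
p = polyfit(x,y,1);   % p(1) is the slope, p(2) is the intercept
```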
One way to assess how good the fit is, is to use a correlation coefficient, for example Pearson's correlation coefficient:

    r = sum((x_i - xbar)*(y_i - ybar)) / sqrt( sum((x_i - xbar)^2) * sum((y_i - ybar)^2) )

When r = 1 there is a perfect fit (perfect positive correlation).

Polynomial fitting

You can easily fit polynomials to your data. Assume some data vectors x = [x1 x2 ... xn] and y = [y1 y2 ... yn]; then:

>> p = polyfit(x,y,2);

fits a second-order polynomial to the data x,y (the last argument is the desired polynomial order). The output p contains the polynomial coefficients. Type 'help polyfit' to see all options, and use plot to see your fit.

Polynomial fitting: example

Find the best third-order polynomial to fit your data:

>> x = [1 2 3 4 5];
>> y = [9.6 89 256 581 997];
>> p = polyfit(x,y,3);

p =
    0.2833   56.8214  -102.8619   57.3200

Now plot your fit. Define a vector x2 with more points than your data so you can see your fit visually, and estimate the values of 'y' at x2 using the fitted polynomial:

>> x2 = 1:0.1:5;
>> y2 = polyval(p,x2);
>> figure, plot(x,y,'o',x2,y2)

(Example adapted from the MathWorks help.)

This is still considered a form of (multiple) linear regression, because the model is linear in the coefficients.

Interpolation

You have some data and you need to estimate intermediate values between data points. The most common approach is polynomial interpolation. Here you are more certain about your data points, so you want your fit to pass through your points.
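A minimal interpolation sketch with polyfit and polyval (the data values here are made up for illustration): with three points and a second-order polynomial, the fit passes exactly through the data.

```matlab
% Polynomial interpolation: 3 data points, order-2 polynomial
x = [1 2 3];
y = [2 5 10];            % made-up data (here y = x.^2 + 1)
p = polyfit(x,y,2);      % 3 points, 3 coefficients -> interpolation
y_mid = polyval(p,2.5);  % estimate a value between the data points
polyval(p,x)             % recovers [2 5 10] up to rounding error
```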
You need a polynomial that passes through the data, so this is more restrictive than curve fitting. You can use linear algebra to estimate the coefficients.

Polynomial interpolation

Imagine you want the coefficients of a parabola f(x) = p1*x^2 + p2*x + p3 passing through three data points (x1, f(x1)), (x2, f(x2)), (x3, f(x3)). Requiring the parabola to pass through each point gives a linear system A x = b:

    [ x1^2  x1  1 ] [ p1 ]   [ f(x1) ]
    [ x2^2  x2  1 ] [ p2 ] = [ f(x2) ]
    [ x3^2  x3  1 ] [ p3 ]   [ f(x3) ]

>> x = A\b

Such matrices are very often poorly conditioned; check cond(A) in MATLAB. In practice, use polyfit and polyval. If the number of data points is equal to the number of coefficients, then polyfit interpolates:

>> p = polyfit(x,y,3);
>> d = polyval(p,x1);

where x1 is between min(x) and max(x).

Polynomial interpolation: example

Given values of temperature and density at three points, interpolate to get the density at 330 °C:

>> T = [300 400 500];
>> dens = [0.616 0.525 0.457];
>> p = polyfit(T,dens,2);
>> dens2 = polyval(p,330)
dens2 =
    0.5863

(Example from 'Applied Numerical Methods with MATLAB for Engineers and Scientists', Steven C. Chapra, McGraw-Hill International edition.)

Extrapolation

You have some data and you need to estimate values outside the range of your data points. You have to be very careful when extrapolating! Use polyfit and polyval again; this time your 'x_new' will be outside the range of the x values used in polyfit:

>> T = [300 400 500];
>> dens = [0.616 0.525 0.457];
>> p = polyfit(T,dens,2);
>> dens3 = polyval(p,600)
dens3 =
    0.4120

Summary
• Recap
• Polynomials in MATLAB
• Linear algebra and curve fitting
• Next week: root finding and function handles