Least-Squares Fitting
- Least-Squares Line
- Power Fit
- Data Linearization
- Linear Least-Squares
- Nonlinear Least-Squares
Least-Squares Line
- Data points $(x_1,y_1),\dots,(x_N,y_N)$, where the abscissas $x_k$ are distinct
- Goal of the numerical method: determine a function $y=f(x)$ that fits the data
- Linear approximation: $y=f(x)=Ax+B$
- Each data point carries an error: $f(x_k)=y_k+e_k$, where $e_k=f(x_k)-y_k$
- How do we find the best linear approximation that passes near the points?
Errors
- Errors: $e_k=f(x_k)-y_k$ for $1\le k\le N$
- Norms measure how far the curve $y=f(x)$ lies from the data
- Maximum error: $E_\infty(f)=\max_{1\le k\le N}|f(x_k)-y_k|$
- Average error: $E_1(f)=\frac{1}{N}\sum_{k=1}^{N}|f(x_k)-y_k|$
- Root-mean-square error: $E_2(f)=\left(\frac{1}{N}\sum_{k=1}^{N}|f(x_k)-y_k|^2\right)^{1/2}$
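The three norms are easy to compute side by side; a minimal sketch (function and variable names here are mine, not from the notes):

```python
# Compute E_inf, E_1, E_2 for a candidate curve f against data (xs, ys).
import math

def error_norms(f, xs, ys):
    """Return (E_inf, E_1, E_2) for the curve f against the data."""
    residuals = [abs(f(x) - y) for x, y in zip(xs, ys)]
    n = len(residuals)
    e_inf = max(residuals)                              # maximum error
    e_1 = sum(residuals) / n                            # average error
    e_2 = math.sqrt(sum(r * r for r in residuals) / n)  # root-mean-square error
    return e_inf, e_1, e_2

# Example: how far does y = x lie from slightly perturbed data?
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 0.9, 2.2, 2.9]
e_inf, e_1, e_2 = error_norms(lambda x: x, xs, ys)
```

Note that $E_1(f)\le E_2(f)\le E_\infty(f)$ always holds, which is why the RMS norm sits between the other two as a compromise.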
Least-Squares Line
- The best-fitting line is the one that minimizes the error
- The root-mean-square error is the traditional choice
- Let $\{(x_k,y_k)\}_{k=1}^{N}$ be a set of $N$ points
- The least-squares line $y=f(x)=Ax+B$ minimizes the root-mean-square error $E_2(f)$
- $E_2(f)$ is minimal iff $\sum_{k=1}^{N}(Ax_k+B-y_k)^2$ is minimal
- $E(A,B)=\sum e_k^2$
- $\frac{\partial E}{\partial A}=\sum 2(Ax_k+B-y_k)x_k=0 \;\Rightarrow\; \left(\sum x_k^2\right)A+\left(\sum x_k\right)B=\sum x_ky_k$
- $\frac{\partial E}{\partial B}=\sum 2(Ax_k+B-y_k)=0 \;\Rightarrow\; \left(\sum x_k\right)A+NB=\sum y_k$
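The two normal equations form a $2\times 2$ linear system that can be solved directly, e.g. by Cramer's rule. A minimal sketch (names are mine):

```python
def least_squares_line(xs, ys):
    """Solve the 2x2 normal equations for A, B in y = A*x + B."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # [sxx sx] [A]   [sxy]
    # [sx   n] [B] = [sy ]   -- solved by Cramer's rule
    det = sxx * n - sx * sx
    a = (sxy * n - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, b

# Data lying exactly on y = 2x + 1 is recovered exactly.
a, b = least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])
```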
Power Fit
- Fit $f(x)=Ax^M$, where the exponent $M$ is known
- Given $\{(x_k,y_k)\}_{k=1}^{N}$, minimize $E(A)=\sum e_k^2=\sum(Ax_k^M-y_k)^2$
- Least-squares power curve: setting $E'(A)=0$ gives $A=\dfrac{\sum_{k=1}^{N}x_k^My_k}{\sum_{k=1}^{N}x_k^{2M}}$
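Since $M$ is known, the single coefficient $A$ comes from one closed-form quotient; a short sketch (the function name is mine):

```python
def power_fit(xs, ys, m):
    """Least-squares A in y = A * x**m, with the exponent m known.

    Implements A = sum(x_k^m * y_k) / sum(x_k^(2m)).
    """
    num = sum(x**m * y for x, y in zip(xs, ys))
    den = sum(x**(2 * m) for x in xs)
    return num / den

# Data lying exactly on y = 3*x^2 recovers A = 3.
a = power_fit([1.0, 2.0, 3.0], [3.0, 12.0, 27.0], 2)
```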
Data Linearization
- Data linearization: a process that transforms a nonlinear relation into a linear relation
- Fitting $y=Ce^{Ax}$ directly gives $E(A,C)=\sum(Ce^{Ax_k}-y_k)^2$; e.g. $\frac{\partial E}{\partial C}=2\sum(Ce^{2Ax_k}-y_ke^{Ax_k})=0$, which is nonlinear in $A$
- Instead, transform the data: $(x_k,y_k)\to(X_k,Y_k)$
- Take the logarithm of both sides: $\ln(y)=Ax+\ln(C)$
- Assign new variables: $Y=\ln(y)$, $X=x$, $B=\ln(C)$
- Linear relation: $Y=AX+B$
- Data linearization of the exponential function: $(X_k,Y_k)=(x_k,\ln(y_k))$
- Solve the normal equations $\left(\sum_{k=1}^{N}X_k^2\right)A+\left(\sum_{k=1}^{N}X_k\right)B=\sum_{k=1}^{N}X_kY_k$ and $\left(\sum_{k=1}^{N}X_k\right)A+NB=\sum_{k=1}^{N}Y_k$, then recover $C=e^B$
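The whole procedure, transform, fit the line, exponentiate, fits in a few lines; a sketch under the names above (function name is mine):

```python
import math

def exp_fit(xs, ys):
    """Fit y = C * exp(A*x) by data linearization.

    Transform (x_k, y_k) -> (X_k, ln(y_k)), solve the 2x2 normal
    equations for A and B = ln(C), then recover C = e^B.
    """
    Ys = [math.log(y) for y in ys]   # requires all y_k > 0
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    sy = sum(Ys)
    sxy = sum(x * y for x, y in zip(xs, Ys))
    det = sxx * n - sx * sx
    a = (sxy * n - sx * sy) / det
    b = (sxx * sy - sx * sxy) / det
    return a, math.exp(b)            # (A, C)

# Data lying exactly on y = 2*e^(0.5x) recovers A = 0.5, C = 2.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a, c = exp_fit(xs, ys)
```

Note that this minimizes the error in the *transformed* variables $(X_k,Y_k)$, which is close to, but not identical with, minimizing $E(A,C)$ itself.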
Linear Least Squares
- Linear least-squares problem
- $N$ data points $(x_k,y_k)$
- a set of $M$ linearly independent functions $f_j(x)$
- find the $M$ coefficients $c_j$ that
- minimize $E(c_1,c_2,\dots,c_M)=\sum_{k=1}^{N}(f(x_k)-y_k)^2=\sum_{k=1}^{N}\left(\left(\sum_{j=1}^{M}c_jf_j(x_k)\right)-y_k\right)^2$
- Each partial derivative must vanish: $\frac{\partial E}{\partial c_i}=0$, i.e. $\sum_{k=1}^{N}\left(\left(\sum_{j=1}^{M}c_jf_j(x_k)\right)-y_k\right)f_i(x_k)=0$ for $i=1,2,\dots,M$
- This gives an $M\times M$ linear system for the coefficients $c_j$: $\sum_{j=1}^{M}\left(\sum_{k=1}^{N}f_i(x_k)f_j(x_k)\right)c_j=\sum_{k=1}^{N}f_i(x_k)y_k$ for $i=1,2,\dots,M$
- In matrix form, the coefficients solve the normal equations $F^TFC=F^TY$, where
$$F=\begin{bmatrix}f_1(x_1) & f_2(x_1) & \cdots & f_M(x_1)\\ f_1(x_2) & f_2(x_2) & \cdots & f_M(x_2)\\ \vdots & \vdots & & \vdots\\ f_1(x_N) & f_2(x_N) & \cdots & f_M(x_N)\end{bmatrix}$$
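The matrix form $F^TFC=F^TY$ maps directly onto a few lines of NumPy; a sketch with basis functions and names of my choosing:

```python
# Build the N x M design matrix F and solve the normal equations.
import numpy as np

def linear_least_squares(funcs, xs, ys):
    """Fit f(x) = sum_j c_j * funcs[j](x) via F^T F C = F^T Y."""
    F = np.array([[fj(x) for fj in funcs] for x in xs])  # N x M design matrix
    Y = np.asarray(ys, dtype=float)
    return np.linalg.solve(F.T @ F, F.T @ Y)             # coefficients C

# Basis {1, x, x^2} recovers y = 1 + 2x + 3x^2 exactly.
funcs = [lambda x: 1.0, lambda x: x, lambda x: x * x]
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1 + 2 * x + 3 * x * x for x in xs]
c = linear_least_squares(funcs, xs, ys)
```

In practice, forming $F^TF$ squares the condition number, so library routines such as `numpy.linalg.lstsq` (QR/SVD based) are preferred for ill-conditioned problems; the normal-equations form above mirrors the derivation in these notes.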
Polynomial Fitting
- Polynomial function: $f(x)=c_1+c_2x+c_3x^2+\cdots+c_{M+1}x^M$
- Least-squares parabola: given $\{(x_k,y_k)\}_{k=1}^{N}$, fit $y=f(x)=Ax^2+Bx+C$
- Minimize the sum of squared errors: $E(A,B,C)=\sum_{k=1}^{N}(Ax_k^2+Bx_k+C-y_k)^2$
- Set the partial derivatives to zero: $\frac{\partial E}{\partial A}=\frac{\partial E}{\partial B}=\frac{\partial E}{\partial C}=0$
- Solve the resulting $3\times 3$ linear system
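The three zero-derivative conditions give a $3\times 3$ system in the power sums of the data; a self-contained sketch with a small Gaussian elimination (names are mine):

```python
def parabola_fit(xs, ys):
    """Solve the 3x3 normal equations for A, B, C in y = A*x^2 + B*x + C."""
    s = lambda p: sum(x**p for x in xs)
    t = lambda p: sum((x**p) * y for x, y in zip(xs, ys))
    # Augmented matrix of the normal equations from dE/dA = dE/dB = dE/dC = 0.
    m = [
        [s(4), s(3), s(2), t(2)],
        [s(3), s(2), s(1), t(1)],
        [s(2), s(1), float(len(xs)), t(0)],
    ]
    # Gaussian elimination with partial pivoting.
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, 3):
            f = m[r][i] / m[i][i]
            m[r] = [a - f * b for a, b in zip(m[r], m[i])]
    # Back substitution.
    coeffs = [0.0, 0.0, 0.0]
    for i in reversed(range(3)):
        rest = sum(m[i][j] * coeffs[j] for j in range(i + 1, 3))
        coeffs[i] = (m[i][3] - rest) / m[i][i]
    return coeffs  # [A, B, C]

# Points lying exactly on y = x^2 - 2x + 3 are recovered exactly.
a, b, c = parabola_fit([0.0, 1.0, 2.0, 3.0], [3.0, 2.0, 3.0, 6.0])
```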
Nonlinear Least-Squares Method
- The nonlinear system can be solved with Newton's method, but that is time-consuming and requires good starting values for $A$ and $C$
- Alternatively, minimize $E(A,C)$ directly using optimization methods
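One direct-minimization sketch for $y=Ce^{Ax}$ (my own hybrid, not the Newton iteration from the notes): for any fixed $A$, the condition $\partial E/\partial C=0$ above gives the best $C$ in closed form, so only a one-dimensional search over $A$ remains, done here by golden-section search on an assumed bracket:

```python
import math

def exp_fit_direct(xs, ys, a_lo=0.0, a_hi=1.0, tol=1e-10):
    """Minimize E(A, C) = sum (C*e^(A*x_k) - y_k)^2 directly.

    For fixed A, dE/dC = 0 yields C(A) in closed form, so we only need a
    1-D golden-section search over A on [a_lo, a_hi] (assumed bracket).
    """
    def best_c(a):
        return (sum(y * math.exp(a * x) for x, y in zip(xs, ys))
                / sum(math.exp(2 * a * x) for x in xs))

    def e(a):  # objective E(A, C(A))
        c = best_c(a)
        return sum((c * math.exp(a * x) - y) ** 2 for x, y in zip(xs, ys))

    phi = (math.sqrt(5) - 1) / 2  # inverse golden ratio
    lo, hi = a_lo, a_hi
    m1 = hi - phi * (hi - lo)
    m2 = lo + phi * (hi - lo)
    while hi - lo > tol:
        if e(m1) < e(m2):
            hi, m2 = m2, m1
            m1 = hi - phi * (hi - lo)
        else:
            lo, m1 = m1, m2
            m2 = lo + phi * (hi - lo)
    a = (lo + hi) / 2
    return a, best_c(a)

# Data lying exactly on y = 2*e^(0.5x); the bracket [0, 1] contains A = 0.5.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.5 * x) for x in xs]
a, c = exp_fit_direct(xs, ys)
```

Unlike data linearization, this minimizes $E(A,C)$ in the original variables, at the cost of needing a bracket that contains the optimal $A$.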