Topics
- Solution of Nonlinear Equations
> Bracketing Methods
> Bisection Method
- Open Methods
> Newton-Raphson Method
> Secant Method
Basis of Bisection Method
- An equation $f(x) = 0$, where $f(x)$ is a real continuous function, has at least one root between $x_l$ and $x_u$ if $f(x_l) f(x_u) < 0$ (see the quick check below)
- If the function $f(x)$ does not change sign between two points
> roots of the equation may still exist between the two points
> there may not be any roots for the equation between the two points
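As a quick numerical illustration of the bracketing condition, consider the following Python sketch; the function $f(x) = x^3 - x - 2$ is an assumed example, not from the notes.

```python
# Bracketing check for f(x) = x^3 - x - 2 on [1, 2] (illustrative example)
def f(x):
    return x**3 - x - 2

x_l, x_u = 1.0, 2.0
# f(1) = -2 and f(2) = 4, so the product is negative and [1, 2] brackets a root
print(f(x_l) * f(x_u) < 0)  # True
```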
Step 1
- Choose $x_l$ and $x_u$ as two guesses for the root such that $f(x_l)f(x_u) < 0$, or in other words, $f(x)$ changes sign between $x_l$ and $x_u$
Step 2
- Estimate the root, $x_m$, of the equation $f(x) = 0$ as the midpoint between $x_l$ and $x_u$
> $x_m = \frac{x_l + x_u}{2}$
Step 3
- Now check the following
a) If $f(x_l)f(x_m) < 0$, then the root lies between $x_l$ and $x_m$ => $x_l = x_l$, $x_u = x_m$
b) If $f(x_l)f(x_m) > 0$, then the root lies between $x_m$ and $x_u$ => $x_l = x_m$, $x_u = x_u$
c) If $f(x_l)f(x_m) = 0$, then the root is $x_m$. Stop the algorithm if this is true.
Step 4
- Find the new estimate of the root
> $x_m = \frac{x_l + x_u}{2}$
- Find the absolute relative approximate error
> $|\epsilon_a| = \left| \frac{x_m^{\text{new}} - x_m^{\text{old}}}{x_m^{\text{new}}} \right| \times 100$, where $x_m^{\text{new}}$ and $x_m^{\text{old}}$ are the root estimates from the present and previous iterations
Step 5
- Compare the absolute relative approximate error $|\epsilon_a|$ with the pre-specified error tolerance $\epsilon_s$. If $|\epsilon_a| > \epsilon_s$, go back to Step 3; otherwise, stop the algorithm.
- We should also check whether the number of iterations is more than the maximum number of iterations allowed. If so, we terminate the algorithm.
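Putting Steps 1-5 together, here is a minimal Python sketch of the bisection loop. The name `bisection` and the parameters `eps_s` (percent tolerance) and `max_iter` are illustrative choices, not from the lecture, and the error check assumes $x_m \neq 0$.

```python
def bisection(f, x_l, x_u, eps_s=0.5, max_iter=100):
    """Bisection loop following Steps 1-5 above; eps_s is the percent tolerance."""
    if f(x_l) * f(x_u) > 0:
        raise ValueError("f(x_l) and f(x_u) must have opposite signs")  # Step 1
    x_m_old = x_l                                   # seed for the first error check
    for _ in range(max_iter):                       # iteration cap (Step 5)
        x_m = (x_l + x_u) / 2                       # Steps 2 and 4: midpoint estimate
        eps_a = abs((x_m - x_m_old) / x_m) * 100    # absolute relative approx. error (%)
        test = f(x_l) * f(x_m)                      # Step 3: select the sub-interval
        if test < 0:
            x_u = x_m                               # root lies between x_l and x_m
        elif test > 0:
            x_l = x_m                               # root lies between x_m and x_u
        else:
            return x_m                              # f(x_m) == 0: x_m is the root
        if eps_a < eps_s:                           # Step 5: tolerance met
            return x_m
        x_m_old = x_m
    return x_m

# Example: the root of x^3 - x - 2 in [1, 2] is approximately 1.5214
root = bisection(lambda x: x**3 - x - 2, 1.0, 2.0)
```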
Determining Initial Guesses
- Incremental search
> Make function evaluations at small increments across the region
> Choice of the increment length is critical
Too small => time-consuming
Too large => roots may be missed
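A minimal sketch of incremental search, assuming a user-supplied function `f`; the name `incremental_search` and the increment parameter `dx` are illustrative.

```python
def incremental_search(f, a, b, dx):
    """Scan [a, b] in increments of dx and collect intervals where f changes sign."""
    brackets = []
    x, fx = a, f(a)
    while x < b:
        x_next = min(x + dx, b)                # do not step past b
        fx_next = f(x_next)
        if fx * fx_next < 0:                   # sign change: a root lies in (x, x_next)
            brackets.append((x, x_next))
        x, fx = x_next, fx_next
    return brackets

# Example: with dx = 0.5 this returns [(1.5, 2.0)] for f(x) = x^3 - x - 2
print(incremental_search(lambda x: x**3 - x - 2, 0.0, 3.0, 0.5))
```

A smaller `dx` costs more function evaluations, while a larger `dx` can step over a pair of closely spaced roots whose sign changes cancel.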
- Advantages of the Bisection Method
> Always convergent
> The root bracket gets halved with each iteration - guaranteed.
- Drawbacks
> Slow convergence
> If one of the initial guesses is close to the root, the convergence is slower
Newton-Raphson Method
- If the initial guess at the root is $x_i$, a tangent can be extended from the point $[x_i, f(x_i)]$. The point where this tangent crosses the x-axis usually represents an improved estimate of the root.
- Since the tangent passes through $(x_{i+1}, 0)$, its slope satisfies $f'(x_i) = \frac{f(x_i)}{x_i - x_{i+1}}$, which rearranges to the update formula in Step 2 below.
1. Evaluate $f'(x)$
2. Use an initial guess $x_i$ to estimate the new value of the root, $x_{i+1}$
> $x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}$
3. Find the absolute relative approximate error
> $|\epsilon_a| = \left| \frac{x_{i+1} - x_i}{x_{i+1}} \right| \times 100$
4. Compare the absolute relative approximate error with the pre-specified relative error tolerance $\epsilon_s$. Also check whether the number of iterations has exceeded the maximum number allowed; if so, terminate the algorithm.
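The four steps above translate directly into a short Python sketch; `newton_raphson`, `eps_s`, and `max_iter` are illustrative names, and the derivative `f_prime` is assumed to be supplied by the user (Step 1).

```python
def newton_raphson(f, f_prime, x_i, eps_s=0.5, max_iter=50):
    """Newton-Raphson iteration following Steps 1-4 above."""
    for _ in range(max_iter):                       # iteration cap (Step 4)
        slope = f_prime(x_i)
        if slope == 0:
            raise ZeroDivisionError("f'(x_i) = 0: the tangent never crosses the x-axis")
        x_next = x_i - f(x_i) / slope               # Step 2: tangent-line update
        eps_a = abs((x_next - x_i) / x_next) * 100  # Step 3: approx. error (%)
        if eps_a < eps_s:                           # Step 4: tolerance met
            return x_next
        x_i = x_next
    return x_i

# Example: root of x^3 - x - 2 from the single guess x = 2
root = newton_raphson(lambda x: x**3 - x - 2, lambda x: 3 * x**2 - 1, 2.0)
```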
- Advantages
> Converges fast, if it converges.
> Requires only one guess
- Drawbacks
> It could converge slowly
> It could fail
> Division by zero when $f'(x_i) = 0$
Secant Method
- Approximate derivatives
> Newton's method: $x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}$ (1)
> Approximate the derivative by a backward divided difference: $f'(x_i) \approx \frac{f(x_i) - f(x_{i-1})}{x_i - x_{i-1}}$ (2)
> Substituting Equation (2) into Equation (1) gives the Secant method
$x_{i+1} = x_i - \frac{f(x_i)(x_i - x_{i-1})}{f(x_i) - f(x_{i-1})}$
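A minimal Python sketch of this update; `secant` and `eps_s` are illustrative names, and the loop stops on the same absolute relative approximate error used for the other methods.

```python
def secant(f, x_prev, x_i, eps_s=0.5, max_iter=50):
    """Secant iteration; the two starting guesses need not bracket the root."""
    for _ in range(max_iter):
        denom = f(x_i) - f(x_prev)
        if denom == 0:
            raise ZeroDivisionError("f(x_i) = f(x_{i-1}): the secant line is horizontal")
        x_next = x_i - f(x_i) * (x_i - x_prev) / denom  # secant update formula
        eps_a = abs((x_next - x_i) / x_next) * 100      # approx. error (%)
        if eps_a < eps_s:
            return x_next
        x_prev, x_i = x_i, x_next
    return x_i

# Example: root of x^3 - x - 2 from the guesses 1 and 2
root = secant(lambda x: x**3 - x - 2, 1.0, 2.0)
```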
- Advantages
> Converges fast, if it converges
> Requires two guesses that do not need to bracket the root
- Drawbacks
> It could fail
> Division by zero when $f(x_i) = f(x_{i-1})$