Root-Finding Algorithms: Bisection, Newton–Raphson, Secant
Core Numerical Algorithms | Numerical Methods for Scientific Computing with Python

Root-finding algorithms are essential tools for solving nonlinear equations of the form f(x) = 0. Three classical methods stand out for their practical use: the bisection method, the Newton–Raphson method, and the secant method. Each algorithm approaches the problem differently, offering distinct trade-offs in convergence speed, robustness, and the information it requires.

The bisection method is a bracketing technique that requires an initial interval [a, b] where the function changes sign, i.e., f(a) · f(b) < 0. The method repeatedly bisects the interval and selects the subinterval in which the sign change occurs. This process is continued until the interval is sufficiently small. The bisection method is guaranteed to converge if the starting interval contains a root, but its convergence is relatively slow.
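Because the bracket is halved at every step, the number of iterations bisection needs for a given tolerance can be computed in advance, before evaluating the function at all. A minimal sketch (the helper name `bisection_steps` is illustrative):

```python
import math

# Bisection halves the bracket [a, b] every iteration, so after n steps
# the remaining interval width is (b - a) / 2**n.  Solving
# (b - a) / 2**n < tol for n gives the required iteration count,
# independent of the function f itself.
def bisection_steps(a, b, tol):
    return math.ceil(math.log2((b - a) / tol))

print(bisection_steps(1, 2, 1e-6))   # 20 halvings of a unit interval
```

This predictability is the flip side of bisection's slow linear convergence: each iteration buys exactly one more binary digit of accuracy, no more and no less.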

The Newton–Raphson method is an open method that uses the function's derivative to approximate the root. Starting from an initial guess x0, it iteratively applies the formula x_{n+1} = x_n - f(x_n)/f'(x_n). This method typically converges much faster than bisection, especially when the initial guess is close to the actual root. However, it requires the computation of the derivative and may fail to converge if the guess is poor or the function behaves badly.
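The quadratic convergence is easy to observe numerically. A small sketch using f(x) = x² − 2 (chosen purely for illustration, since its root √2 is known exactly):

```python
# Newton–Raphson for f(x) = x**2 - 2: near the root, each iteration
# roughly squares the error, so the number of correct digits doubles.
f = lambda x: x**2 - 2
df = lambda x: 2*x

x = 1.5                      # initial guess
for _ in range(5):
    x = x - f(x) / df(x)     # Newton update: x_{n+1} = x_n - f(x_n)/f'(x_n)
    print(abs(x - 2**0.5))   # error shrinks quadratically
```

Printing the error at each step shows it collapsing toward machine precision in a handful of iterations, far fewer than the twenty bisection steps a comparable tolerance would demand.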

The secant method is similar to Newton–Raphson but does not require the analytical derivative. Instead, it approximates the derivative using two previous points. The update rule is x_{n+1} = x_n - f(x_n) · (x_n - x_{n-1}) / (f(x_n) - f(x_{n-1})). This method often converges faster than bisection and is more practical when the derivative is difficult to compute.
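As a quick sketch of the update rule in action, consider solving cos(x) = x, whose derivative-free treatment suits the secant method well (the equation is chosen here only as an example):

```python
import math

# Secant iteration for f(x) = cos(x) - x: the derivative in Newton's
# formula is replaced by the slope through the two most recent points.
f = lambda x: math.cos(x) - x

x0, x1 = 0.0, 1.0
for _ in range(50):
    fx0, fx1 = f(x0), f(x1)
    if fx1 == fx0:                     # flat secant line: cannot proceed
        break
    x0, x1 = x1, x1 - fx1 * (x1 - x0) / (fx1 - fx0)
    if abs(x1 - x0) < 1e-12:           # successive iterates agree
        break

print(x1)   # approximately 0.739085, where cos(x) == x
```

Note the guard against a vanishing denominator: once the two iterates land on points with equal function values, the secant line is horizontal and the update is undefined.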

Convergence criteria for all these methods typically involve checking whether the absolute value of f(x) is below a specified tolerance or whether the change in x between iterations is sufficiently small. Choosing the appropriate method depends on the problem's nature, such as the availability of derivatives and the need for guaranteed convergence.
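The two criteria (small residual, small step) can also be combined into a single test. A minimal sketch, with the helper name `converged` and the tolerances chosen for illustration:

```python
def converged(fx, x_new, x_old, f_tol=1e-8, x_tol=1e-8):
    # Residual test OR step-size test; either can also be used alone.
    return abs(fx) < f_tol or abs(x_new - x_old) < x_tol

# Example: Newton iteration on f(x) = x**3 - x - 2 with this criterion.
f  = lambda x: x**3 - x - 2
df = lambda x: 3*x**2 - 1

x = 1.5
for _ in range(100):
    x_new = x - f(x) / df(x)
    done = converged(f(x_new), x_new, x)
    x = x_new
    if done:
        break

print(x)   # ~1.52138, the real root of x**3 - x - 2
```

In practice the two tests can disagree: a very flat function can make |f(x)| tiny while x is still far from the root, and a very steep one can do the opposite, which is why libraries often expose both tolerances separately.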

```python
def f(x):
    return x**3 - x - 2

def df(x):
    return 3*x**2 - 1

def bisection(f, a, b, tol=1e-6, max_iter=100):
    """Bracketing method: halve [a, b] until the root is pinned down."""
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for i in range(max_iter):
        c = (a + b) / 2
        if abs(f(c)) < tol or (b - a) / 2 < tol:
            return c
        if f(a) * f(c) < 0:
            b = c          # root lies in [a, c]
        else:
            a = c          # root lies in [c, b]
    raise RuntimeError("Bisection did not converge")

def newton_raphson(f, df, x0, tol=1e-6, max_iter=100):
    """Open method: follow the tangent line from the current iterate."""
    x = x0
    for i in range(max_iter):
        fx = f(x)
        dfx = df(x)
        if dfx == 0:
            raise ZeroDivisionError("Derivative is zero")
        x_new = x - fx / dfx
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("Newton–Raphson did not converge")

def secant(f, x0, x1, tol=1e-6, max_iter=100):
    """Like Newton–Raphson, but the derivative is replaced by a secant slope."""
    for i in range(max_iter):
        fx0 = f(x0)
        fx1 = f(x1)
        if fx1 - fx0 == 0:
            raise ZeroDivisionError("Zero denominator in secant method")
        x2 = x1 - fx1 * (x1 - x0) / (fx1 - fx0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    raise RuntimeError("Secant method did not converge")

# Example usage:
root_bisection = bisection(f, 1, 2)
root_newton = newton_raphson(f, df, 1.5)
root_secant = secant(f, 1, 2)

print(f"Bisection root: {root_bisection:.6f}")
print(f"Newton–Raphson root: {root_newton:.6f}")
print(f"Secant root: {root_secant:.6f}")
```
Bisection Method
  • Always converges if the initial interval brackets a root;
  • Convergence is linear and relatively slow;
  • Requires function values of opposite sign at interval endpoints;
  • Does not require the derivative.
Newton–Raphson Method
  • Converges quadratically if the initial guess is close to the root;
  • May diverge or fail if the guess is poor or the derivative is zero;
  • Requires evaluation of both the function and its derivative;
  • Typically much faster than bisection when applicable.
Secant Method
  • Converges faster than bisection, usually superlinearly but not as fast as Newton–Raphson;
  • Does not require the analytical derivative, only function evaluations;
  • May fail to converge for some functions or starting values;
  • More efficient than bisection when the derivative is unavailable.

Which root-finding method would you choose if you do not have access to the derivative of the function, but you require more rapid convergence than bisection?



Section 2. Chapter 3
