Understanding Multivariate Calculus

Taylor Expansion for Multivariate Functions


The Taylor expansion is a powerful tool that allows you to approximate a function near a specific point using its derivatives. For functions of a single variable, you have already seen how the Taylor series provides a polynomial that closely matches the function near a point. When you move to functions of several variables, the idea is similar, but the formula involves gradients and higher-order derivatives like the Hessian matrix.

For a function $f(x, y)$ that is sufficiently differentiable, the second-order Taylor expansion about the point $(a, b)$ is:

$$
\begin{aligned}
f(x, y) \approx \underbrace{f(a, b)}_{\text{Base value}} &+ \underbrace{f_x(a, b)(x - a) + f_y(a, b)(y - b)}_{\text{Linear approximation (1st order)}} \\
&+ \underbrace{\frac{1}{2} \Big[ f_{xx}(a, b)(x - a)^2 + 2f_{xy}(a, b)(x - a)(y - b) + f_{yy}(a, b)(y - b)^2 \Big]}_{\text{Quadratic correction (2nd order)}}
\end{aligned}
$$
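The same expansion is often written compactly in vector form. With $\Delta = (x - a, \; y - b)^\top$, the gradient $\nabla f$, and the Hessian matrix $H_f$, it reads:

$$
f(\mathbf{x}) \approx f(\mathbf{a}) + \nabla f(\mathbf{a})^\top \Delta + \frac{1}{2}\, \Delta^\top H_f(\mathbf{a})\, \Delta
$$

Expanding the matrix products recovers exactly the scalar terms shown above; this vector form is also what the code example on this page implements.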

Here's how you build this expansion step by step:

  1. Evaluate the function at the expansion point: $f(a, b)$;
  2. Add the first-order terms, which use the gradient (the vector of partial derivatives): $f_x(a, b)(x - a)$ and $f_y(a, b)(y - b)$;
  3. Add the second-order terms, which use the second partial derivatives (entries of the Hessian matrix): $f_{xx}(a, b)(x - a)^2$, $f_{yy}(a, b)(y - b)^2$, and the mixed term $2f_{xy}(a, b)(x - a)(y - b)$, where the factor $2$ arises because $f_{xy} = f_{yx}$ for sufficiently smooth functions;
  4. Multiply the sum of the second-order terms by $\frac{1}{2}$, which is the $\frac{1}{2!}$ factor from the Taylor formula.
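To make these steps concrete, here is a minimal sketch that carries them out by hand for $f(x, y) = x^2 y + y^3$, the same function used in the SymPy example below. All derivatives are hard-coded rather than computed symbolically, so you can check each step against the formula:

```python
# Manual second-order Taylor expansion of f(x, y) = x**2 * y + y**3
# about (a, b) = (1, 2), evaluated at (1.1, 2.05).

a, b = 1.0, 2.0
dx, dy = 1.1 - a, 2.05 - b

# Step 1: base value f(a, b)
f_ab = a**2 * b + b**3                 # 1*2 + 8 = 10.0

# Step 2: first-order terms from the gradient
fx = 2 * a * b                         # f_x = 2xy        -> 4.0
fy = a**2 + 3 * b**2                   # f_y = x^2 + 3y^2 -> 13.0
linear = fx * dx + fy * dy

# Steps 3-4: second-order terms from the Hessian, times 1/2
fxx = 2 * b                            # f_xx = 2y -> 4.0
fxy = 2 * a                            # f_xy = 2x -> 2.0
fyy = 6 * b                            # f_yy = 6y -> 12.0
quadratic = 0.5 * (fxx * dx**2 + 2 * fxy * dx * dy + fyy * dy**2)

taylor = f_ab + linear + quadratic
exact = 1.1**2 * 2.05 + 2.05**3

print(f"Taylor: {taylor:.6f}, exact: {exact:.6f}")
```

The approximation comes out very close to the exact value, since $(1.1, 2.05)$ lies near the expansion point $(1, 2)$.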

This approximation gives you a quadratic surface that matches the function's value, slope, and curvature at the point $(a, b)$. The more derivative terms you include, the more accurate the approximation becomes near the expansion point.

```python
import sympy as sp
import numpy as np

# 1. Define symbols and the function
x, y = sp.symbols('x y')
f = x**2 * y + y**3

# 2. SymPy computes the gradient and the Hessian matrix symbolically
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])
hessian_f = sp.hessian(f, (x, y))

# 3. Set the coordinates
a, b = 1.0, 2.0
x_val, y_val = 1.1, 2.05

# 4. Substitute the base point (a, b) and convert to NumPy arrays
subs_dict = {x: a, y: b}
f_ab = float(f.subs(subs_dict))
grad_ab = np.array(grad_f.subs(subs_dict), dtype=float).flatten()  # Vector (2,)
hessian_ab = np.array(hessian_f.subs(subs_dict), dtype=float)      # Matrix (2, 2)

# 5. Create the difference vector (Delta)
delta = np.array([x_val - a, y_val - b])

# 6. Vector-matrix Taylor formula (using @ for matrix multiplication)
taylor_approx = f_ab + grad_ab @ delta + 0.5 * (delta @ hessian_ab @ delta)

print(f"Second-order Taylor approximation at ({x_val}, {y_val}): {taylor_approx:.4f}")
```
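As a quick sanity check on the claim that the approximation is accurate near the expansion point, the self-contained sketch below repeats the same computation and compares the second-order estimate with the exact value of $f$ at the same point, obtained by direct substitution:

```python
import sympy as sp
import numpy as np

# Same setup as above: f(x, y) = x^2 y + y^3 expanded about (1, 2)
x, y = sp.symbols('x y')
f = x**2 * y + y**3
a, b = 1.0, 2.0
x_val, y_val = 1.1, 2.05

subs_ab = {x: a, y: b}
f_ab = float(f.subs(subs_ab))
grad_ab = np.array(sp.Matrix([sp.diff(f, x), sp.diff(f, y)]).subs(subs_ab),
                   dtype=float).flatten()
hessian_ab = np.array(sp.hessian(f, (x, y)).subs(subs_ab), dtype=float)

delta = np.array([x_val - a, y_val - b])
taylor_approx = f_ab + grad_ab @ delta + 0.5 * (delta @ hessian_ab @ delta)

# Exact value, obtained by substituting (x_val, y_val) directly into f
exact = float(f.subs({x: x_val, y: y_val}))

print(f"approx = {taylor_approx:.6f}")
print(f"exact  = {exact:.6f}")
print(f"error  = {abs(exact - taylor_approx):.2e}")
```

Because $f$ here is a cubic polynomial, the gap between the exact value and the second-order estimate is precisely the third-order Taylor term, which is tiny for points this close to $(a, b)$.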

Which of the following statements best describes the purpose of the multivariate Taylor expansion?



Section 1. Chapter 9
