Principal Component Analysis Fundamentals

Eigenvalues and Eigenvectors


Definition

An eigenvector of a matrix is a nonzero vector whose direction remains unchanged when a linear transformation (represented by the matrix) is applied to it; only its length is scaled. The amount of scaling is given by the corresponding eigenvalue.
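For example, the matrix [[3, 0], [0, 1]] maps (1, 0) to (3, 0) and leaves (0, 1) unchanged, so both are eigenvectors, with eigenvalues 3 and 1; a vector such as (1, 1) changes direction under this map and is therefore not an eigenvector.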

For a covariance matrix Σ, the eigenvectors point in the directions of maximum variance, and the eigenvalues tell you how much variance lies along those directions.

Mathematically, for a matrix A, an eigenvector v, and an eigenvalue λ:

Av = λv

In PCA, the eigenvectors of the covariance matrix are the principal axes, and the eigenvalues are the variances along those axes.

```python
import numpy as np

# Using the covariance matrix from the previous code
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]

# Compute eigenvalues and eigenvectors
values, vectors = np.linalg.eig(cov_matrix)

print("Eigenvalues:", values)
print("Eigenvectors:\n", vectors)
```
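As a quick check, here is a minimal sketch (reusing the same toy data; the loop and the `np.allclose` comparison are illustrative, not part of the course code) that verifies numerically that each computed pair satisfies Av = λv:

```python
import numpy as np

# Rebuild the covariance matrix from the example above
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]

values, vectors = np.linalg.eig(cov_matrix)

# np.linalg.eig returns eigenvectors as the *columns* of `vectors`,
# so vectors[:, i] pairs with values[i]
for i in range(len(values)):
    v = vectors[:, i]
    print("A v =", cov_matrix @ v)
    print("λ v =", values[i] * v)
    print("equal:", np.allclose(cov_matrix @ v, values[i] * v))
```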
Note

The eigenvector with the largest eigenvalue points in the direction of greatest variance in the data. This is the first principal component.
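To make this concrete, here is a sketch (again reusing the toy data above; names like `order`, `first_pc`, and `explained` are illustrative, not from the course code) that sorts the eigenpairs by eigenvalue, reads off the first principal component, and projects the centered data onto it:

```python
import numpy as np

# Same toy data and covariance matrix as above
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]

values, vectors = np.linalg.eig(cov_matrix)

# Sort eigenpairs by eigenvalue, largest first
order = np.argsort(values)[::-1]
values, vectors = values[order], vectors[:, order]

first_pc = vectors[:, 0]           # direction of greatest variance
explained = values / values.sum()  # fraction of total variance per axis

print("First principal component:", first_pc)
print("Explained variance ratio:", explained)

# Projecting the centered data onto the first principal component
# gives the 1-D representation PCA would keep
print("Projected data:", X_centered @ first_pc)
```

The explained variance ratio computed this way is what you would look at when deciding how many components to keep.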


