Dimensionality Reduction with PCA

Eigenvalues and Eigenvectors

Definition

An eigenvector of a matrix is a nonzero vector whose direction remains unchanged when a linear transformation (represented by the matrix) is applied to it; only its length is scaled. The amount of scaling is given by the corresponding eigenvalue.

For a covariance matrix Σ, the eigenvectors point in the directions of maximum variance, and the eigenvalues tell you how much variance lies along those directions.

Mathematically, for a matrix A, an eigenvector v, and an eigenvalue λ:

Av = λv
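
You can check this relation numerically: multiplying a matrix by one of its eigenvectors returns that same vector, scaled by its eigenvalue. Below is a minimal sketch using an arbitrary symmetric 2x2 matrix chosen purely for illustration (not from the lesson's data):

import numpy as np

# Arbitrary symmetric matrix, used only to illustrate Av = λv
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

values, vectors = np.linalg.eig(A)

# Each column of `vectors` is the eigenvector for the matching entry of `values`
for lam, v in zip(values, vectors.T):
    print(np.allclose(A @ v, lam * v))  # True for every eigenpair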

In PCA, the eigenvectors of the covariance matrix are the principal axes, and the eigenvalues are the variances along those axes.

import numpy as np

# Using the covariance matrix from the previous code
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]

# Compute eigenvalues and eigenvectors
values, vectors = np.linalg.eig(cov_matrix)

print("Eigenvalues:", values)
print("Eigenvectors:\n", vectors)
Note

The eigenvector with the largest eigenvalue points in the direction of greatest variance in the data. This is the first principal component.
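
A minimal sketch of how that first principal component can be picked out in code (it repeats the data and eigen-decomposition from the snippet above; np.linalg.eig does not return eigenvalues in any particular order, so they are sorted first):

import numpy as np

# Same data and eigen-decomposition as the snippet above
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]
values, vectors = np.linalg.eig(cov_matrix)

# Sort eigenvalues largest first and reorder eigenvector columns to match
order = np.argsort(values)[::-1]
values_sorted = values[order]
vectors_sorted = vectors[:, order]

first_pc = vectors_sorted[:, 0]         # direction of greatest variance
projection = X_centered @ first_pc      # data projected onto that direction

print("First principal component:", first_pc)
print("Projected data:", projection)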

