Principal Component Analysis Fundamentals

Eigenvalues and Eigenvectors


Definition

An eigenvector of a matrix is a nonzero vector whose direction remains unchanged when a linear transformation (represented by the matrix) is applied to it; only its length is scaled. The amount of scaling is given by the corresponding eigenvalue.

For a covariance matrix Σ, the eigenvectors point in the directions of maximum variance, and the eigenvalues tell you how much variance lies in those directions.

Mathematically, for a matrix A, an eigenvector v, and its eigenvalue λ:

Av = λv

In PCA, the eigenvectors of the covariance matrix are the principal axes, and the eigenvalues are the variances along those axes.
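To see the relation concretely, here is a minimal sketch that checks Av = λv numerically with NumPy. The symmetric 2×2 matrix A below is an invented example, not taken from the course data:

```python
import numpy as np

# A small symmetric matrix, invented purely for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors as columns
values, vectors = np.linalg.eig(A)

# Verify A v = lambda v for each eigenpair
for i in range(len(values)):
    v = vectors[:, i]
    print(values[i], np.allclose(A @ v, values[i] * v))
```

Each column of `vectors` is one eigenvector, so the check pairs `values[i]` with `vectors[:, i]`.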

```python
import numpy as np

# Using the covariance matrix from the previous code
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]

# Compute eigenvalues and eigenvectors
values, vectors = np.linalg.eig(cov_matrix)

print("Eigenvalues:", values)
print("Eigenvectors:\n", vectors)
```
Note

The eigenvector with the largest eigenvalue points in the direction of greatest variance in the data. This is the first principal component.
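As an illustration of this note, here is a minimal sketch that reuses the arrays from the snippet above, orders the eigenpairs by eigenvalue, and reads off the first principal component. The descending sort is our own addition for illustration:

```python
import numpy as np

# Same toy data and covariance matrix as in the snippet above
X = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]
values, vectors = np.linalg.eig(cov_matrix)

# Sort the eigenpairs by eigenvalue, largest first
order = np.argsort(values)[::-1]
values, vectors = values[order], vectors[:, order]

# The first column is now the first principal component
print("First principal component:", vectors[:, 0])
print("Variance along it:", values[0])
```

The explicit sort matters because np.linalg.eig does not guarantee any particular ordering of the eigenvalues it returns.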

Question

What is the role of the eigenvalues and eigenvectors of the covariance matrix in PCA?

