Eigenvalues and Eigenvectors
An eigenvector of a matrix is a nonzero vector whose direction remains unchanged when a linear transformation (represented by the matrix) is applied to it; only its length is scaled. The amount of scaling is given by the corresponding eigenvalue.
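For example, here is a minimal numerical sketch (the matrix and vector below are illustrative, not from the lesson's data): a matrix that stretches the y-axis by 3 has any y-axis vector as an eigenvector with eigenvalue 3.

import numpy as np

# Illustrative example: this matrix scales the x-axis by 2
# and the y-axis by 3, so the axes themselves are eigenvectors.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([0.0, 1.0])   # a vector along the y-axis

# A @ v keeps the direction of v and scales its length by 3,
# so v is an eigenvector of A with eigenvalue λ = 3.
print(A @ v)               # [0. 3.], i.e. 3 * v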
For a covariance matrix Σ, the eigenvectors point in the directions of maximum variance, and the eigenvalues tell you how much variance lies along each of those directions.
Mathematically, for a matrix A with eigenvector v and corresponding eigenvalue λ:

Av = λv

In PCA, the eigenvectors of the covariance matrix are the principal axes, and the eigenvalues are the variances along those axes.
import numpy as np

# Using the covariance matrix from the previous code
X = np.array([[2.5, 2.4],
              [0.5, 0.7],
              [2.2, 2.9]])
X_centered = X - np.mean(X, axis=0)
cov_matrix = (X_centered.T @ X_centered) / X_centered.shape[0]

# Compute eigenvalues and eigenvectors
# (each column of `vectors` is one eigenvector, paired with the
# eigenvalue at the same index in `values`)
values, vectors = np.linalg.eig(cov_matrix)

print("Eigenvalues:", values)
print("Eigenvectors:\n", vectors)
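As a quick sanity check (an illustrative addition that continues the snippet above), each eigenpair returned by np.linalg.eig should satisfy the defining equation Σv = λv:

# Sanity check: for each eigenpair, cov_matrix @ v should equal λ * v.
for i in range(len(values)):
    v = vectors[:, i]   # i-th eigenvector (a column of `vectors`)
    print(np.allclose(cov_matrix @ v, values[i] * v))   # True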
The eigenvector with the largest eigenvalue points in the direction of greatest variance in the data. This is the first principal component.
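To make that concrete, here is a minimal sketch (again continuing the snippet above; the projection step is an illustrative preview, not part of the original code) that sorts the eigenpairs by eigenvalue and projects the centered data onto the first principal component:

# Sort eigenpairs so the largest eigenvalue comes first.
order = np.argsort(values)[::-1]
values = values[order]
vectors = vectors[:, order]

first_pc = vectors[:, 0]   # direction of greatest variance

# Project the centered data onto the first principal component
# to get each point's coordinate along that axis.
projected = X_centered @ first_pc
print("First PC:", first_pc)
print("Projected coordinates:", projected)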