Linear Algebra · 30 min

Eigenvalues & Eigenvectors

Special vectors that a matrix only scales, revealing the fundamental behavior of linear transformations


Why This Matters

Most vectors change direction when you multiply them by a matrix. But some special vectors only get scaled -- they point in the same direction (or exactly opposite) after the transformation. These are eigenvectors, and the scale factor is the eigenvalue. If Av = lambda*v, then v is an eigenvector of A with eigenvalue lambda. This seemingly simple equation is one of the most important in all of mathematics.

Eigenvalues and eigenvectors reveal the intrinsic behavior of a transformation. A matrix might look complicated, but its eigenvectors tell you the natural directions along which the transformation simply stretches or compresses. In those directions, the complex operation of matrix multiplication reduces to simple scalar multiplication. This is why eigenvalues appear everywhere: in Google PageRank (the dominant eigenvector of the web graph), in quantum mechanics (energy levels are eigenvalues), in vibration analysis (natural frequencies), and in principal component analysis for data science.

Finding eigenvalues requires solving the characteristic polynomial det(A - lambda*I) = 0. For a 2x2 matrix, this gives a quadratic equation. For a 3x3, a cubic. The roots of this polynomial are the eigenvalues. Once you have an eigenvalue, you find the corresponding eigenvectors by solving the system (A - lambda*I)*v = 0, which is a standard null space computation.

Visual Model

  • Matrix A: n x n
  • Characteristic polynomial: det(A - lambda*I) = 0
  • Eigenvalues: roots of the characteristic polynomial
  • Eigenvectors: solve (A - lambda*I)v = 0, a null space computation
  • Eigenvector equation: Av = lambda*v
  • Diagonalization: A = PDP^(-1)
  • Matrix powers: A^k = PD^kP^(-1)

The full process at a glance.

Eigenvalues come from the characteristic polynomial. Eigenvectors come from the null space of (A - lambda*I). Together, they diagonalize the matrix.
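
The last two rows above (diagonalization and matrix powers) can be made concrete in code. The sketch below is illustrative only, for the 2x2 case with distinct real eigenvalues; helper names such as `eigenvector2x2` and `matMul2x2` are ours, not part of the lesson.

```javascript
// Sketch: diagonalize a 2x2 matrix with distinct real eigenvalues,
// then compute a matrix power via A^k = P D^k P^(-1).

// Closed-form eigenvector for a 2x2 matrix A and eigenvalue lambda,
// found by solving (A - lambda*I)v = 0.
function eigenvector2x2(A, lambda) {
  const [a, b] = A[0];
  if (b !== 0) return [b, lambda - a];   // first row: (a - lambda)*b + b*(lambda - a) = 0
  return a === lambda ? [1, 0] : [0, 1]; // b = 0: triangular case, pick the matching axis
}

function matMul2x2(M, N) {
  return [
    [M[0][0]*N[0][0] + M[0][1]*N[1][0], M[0][0]*N[0][1] + M[0][1]*N[1][1]],
    [M[1][0]*N[0][0] + M[1][1]*N[1][0], M[1][0]*N[0][1] + M[1][1]*N[1][1]]
  ];
}

function inverse2x2(M) {
  const det = M[0][0]*M[1][1] - M[0][1]*M[1][0];
  return [[ M[1][1]/det, -M[0][1]/det],
          [-M[1][0]/det,  M[0][0]/det]];
}

const A = [[4, 1], [2, 3]];                  // eigenvalues 5 and 2
const v1 = eigenvector2x2(A, 5);             // [1, 1]
const v2 = eigenvector2x2(A, 2);             // [1, -2]

const P = [[v1[0], v2[0]], [v1[1], v2[1]]];  // eigenvectors as columns
const D3 = [[5 ** 3, 0], [0, 2 ** 3]];       // D^3 is just the eigenvalues cubed

const A3raw = matMul2x2(matMul2x2(P, D3), inverse2x2(P));
const A3 = A3raw.map(row => row.map(Math.round)); // round off floating-point noise
console.log(A3);                             // [[86, 39], [78, 47]] -- same as A*A*A
```

Notice that cubing A costs only three scalar cubes plus two fixed matrix multiplications -- in the eigenvector basis, the transformation is just scalar multiplication.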

Code Example

// Eigenvalues and Eigenvectors in JavaScript

// For a 2x2 matrix [[a,b],[c,d]]:
// Characteristic polynomial: lambda^2 - (a+d)*lambda + (ad-bc) = 0
// That is: lambda^2 - trace*lambda + det = 0

function eigenvalues2x2(A) {
  const trace = A[0][0] + A[1][1];
  const det = A[0][0] * A[1][1] - A[0][1] * A[1][0];
  const discriminant = trace * trace - 4 * det;
  if (discriminant < 0) {
    // Complex eigenvalues: no real vector keeps its direction (e.g. rotations)
    return { real: false, msg: "Complex eigenvalues" };
  }
  const sqrtD = Math.sqrt(discriminant);
  return [
    (trace + sqrtD) / 2,
    (trace - sqrtD) / 2
  ];
}

const A = [[4, 1], [2, 3]];
const evals = eigenvalues2x2(A);
console.log("Eigenvalues of A:", evals); // [5, 2]

// Verify: Av = lambda * v
// For lambda = 5, solve (A - 5I)v = 0:
// [[-1, 1], [2, -2]] -> v = [1, 1] (or any scalar multiple)
function matVecMul(M, v) {
  return [
    M[0][0] * v[0] + M[0][1] * v[1],
    M[1][0] * v[0] + M[1][1] * v[1]
  ];
}

const v1 = [1, 1];
console.log("A * [1,1]:", matVecMul(A, v1));     // [5, 5]
console.log("5 * [1,1]:", [5 * v1[0], 5 * v1[1]]); // [5, 5] -- matches!

// For lambda = 2, solve (A - 2I)v = 0:
// [[2, 1], [2, 1]] -> v = [1, -2]
const v2 = [1, -2];
console.log("A * [1,-2]:", matVecMul(A, v2));       // [2, -4]
console.log("2 * [1,-2]:", [2 * v2[0], 2 * v2[1]]); // [2, -4] -- matches!

// Check: non-eigenvector changes direction
const w = [1, 0];
console.log("A * [1,0]:", matVecMul(A, w)); // [4, 2] -- not a scalar multiple of [1,0]

Interactive Experiment

Try these exercises:

  • Compute the eigenvalues of [[2, 0], [0, 3]]. This is a diagonal matrix. What do you notice about the eigenvalues and the diagonal entries?
  • Compute the eigenvalues of [[0, -1], [1, 0]]. This is a 90-degree rotation. Why are there no real eigenvalues? (No vector keeps its direction under rotation.)
  • For A = [[3, 1], [0, 2]], find the eigenvalues, then find an eigenvector for each. Verify that Av = lambda*v for each pair.
  • Multiply an eigenvector by different scalars (2v, -3v, 0.5v). Are the results still eigenvectors? Why?
  • Take A = [[2, 1], [1, 2]]. Compute the eigenvalues and verify that their sum equals the trace and their product equals the determinant.

Coding Challenge

Eigenvalue Verifier

Write a function called `verifyEigen(A, eigenvalue, eigenvector)` that returns true if Av = lambda*v (within a small tolerance of 0.0001). A is a 2x2 matrix, eigenvalue is a number, and eigenvector is a 2-element array. Compute A*eigenvector and compare each component to eigenvalue*eigenvector[i].
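
If you want to check your work afterwards, one possible solution sketch (following the signature and tolerance given in the prompt):

```javascript
// Sketch of one possible solution: returns true if A*v is within
// tolerance of eigenvalue*v in every component.
function verifyEigen(A, eigenvalue, eigenvector) {
  const TOL = 0.0001; // tolerance from the prompt
  const Av = [
    A[0][0] * eigenvector[0] + A[0][1] * eigenvector[1],
    A[1][0] * eigenvector[0] + A[1][1] * eigenvector[1]
  ];
  return Av.every((component, i) =>
    Math.abs(component - eigenvalue * eigenvector[i]) < TOL
  );
}

console.log(verifyEigen([[4, 1], [2, 3]], 5, [1, 1]));  // true
console.log(verifyEigen([[4, 1], [2, 3]], 5, [1, 0]));  // false -- [1,0] is not an eigenvector
```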


Real-World Usage

Eigenvalues and eigenvectors are among the most widely applied concepts in science and engineering:

  • Google PageRank: The ranking of web pages is the dominant eigenvector of the web link matrix. Pages with high eigenvector centrality are ranked higher.
  • Principal Component Analysis (PCA): The principal components of a dataset are the eigenvectors of the covariance matrix. The largest eigenvalues correspond to the directions of greatest variance.
  • Quantum mechanics: Observable quantities (energy, momentum) are eigenvalues of their corresponding operators. Measurement collapses the state to an eigenvector.
  • Vibration analysis: The natural frequencies of a bridge, building, or guitar string are eigenvalues of the system stiffness matrix. Resonance occurs when excitation matches an eigenvalue.
  • Markov chains: The steady-state distribution of a Markov chain is the eigenvector corresponding to eigenvalue 1 of the transition matrix.

Connections