Linear Algebra · 25 min

Orthogonality

Perpendicular vectors, projections, and the Gram-Schmidt process for building orthogonal bases


Why This Matters

Two vectors are orthogonal when they are perpendicular -- their dot product is zero. Orthogonality is the linear algebra version of independence at its strongest: orthogonal vectors do not share any component of their direction. This makes computations dramatically simpler. When you work in an orthogonal basis, decomposing a vector into components requires only dot products instead of solving systems of equations. This is why orthogonality is central to signal processing, data compression, and numerical methods.
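The claim that an orthogonal basis reduces decomposition to dot products can be sketched directly. In this illustrative example the basis is the standard basis rotated 45 degrees; each coordinate of v is a single dot product, with no system of equations to solve:

```javascript
// Coordinates in an orthonormal basis are just dot products.
function dot(a, b) {
  return a.reduce((sum, val, i) => sum + val * b[i], 0);
}

// An orthonormal basis for R^2 (standard basis rotated 45 degrees).
const e1 = [Math.SQRT1_2, Math.SQRT1_2];
const e2 = [Math.SQRT1_2, -Math.SQRT1_2];

const v = [3, 1];
const c1 = dot(v, e1); // coordinate along e1
const c2 = dot(v, e2); // coordinate along e2

// Reconstruct v as c1*e1 + c2*e2.
const rebuilt = e1.map((x, i) => c1 * x + c2 * e2[i]);
console.log(rebuilt); // ~[3, 1]
```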

Projection is the operation of finding the component of one vector in the direction of another. When you project vector b onto vector a, you get the closest point on the line through a to the point b. This simple geometric operation is the foundation of least-squares regression, Fourier analysis, and every algorithm that finds the "best approximation" in a subspace. Projection answers the question: how much of b points in the direction of a?
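The "closest point" property can be checked numerically: the residual b - proj_a(b) is orthogonal to a, and every other point on the line through a is farther from b. A small sketch with arbitrarily chosen vectors:

```javascript
function dot(a, b) { return a.reduce((s, v, i) => s + v * b[i], 0); }
function project(a, b) {
  const scalar = dot(a, b) / dot(a, a);
  return a.map(v => scalar * v);
}

const a = [1, 1];
const b = [2, 3];
const p = project(a, b);                    // [2.5, 2.5]
const residual = b.map((v, i) => v - p[i]); // [-0.5, 0.5]
console.log(dot(residual, a));              // 0: residual is orthogonal to a

// Distance from b to the point t*a on the line; t = 2.5 is the projection.
const distTo = t => Math.hypot(b[0] - t * a[0], b[1] - t * a[1]);
console.log(distTo(2.5) < distTo(2.4), distTo(2.5) < distTo(2.6)); // true true
```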

The Gram-Schmidt process takes any set of linearly independent vectors and produces an orthogonal (or orthonormal) set that spans the same space. It works iteratively: take the first vector as-is, then subtract the projection of the second onto the first to make them orthogonal, then subtract projections of the third onto both previous vectors, and so on. This algorithm is used constantly in numerical linear algebra, QR factorization, and building the orthogonal matrices that power SVD.


Visual Model

  • Vector a: the reference direction
  • Vector b: the vector to decompose
  • Dot Product: a . b = sum(ai * bi)
  • Projection: proj_a(b) = (a.b / a.a) * a
  • Orthogonal Check: a . b = 0?
  • Residual: b - proj_a(b)
  • Gram-Schmidt: iterative orthogonalization
  • Orthonormal Basis: unit orthogonal vectors


The dot product tests orthogonality. Projection decomposes a vector along a direction. Gram-Schmidt builds an orthogonal basis from any independent set.

Code Example

// Orthogonality, Projection, and Gram-Schmidt in JavaScript

// Dot product
function dot(a, b) {
  return a.reduce((sum, val, i) => sum + val * b[i], 0);
}

console.log("[1,0] . [0,1]:", dot([1, 0], [0, 1])); // 0 (orthogonal)
console.log("[3,4] . [-4,3]:", dot([3, 4], [-4, 3])); // 0 (orthogonal)
console.log("[1,2] . [3,4]:", dot([1, 2], [3, 4])); // 11 (not orthogonal)

// Check orthogonality
function isOrthogonal(a, b) {
  // Compare against a tolerance rather than exactly 0 to absorb floating-point error.
  return Math.abs(dot(a, b)) < 0.0001;
}
console.log("[3,4] orthogonal to [-4,3]?", isOrthogonal([3, 4], [-4, 3])); // true

// Vector projection of b onto a
function project(a, b) {
  const scalar = dot(a, b) / dot(a, a);
  return a.map(v => scalar * v);
}
console.log("proj of [3,4] onto [1,0]:", project([1, 0], [3, 4])); // [3, 0]
console.log("proj of [2,3] onto [1,1]:", project([1, 1], [2, 3])); // [2.5, 2.5]

// Magnitude and normalize
function magnitude(v) {
  return Math.sqrt(dot(v, v));
}
function normalize(v) {
  const mag = magnitude(v);
  return v.map(x => x / mag);
}

// Gram-Schmidt process
function gramSchmidt(vectors) {
  const ortho = [];
  for (const v of vectors) {
    let u = [...v];
    // Subtract v's projection onto every previously accepted direction.
    for (const prev of ortho) {
      const proj = project(prev, v);
      u = u.map((val, i) => val - proj[i]);
    }
    ortho.push(u);
  }
  // Note: linearly dependent input produces a zero vector here,
  // and normalizing it divides by zero (yielding NaN).
  return ortho.map(normalize);
}

const basis = gramSchmidt([[1, 1], [1, 0]]);
console.log("Orthonormal basis:", basis.map(v => v.map(x => +x.toFixed(4))));
// Roughly [[0.7071, 0.7071], [0.7071, -0.7071]]
console.log("Orthogonal check:", isOrthogonal(basis[0], basis[1])); // true

Interactive Experiment

Try these exercises:

  • Find a vector orthogonal to [2, 5]. (Hint: if a . b = 0, try b = [-5, 2].) Verify using the dot product.
  • Project [4, 3] onto [1, 0] and onto [0, 1]. What do you get? These projections are just the x and y components.
  • Project [4, 3] onto [1, 1]. Is the projection closer to [4, 3] or to the origin? Compute the distance to verify.
  • Run Gram-Schmidt on [[1, 1, 0], [1, 0, 1], [0, 1, 1]]. Verify that all three output vectors are mutually orthogonal.
  • What happens if you try Gram-Schmidt on linearly dependent vectors like [[1, 2], [2, 4]]? Why does this fail?
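The last two exercises can be checked with a short script (redefining the same functions as the Code Example so it runs on its own):

```javascript
function dot(a, b) { return a.reduce((s, v, i) => s + v * b[i], 0); }
function project(a, b) { const s = dot(a, b) / dot(a, a); return a.map(v => s * v); }
function normalize(v) { const m = Math.sqrt(dot(v, v)); return v.map(x => x / m); }
function gramSchmidt(vectors) {
  const ortho = [];
  for (const v of vectors) {
    let u = [...v];
    for (const prev of ortho) {
      const proj = project(prev, v);
      u = u.map((val, i) => val - proj[i]);
    }
    ortho.push(u);
  }
  return ortho.map(normalize);
}

const q = gramSchmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]]);
// All three pairwise dot products should be zero up to floating-point error.
console.log(dot(q[0], q[1]), dot(q[0], q[2]), dot(q[1], q[2]));

// Dependent input: [2, 4] minus its projection onto [1, 2] is the zero
// vector, and normalizing it divides by zero.
console.log(gramSchmidt([[1, 2], [2, 4]])[1]); // [NaN, NaN]
```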

Quick Quiz

Coding Challenge

Dot Product and Projection

Write three functions: (1) `dotProduct(a, b)` that returns the dot product of two vectors, (2) `isOrthogonal(a, b)` that returns true if the dot product is zero (within tolerance 0.0001), and (3) `projectOnto(a, b)` that returns the projection of vector b onto vector a using the formula proj_a(b) = (a.b / a.a) * a.


Real-World Usage

Orthogonality is one of the most practically useful concepts in applied mathematics:

  • Signal processing: Fourier analysis decomposes signals into orthogonal sinusoidal components. Because they are orthogonal, each frequency can be analyzed independently.
  • Least-squares regression: The best-fit line minimizes the sum of squared residuals. Geometrically, the residual vector is orthogonal to the column space of the data matrix.
  • QR factorization: Gram-Schmidt produces the Q (orthogonal) and R (upper triangular) matrices used in one of the most numerically stable methods for solving linear systems and computing eigenvalues.
  • Compression (JPEG, MP3): These formats use orthogonal transforms (DCT, FFT) to convert data into a basis where most coefficients are near zero, enabling compression by discarding small components.
  • Machine learning: Orthogonal weight initialization in neural networks helps prevent vanishing and exploding gradients during training.
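The least-squares bullet above can be made concrete in the one-column case: fitting y ≈ c·x amounts to projecting y onto the column x, and the resulting residual is orthogonal to x. The data values here are illustrative:

```javascript
function dot(a, b) { return a.reduce((s, v, i) => s + v * b[i], 0); }

// One-column least squares: find c minimizing ||y - c*x||.
// The minimizer is the projection coefficient c = (x.y) / (x.x).
const x = [1, 2, 3, 4];
const y = [2.1, 3.9, 6.2, 8.1];
const c = dot(x, y) / dot(x, x);

const residual = y.map((yi, i) => yi - c * x[i]);
console.log("slope:", c);                       // ~2.03
console.log("residual . x:", dot(residual, x)); // ~0: residual is orthogonal to x
```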

Connections