
Linear Transformations

Functions that preserve vector addition and scalar multiplication, and their matrix representations


Why This Matters

A linear transformation is a function between vector spaces that preserves the two fundamental operations: vector addition and scalar multiplication. If you add two vectors and then transform the sum, you get the same result as transforming each vector first and then adding. This seemingly simple property is incredibly powerful -- it means that understanding what a transformation does to a few basis vectors tells you what it does to every vector in the space.
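As a small sketch of the basis-vector idea: the columns of a matrix are exactly the images of the standard basis vectors, so knowing those two columns determines the whole transformation. The matrix `A` and the helper name `fromBasis` below are illustrative choices, not part of the lesson.

```javascript
// The columns of a 2x2 matrix are the images of the basis vectors:
// T(e1) is the first column, T(e2) is the second.
const A = [[2, 1], [0, 3]]; // arbitrary example matrix

const Te1 = [A[0][0], A[1][0]]; // T([1, 0]) = [2, 0]
const Te2 = [A[0][1], A[1][1]]; // T([0, 1]) = [1, 3]

// Any T([x, y]) is just x*T(e1) + y*T(e2) -- linearity in action
function fromBasis(x, y) {
  return [x * Te1[0] + y * Te2[0], x * Te1[1] + y * Te2[1]];
}

console.log(fromBasis(4, 5)); // [13, 15], same as multiplying A by [4, 5]
```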

Every matrix multiplication is a linear transformation, and every linear transformation between finite-dimensional spaces can be represented as a matrix. Rotations, reflections, scalings, projections, and shears are all linear transformations. When a GPU transforms millions of vertices per frame, it is applying linear transformations. When a neural network multiplies inputs by weight matrices, each layer is a linear transformation (before the nonlinearity).

The kernel (null space) of a transformation is the set of all vectors that map to zero. The image (range) is the set of all possible outputs. These two sets tell you everything about what the transformation destroys and what it can produce. A transformation is one-to-one exactly when its kernel contains only the zero vector, and it is onto exactly when its image is the entire target space.
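For instance, the x-axis projection makes both sets concrete: every vector of the form [0, y] is sent to zero (the kernel is the y-axis), and every output has the form [x, 0] (the image is the x-axis). A minimal numerical probe, with the helper name `project` chosen for illustration:

```javascript
// Probe the kernel and image of the projection [[1, 0], [0, 0]]
function project(v) {
  return [1 * v[0] + 0 * v[1], 0 * v[0] + 0 * v[1]];
}

console.log(project([0, 7])); // [0, 0] -- in the kernel (destroyed)
console.log(project([3, 7])); // [3, 0] -- image lies on the x-axis
```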


Visual Model

  • Input Space: domain vectors
  • Matrix A: represents T
  • Output Space: codomain vectors
  • Kernel (Null Space): vectors with T(v) = 0
  • Image (Range): all possible outputs T(v)
  • Rotation: preserves lengths
  • Projection: drops a dimension
  • Scaling: stretches the axes


A linear transformation maps input vectors to output vectors via matrix multiplication. The kernel is what gets destroyed; the image is what gets produced.

Code Example

// Linear Transformations in JavaScript

// Apply a 2x2 transformation matrix to a 2D vector
function transform(A, v) {
  return [
    A[0][0] * v[0] + A[0][1] * v[1],
    A[1][0] * v[0] + A[1][1] * v[1]
  ];
}

// Rotation by theta radians
function rotationMatrix(theta) {
  const c = Math.cos(theta);
  const s = Math.sin(theta);
  return [[c, -s], [s, c]];
}

// Rotate [1, 0] by 90 degrees (pi/2)
const R90 = rotationMatrix(Math.PI / 2);
console.log("Rotate [1,0] by 90 deg:",
  transform(R90, [1, 0]).map(v => Math.round(v))); // [0, 1]

// Projection onto x-axis
const projX = [[1, 0], [0, 0]];
console.log("Project [3,4] onto x:", transform(projX, [3, 4])); // [3, 0]

// Scaling
const scale2x = [[2, 0], [0, 2]];
console.log("Scale [3,4] by 2:", transform(scale2x, [3, 4])); // [6, 8]

// Reflection across y-axis
const reflectY = [[-1, 0], [0, 1]];
console.log("Reflect [3,4] across y:", transform(reflectY, [3, 4])); // [-3, 4]

// Transform a set of points (a square)
const square = [[0,0], [1,0], [1,1], [0,1]];
const shear = [[1, 1], [0, 1]];
console.log("Sheared square:");
square.forEach(p => console.log(" ", transform(shear, p)));
// [0,0], [1,0], [2,1], [1,1] -- parallelogram!

// Verify linearity: T(u+v) = T(u) + T(v)
const u = [1, 2], v = [3, -1];
const Tu = transform(shear, u);
const Tv = transform(shear, v);
const Tuv = transform(shear, [u[0]+v[0], u[1]+v[1]]);
console.log("T(u+v):", Tuv);
console.log("T(u)+T(v):", [Tu[0]+Tv[0], Tu[1]+Tv[1]]); // Same!

Interactive Experiment

Try these exercises:

  • Apply the rotation matrix for 45 degrees to the vector [1, 0]. What do you get? Is the length preserved?
  • Apply the projection matrix [[1,0],[0,0]] to several vectors. What is the kernel? What is the image?
  • Compose two transformations by multiplying their matrices: first rotate 90 degrees, then reflect across the x-axis. Does the order matter?
  • Create a shear matrix [[1, k], [0, 1]] for different values of k. Apply it to a unit square and observe the parallelogram.
  • Verify that rotation preserves the dot product of two vectors: u dot v = T(u) dot T(v).
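One way to check the composition exercise above is to multiply the matrices directly: applying B first and A second corresponds to the product A*B. The helper `matMul` and the exact-integer matrices below are illustrative (the 90-degree rotation is written with integer entries rather than via `rotationMatrix`, to avoid floating-point noise):

```javascript
// 2x2 matrix product: C[i][j] = sum_k A[i][k] * B[k][j]
function matMul(A, B) {
  return [
    [A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
    [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]
  ];
}

const R90 = [[0, -1], [1, 0]];      // rotate 90 degrees (exact integer form)
const reflectX = [[1, 0], [0, -1]]; // reflect across the x-axis

console.log(matMul(reflectX, R90)); // rotate, then reflect: [[0,-1],[-1,0]]
console.log(matMul(R90, reflectX)); // reflect, then rotate: [[0,1],[1,0]]
// The two products differ, so composition order matters.
```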


Coding Challenge

Transformation Applier

Write a function called `applyTransform(A, points)` that takes a 2x2 transformation matrix A and an array of 2D points, and returns a new array with each point transformed by A. Each point is a 2-element array [x, y], and the result for each point is [A[0][0]*x + A[0][1]*y, A[1][0]*x + A[1][1]*y].
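If you want to check your answer, here is one possible solution sketch that follows the formula in the spec above (try it yourself first):

```javascript
// Apply a 2x2 matrix A to every point in an array of [x, y] pairs
function applyTransform(A, points) {
  return points.map(([x, y]) => [
    A[0][0] * x + A[0][1] * y,
    A[1][0] * x + A[1][1] * y
  ]);
}

console.log(applyTransform([[2, 0], [0, 2]], [[1, 1], [3, 4]])); // [[2,2],[6,8]]
```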


Real-World Usage

Linear transformations are the mathematical foundation of many systems:

  • Computer graphics: Every model-view-projection transformation in 3D rendering is a composition of linear transformations. Rotation, scaling, and perspective are all matrices.
  • Machine learning: Each layer of a neural network applies a linear transformation (weight matrix multiplication) followed by a nonlinear activation. The linear part is what gives the network its capacity.
  • Image processing: Convolutions, blurs, edge detection, and many filters can be understood as linear transformations on the pixel-vector representation of an image.
  • Robotics: Forward and inverse kinematics use chains of linear transformations to compute where a robot arm endpoint is (or should be) based on joint angles.
  • Quantum computing: Quantum gates are unitary linear transformations on qubit state vectors. Every quantum algorithm is a sequence of such transformations.
