Why This Matters
A matrix is a rectangular grid of numbers, and it is the single most important data structure in scientific computing. When a neural network processes an image, it multiplies the pixel data by weight matrices at every layer. When a game engine renders a scene, it transforms every vertex through a chain of 4x4 matrices. When Google ranks web pages, it multiplies a billion-entry vector by a massive matrix over and over.
Matrix multiplication is the engine that powers all of this. Unlike ordinary multiplication of numbers, matrix multiplication is not commutative -- the order matters: AB is generally not equal to BA. This asymmetry is not a quirk; it reflects the fact that applying transformation A and then B is different from applying B and then A. Understanding how the rows of one matrix pair with the columns of the other is the key to the whole operation.
The transpose of a matrix flips it across its diagonal, turning rows into columns and columns into rows. Transposition appears everywhere: in computing dot products, in defining symmetric matrices, in the normal equations for least-squares regression, and in the definition of orthogonal matrices. It is a simple operation with deep consequences.
Define Terms
Visual Model
Matrix multiplication combines rows of A with columns of B. The transpose flips a matrix across its diagonal.
Code Example
// Matrix operations in JavaScript
// Matrices are arrays of row-arrays
const A = [[1, 2], [3, 4]];
const B = [[5, 6], [7, 8]];

// Matrix addition: element-wise sum (A and B must have the same shape)
function matAdd(A, B) {
  return A.map((row, i) => row.map((val, j) => val + B[i][j]));
}
console.log("A + B:", matAdd(A, B)); // [[6,8],[10,12]]

// Transpose: result[j][i] = A[i][j]
function transpose(A) {
  const rows = A.length, cols = A[0].length;
  const result = [];
  for (let j = 0; j < cols; j++) {
    result[j] = [];
    for (let i = 0; i < rows; i++) {
      result[j][i] = A[i][j];
    }
  }
  return result;
}
console.log("A^T:", transpose(A)); // [[1,3],[2,4]]

// Matrix multiplication: C[i][j] is the dot product of row i of A
// with column j of B. A is m x k, B is k x n, C is m x n.
function matMul(A, B) {
  const m = A.length;
  const n = B[0].length;
  const k = B.length; // inner dimension: must equal A[0].length
  const C = Array.from({length: m}, () => new Array(n).fill(0));
  for (let i = 0; i < m; i++) {
    for (let j = 0; j < n; j++) {
      for (let p = 0; p < k; p++) {
        C[i][j] += A[i][p] * B[p][j];
      }
    }
  }
  return C;
}
console.log("A * B:", matMul(A, B)); // [[19,22],[43,50]]

// Verify: AB is not equal to BA
console.log("B * A:", matMul(B, A)); // [[23,34],[31,46]]
console.log("AB === BA?",
  JSON.stringify(matMul(A, B)) === JSON.stringify(matMul(B, A))); // false -- not commutative!
Interactive Experiment
Try these exercises:
- Multiply [[1,0],[0,1]] by any 2x2 matrix. What do you get? This is the identity matrix -- the matrix equivalent of multiplying by 1.
- Try multiplying a 2x3 matrix by a 3x2 matrix. What size is the result? Now try the reverse order. What changes?
- Transpose a 3x2 matrix. Verify that (A^T)^T gives you back the original A.
- Compute A * A^T for A = [[1,2],[3,4]]. Is the result symmetric? Why?
- Create a 3x3 matrix where A[i][j] = A[j][i] for all entries. Verify that transposing it changes nothing.
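Several of these exercises can be checked in code. The sketch below verifies the identity-matrix and symmetry exercises; it redefines compact versions of the matMul and transpose helpers (same logic as the Code Example above) so it runs on its own:

```javascript
// Compact helpers, equivalent to the Code Example above
function matMul(A, B) {
  return A.map(row =>
    B[0].map((_, j) => row.reduce((sum, val, p) => sum + val * B[p][j], 0)));
}
function transpose(A) {
  return A[0].map((_, j) => A.map(row => row[j]));
}

const I = [[1, 0], [0, 1]]; // 2x2 identity matrix
const A = [[1, 2], [3, 4]];

// Identity exercise: I * A leaves A unchanged
console.log(matMul(I, A)); // [[1,2],[3,4]]

// Symmetry exercise: A * A^T is symmetric -- entry (i,j) equals entry (j,i),
// because (A A^T)^T = (A^T)^T A^T = A A^T
const S = matMul(A, transpose(A));
console.log(S);                   // [[5,11],[11,25]]
console.log(S[0][1] === S[1][0]); // true
```

Working the same checks by hand first, then confirming in code, is a good way to catch off-by-one mistakes in your row/column indexing.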
Quick Quiz
Coding Challenge
Write a function called `matMul(A, B)` that takes two matrices (2D arrays) and returns their product. Matrix A has dimensions m x n and matrix B has dimensions n x p. The result should be an m x p matrix where result[i][j] is the dot product of row i of A with column j of B.
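One way to sanity-check a solution is with non-square inputs, where the dimension rule (m x n times n x p gives m x p) becomes visible. This harness assumes your function is named `matMul` as in the challenge; the placeholder implementation below can be replaced with your own, and the expected values were computed by hand:

```javascript
function matMul(A, B) { // replace this body with your own implementation
  const C = Array.from({length: A.length}, () => new Array(B[0].length).fill(0));
  for (let i = 0; i < A.length; i++)
    for (let j = 0; j < B[0].length; j++)
      for (let p = 0; p < B.length; p++)
        C[i][j] += A[i][p] * B[p][j];
  return C;
}

const A = [[1, 2, 3], [4, 5, 6]];      // 2x3
const B = [[7, 8], [9, 10], [11, 12]]; // 3x2

console.log(matMul(A, B)); // 2x2: [[58,64],[139,154]]
console.log(matMul(B, A)); // 3x3 -- reversing the order changes the shape, not just the entries
```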
Real-World Usage
Matrix operations are the computational backbone of modern technology:
- Deep learning: Neural network forward passes are chains of matrix multiplications. GPUs are designed specifically to do these multiplications fast. A single GPT inference involves billions of matrix operations.
- Computer graphics: Every 3D transformation (rotation, scaling, translation, projection) is a matrix multiplication. The graphics pipeline multiplies model, view, and projection matrices to render each frame.
- Data science: Datasets are naturally represented as matrices (rows are samples, columns are features). Operations like covariance, correlation, and PCA are all matrix operations.
- Cryptography: Many encryption schemes use matrix operations over finite fields. Hill ciphers directly encode messages as vectors and encrypt by matrix multiplication.
- Physics simulation: Finite element analysis, fluid dynamics, and structural engineering all reduce to solving large systems of linear equations, which means massive matrix computations.
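As a tiny illustration of the graphics use case, the sketch below (a toy example, not production rendering code) composes a 2D rotation with a scale. For column vectors, the product matMul(rotate90, scaleX) means "scale first, then rotate" -- and swapping the order of composition gives a different matrix, which is exactly the non-commutativity discussed earlier:

```javascript
function matMul(A, B) {
  return A.map(row =>
    B[0].map((_, j) => row.reduce((s, v, p) => s + v * B[p][j], 0)));
}

const rotate90 = [[0, -1], [1, 0]]; // 90-degree counterclockwise rotation
const scaleX = [[2, 0], [0, 1]];    // stretch x by a factor of 2

// Scale then rotate vs. rotate then scale (acting on column vectors):
console.log(matMul(rotate90, scaleX)); // [[0,-1],[2,0]]
console.log(matMul(scaleX, rotate90)); // [[0,-2],[1,0]]
```

Real graphics pipelines do the same thing with 4x4 matrices so that translation and projection can join the chain.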