Mathematics · 25 min

Dot Product & Norms

Measuring similarity and magnitude with dot products, norms, and cosine similarity


Why This Matters

You have two users described by vectors of preferences. How similar are they? You have a word represented as a 300-dimensional vector. Which other words are closest to it? These are questions about dot products and norms.

The dot product measures how much two vectors point in the same direction. The norm measures how long a vector is. Combine them and you get cosine similarity -- the standard way to compare embeddings in search engines, recommendation systems, and large language models. If you want to understand how AI "knows" that "king" is to "queen" as "man" is to "woman," you need to understand these operations.

Define Terms

  • Dot product: for two vectors a and b of the same length, a · b = a₁b₁ + a₂b₂ + ... + aₙbₙ. It is large and positive when the vectors point in similar directions, zero when they are perpendicular, and negative when they point in opposing directions.
  • Norm (L2 norm): ||v|| = √(v₁² + ... + vₙ²), the length (magnitude) of a vector.
  • Cosine similarity: cos θ = (a · b) / (||a|| ||b||), the dot product of the two vectors after normalizing each to unit length. It ranges from -1 (opposite) to 1 (same direction).

Visual Model

[Visual model: vectors A = [3, 4] and B = [1, 2], showing the angle θ between them and the projection of one onto the other. Dot product A · B = 11, ||A|| = 5, ||B|| ≈ 2.24, cosine similarity ≈ 0.98.]

The dot product and norm combine to give cosine similarity -- the key metric for comparing vectors.

Code Example

// Dot product: multiply components, then sum
function dotProduct(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) {
    sum += a[i] * b[i];
  }
  return sum;
}

const a = [3, 4];
const b = [1, 2];
console.log("dot:", dotProduct(a, b)); // 11

// L2 Norm (magnitude / length)
function norm(v) {
  return Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
}
console.log("||a||:", norm(a));  // 5
console.log("||b||:", norm(b));  // ~2.236

// Normalize: make length = 1
function normalize(v) {
  const n = norm(v);
  return v.map(x => x / n);
}
console.log("unit a:", normalize(a)); // [0.6, 0.8]

// Cosine similarity
function cosineSimilarity(a, b) {
  return dotProduct(a, b) / (norm(a) * norm(b));
}
console.log("cosine:", cosineSimilarity(a, b).toFixed(4)); // 0.9839

// Perpendicular vectors have cosine = 0
console.log("perp:", cosineSimilarity([1, 0], [0, 1])); // 0

// Opposite vectors have cosine = -1
console.log("opp:", cosineSimilarity([1, 0], [-1, 0])); // -1

Interactive Experiment

Try these exercises:

  • Compute the dot product of [1, 0] and [0, 1]. Why is it 0? What does that mean geometrically?
  • Calculate the norm of [3, 4] by hand using the Pythagorean theorem. Verify with code.
  • Normalize the vector [6, 8]. Check that the resulting vector has norm 1.
  • Find the cosine similarity between [1, 1] and [2, 2]. Why is it exactly 1?
  • Create two random 100-dimensional vectors. What is their cosine similarity? Is it close to 0?
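The last exercise is easiest to explore by running it. Here is a minimal sketch, repeating the `dotProduct`, `norm`, and `cosineSimilarity` definitions from the code example above so it runs on its own; the `randomVector` helper is an assumption for this sketch:

```javascript
// Helpers repeated from the code example above.
function dotProduct(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += a[i] * b[i];
  return sum;
}
function norm(v) {
  return Math.sqrt(v.reduce((s, x) => s + x * x, 0));
}
function cosineSimilarity(a, b) {
  return dotProduct(a, b) / (norm(a) * norm(b));
}

// Random vector with components drawn uniformly from [-1, 1].
const randomVector = (n) =>
  Array.from({ length: n }, () => Math.random() * 2 - 1);

// In high dimensions, two random directions are almost always
// nearly perpendicular, so the similarity is typically small.
const u = randomVector(100);
const v = randomVector(100);
console.log("cosine:", cosineSimilarity(u, v)); // typically near 0
```

Run it a few times: the value jitters from run to run but stays small, which is the "curse of dimensionality" working in our favor here.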

Quick Quiz

Coding Challenge

Most Similar Pair

Write a function called `mostSimilarPair` that takes an array of vectors (arrays of numbers) and returns the indices [i, j] of the two vectors with the highest cosine similarity. Assume all vectors have the same length and there are at least 2 vectors.
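Try it yourself first. If you want to check your work afterwards, here is one possible brute-force approach (a sketch, not the only solution): compare every pair and keep the best. The helper functions repeat the definitions from the code example above.

```javascript
// Helpers repeated from the code example above.
function dotProduct(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += a[i] * b[i];
  return sum;
}
function norm(v) {
  return Math.sqrt(v.reduce((s, x) => s + x * x, 0));
}
function cosineSimilarity(a, b) {
  return dotProduct(a, b) / (norm(a) * norm(b));
}

// O(n^2) scan over all pairs, tracking the most similar one.
function mostSimilarPair(vectors) {
  let bestSim = -Infinity;
  let best = [0, 1];
  for (let i = 0; i < vectors.length; i++) {
    for (let j = i + 1; j < vectors.length; j++) {
      const sim = cosineSimilarity(vectors[i], vectors[j]);
      if (sim > bestSim) {
        bestSim = sim;
        best = [i, j];
      }
    }
  }
  return best;
}

console.log(mostSimilarPair([[1, 0], [0, 1], [2, 0.1]])); // [0, 2]
```

The quadratic scan is fine for small inputs; real systems with millions of vectors use approximate nearest-neighbor indexes instead.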


Real-World Usage

Dot products and norms power critical systems:

  • Semantic search: Query and document embeddings are compared using cosine similarity. Higher similarity means more relevant results.
  • Recommendation engines: Netflix and Spotify represent users and items as vectors. Cosine similarity finds "people like you" and "items like this."
  • Word embeddings: Word2Vec and GloVe encode words as vectors where cosine similarity captures meaning. "king" - "man" + "woman" is close to "queen."
  • Image retrieval: Photos are encoded as vectors. Finding similar images is a nearest-neighbor search by cosine distance.
  • RAG (Retrieval-Augmented Generation): LLMs use vector similarity to find relevant context before generating answers.
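As a concrete illustration of the semantic-search bullet, here is a toy ranking by cosine similarity. The 3-dimensional "embeddings" and document ids are invented for illustration; real systems use vectors with hundreds of dimensions produced by a trained model. The helpers repeat the definitions from the code example above.

```javascript
// Helpers repeated from the code example above.
function dotProduct(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += a[i] * b[i];
  return sum;
}
function norm(v) {
  return Math.sqrt(v.reduce((s, x) => s + x * x, 0));
}
function cosineSimilarity(a, b) {
  return dotProduct(a, b) / (norm(a) * norm(b));
}

// Toy "index": each document is a vector (made-up embeddings).
const docs = [
  { id: "doc-a", embedding: [0.9, 0.1, 0.0] },
  { id: "doc-b", embedding: [0.1, 0.9, 0.2] },
  { id: "doc-c", embedding: [0.8, 0.2, 0.1] },
];
const query = [1, 0, 0]; // made-up query embedding

// Score each document against the query, then sort best-first.
const ranked = docs
  .map(d => ({ id: d.id, score: cosineSimilarity(query, d.embedding) }))
  .sort((x, y) => y.score - x.score);

console.log(ranked.map(r => r.id)); // ["doc-a", "doc-c", "doc-b"]
```

doc-a and doc-c both point mostly along the query's direction, so they outrank doc-b, whose embedding points elsewhere. This map-score-sort pattern is the core of every system in the list above; production systems just replace the linear scan with a vector index.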

Connections