Machine Learning · 25 min

Feature Engineering

Transforming raw data into meaningful inputs that help models learn effectively


Why This Matters

The most powerful algorithm in the world cannot save you from bad input data. In machine learning, the quality of your features — the input variables your model sees — often matters more than the choice of algorithm. This is the principle of "garbage in, garbage out."

Feature engineering is the art and science of transforming raw data into a format that makes it easy for models to learn patterns. A skilled feature engineer can take the same dataset and the same algorithm and dramatically improve results by choosing and transforming features wisely.


Visual Model

Raw Data (messy input) → Clean (handle missing values) → Encode (categories to numbers) → Scale (normalize ranges) → Select (drop irrelevant features) → Model Input

The full process at a glance.

Feature engineering pipeline: clean, encode, scale, select, then feed clean features to the model.
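The pipeline above can be sketched as a few plain transforms. This is a minimal illustration, not a real library: all field names and data are made up, and each step takes an array of rows and returns a new one.

```javascript
// Illustrative clean → encode → scale → select pipeline on tiny fake data
const rows = [
  { age: 25, color: "red" },
  { age: null, color: "blue" }, // missing value
  { age: 60, color: "red" },
];

// Clean: drop rows with a missing age
const cleaned = rows.filter(r => r.age !== null);

// Encode: turn the color category into a 0/1 flag
const encoded = cleaned.map(r => ({ age: r.age, isRed: r.color === "red" ? 1 : 0 }));

// Scale: min-max normalize age to [0, 1]
const ages = encoded.map(r => r.age);
const [min, max] = [Math.min(...ages), Math.max(...ages)];
const scaled = encoded.map(r => ({ ...r, age: (r.age - min) / (max - min) }));

// Select: keep only the features the model will see
const modelInput = scaled.map(r => [r.age, r.isRed]);
console.log(modelInput); // [[0, 1], [1, 1]]
```

Each stage is a pure function of the previous one, which mirrors how real preprocessing pipelines are composed and tested step by step.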

Code Example
// Feature engineering utilities

// Min-Max Normalization: scales values to [0, 1]
function normalize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  if (max === min) return values.map(() => 0);
  return values.map(v => (v - min) / (max - min));
}

// Standardization: mean = 0, std = 1
function standardize(values) {
  const mean = values.reduce((s, v) => s + v, 0) / values.length;
  const std = Math.sqrt(
    values.reduce((s, v) => s + (v - mean) ** 2, 0) / values.length
  );
  if (std === 0) return values.map(() => 0);
  return values.map(v => +((v - mean) / std).toFixed(3));
}

// One-Hot Encoding
function oneHotEncode(categories) {
  const unique = [...new Set(categories)];
  return categories.map(cat =>
    unique.reduce((obj, u) => ({ ...obj, [u]: cat === u ? 1 : 0 }), {})
  );
}

// Examples
const ages = [25, 30, 45, 22, 60];
console.log("Normalized:", normalize(ages));
console.log("Standardized:", standardize(ages));

const colors = ["red", "blue", "red", "green", "blue"];
console.log("One-hot:", oneHotEncode(colors));

Interactive Experiment

Try these exercises to understand feature engineering:

  • Normalize the list [100, 200, 300, 400, 500]. What do the min and max become?
  • Standardize the same list. What are the mean and standard deviation of the result?
  • One-hot encode ["small", "medium", "large", "medium"]. How many new columns are created?
  • What would happen if you had a category with 1000 unique values? Is one-hot encoding still practical?
  • Try feeding unnormalized features to a distance-based algorithm (like k-NN). Then normalize and compare results.
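You can check the first and third exercises directly in code. The helper below redefines `normalize` from the example above so this snippet runs standalone:

```javascript
// Min-max normalization, as in the code example above
function normalize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  if (max === min) return values.map(() => 0);
  return values.map(v => (v - min) / (max - min));
}

// Exercise 1: the minimum maps to 0 and the maximum maps to 1
console.log(normalize([100, 200, 300, 400, 500])); // [0, 0.25, 0.5, 0.75, 1]

// Exercise 3: one-hot encoding creates one column per unique value,
// so three unique sizes produce three new columns
const sizes = ["small", "medium", "large", "medium"];
console.log(new Set(sizes).size); // 3
```

The same counting logic answers the fourth exercise: 1000 unique values would mean 1000 mostly-zero columns, which is why high-cardinality categories usually call for a different encoding.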

Quick Quiz

Coding Challenge

Feature Pipeline

Write a function called `prepareFeatures` that takes an array of objects with `age` (number), `income` (number), and `city` (string) fields. It should return an array of feature arrays where: age and income are normalized to [0,1], and city is one-hot encoded. Each output array should contain [normalizedAge, normalizedIncome, isCity1, isCity2, ...].
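If you get stuck, here is one possible sketch of a solution (one of many valid approaches; the sample rows are made up):

```javascript
// One possible prepareFeatures: normalize numeric fields, one-hot encode city
function prepareFeatures(records) {
  const normalize = values => {
    const min = Math.min(...values);
    const max = Math.max(...values);
    return values.map(v => (max === min ? 0 : (v - min) / (max - min)));
  };
  const ages = normalize(records.map(r => r.age));
  const incomes = normalize(records.map(r => r.income));
  const cities = [...new Set(records.map(r => r.city))]; // column order = first-seen order
  return records.map((r, i) => [
    ages[i],
    incomes[i],
    ...cities.map(c => (r.city === c ? 1 : 0)),
  ]);
}

const rows = [
  { age: 20, income: 30000, city: "Oslo" },
  { age: 40, income: 60000, city: "Bergen" },
];
console.log(prepareFeatures(rows)); // [[0, 0, 1, 0], [1, 1, 0, 1]]
```

Note that the city columns come out in first-seen order; a production version would fix the column order explicitly so that training and inference data line up.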


Real-World Usage

Feature engineering is where domain expertise meets machine learning:

  • Natural language processing: Text is transformed into features using TF-IDF, word embeddings, or tokenization before models can process it.
  • Recommendation systems: User behavior is encoded as features such as time since last purchase, number of views, and category preferences.
  • Financial modeling: Raw transaction data is transformed into features like rolling averages, ratios, and time deltas.
  • Computer vision: While deep learning learns features automatically, traditional CV uses hand-crafted features like edge histograms and color distributions.
  • Tabular data competitions: On Kaggle, feature engineering is often the biggest differentiator between winning and losing solutions.
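The rolling-average idea from the financial modeling bullet is easy to sketch. This is an illustrative helper, not a library function; the window size of 3 and the spending data are assumptions:

```javascript
// Rolling mean over a trailing window: a classic engineered feature
// for time-series data like daily transactions
function rollingMean(values, window) {
  return values.map((_, i) => {
    const start = Math.max(0, i - window + 1);
    const slice = values.slice(start, i + 1); // shorter window at the start
    return slice.reduce((s, v) => s + v, 0) / slice.length;
  });
}

const dailySpend = [10, 20, 30, 40, 50];
console.log(rollingMean(dailySpend, 3)); // [10, 15, 20, 30, 40]
```

Each output value summarizes recent history, turning a raw sequence into a feature a model can use at every time step.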

Connections