Why This Matters
Managing servers is work. You have to provision them, patch the operating system, monitor disk space, handle scaling, and pay for idle capacity. What if you could skip all of that and just write a function that runs when someone calls it?
That is the serverless model. You write a function, deploy it, and the cloud provider handles everything else: provisioning, scaling, and billing. Your function runs only when triggered (an HTTP request, a cron schedule, a database event) and you pay only for the milliseconds of execution time. AWS Lambda, Vercel Functions, and Cloudflare Workers all use this model. Serverless does not mean there are no servers -- it means you never have to think about them.
Define Terms
Visual Model
The full process at a glance.
Serverless lifecycle: trigger, optional cold start, execute, respond, then idle or shut down.
Code Example
// Vercel Serverless Function (Next.js API route)
// File: app/api/hello/route.js
export async function GET(request) {
  const url = new URL(request.url);
  const name = url.searchParams.get("name") || "World";
  return Response.json({
    message: `Hello, ${name}!`,
    timestamp: new Date().toISOString(),
    region: process.env.VERCEL_REGION || "local"
  });
}

// AWS Lambda handler pattern
// exports.handler = async (event) => {
//   const name = event.queryStringParameters?.name || "World";
//   return {
//     statusCode: 200,
//     body: JSON.stringify({ message: `Hello, ${name}!` })
//   };
// };

// Cloudflare Worker (runs at the edge)
// export default {
//   async fetch(request) {
//     return new Response("Hello from the edge!", {
//       headers: { "Content-Type": "text/plain" }
//     });
//   }
// };

console.log("Each platform has its own function signature");
console.log("But they all follow the same pattern:");
console.log("Receive event -> process -> return response");

Interactive Experiment
Try these exercises:
- Create a Next.js API route in app/api/hello/route.js that returns JSON. This is a serverless function when deployed to Vercel.
- Deploy a simple function to Vercel or Cloudflare Workers. Hit it from your browser and check the response time.
- Make two requests in quick succession. The first may be slow (cold start), the second should be fast (warm instance). Measure the difference.
- Try returning different responses based on the request method (GET vs POST). How does the function signature handle this?
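To build intuition for the cold-start exercise above without deploying anything, here is a toy model of the instance lifecycle in plain JavaScript. The numbers (300 ms of init, 20 ms of handler work) are made-up illustrations, not real platform figures:

```javascript
// Toy model of cold vs. warm invocations. An instance pays a one-time
// startup cost on its first request; later requests skip it.
function invoke(state) {
  const coldStartMs = state.warm ? 0 : 300; // illustrative init cost
  state.warm = true;                        // instance stays warm after first call
  const handlerMs = 20;                     // illustrative handler time
  return coldStartMs + handlerMs;           // total "response time" in ms
}

const instance = { warm: false };
console.log(invoke(instance)); // first request: 320 (cold)
console.log(invoke(instance)); // second request: 20 (warm)
```

When you measure a real deployment, the gap you see between the first and second request is exactly this one-time startup cost.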
Quick Quiz
Coding Challenge
Write a function called `calculateCost` that estimates serverless costs. It takes three arguments: `invocations` (number of function calls per month), `avgDurationMs` (average execution time in milliseconds), and `memorySizeMB` (allocated memory in MB, default 128). The cost formula is: GB-seconds = invocations * (avgDurationMs / 1000) * (memorySizeMB / 1024). Price is $0.0000166667 per GB-second. The first 400,000 GB-seconds per month are free. Return the monthly cost rounded to 2 decimal places as a number.
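If you want to check your work afterward, a direct translation of the formula above looks like this (spoiler: one possible solution):

```javascript
// Estimate monthly serverless cost using the formula from the challenge.
function calculateCost(invocations, avgDurationMs, memorySizeMB = 128) {
  // GB-seconds = calls * seconds per call * GB of allocated memory
  const gbSeconds = invocations * (avgDurationMs / 1000) * (memorySizeMB / 1024);
  const billable = Math.max(0, gbSeconds - 400000); // first 400k GB-s are free
  const cost = billable * 0.0000166667;             // price per GB-second
  return Math.round(cost * 100) / 100;              // round to 2 decimals, as a number
}

// 100M invocations at 200 ms and 512 MB: 10M GB-s, 9.6M billable
console.log(calculateCost(100000000, 200, 512)); // 160
// 1M invocations at 100 ms and 128 MB stays inside the free tier
console.log(calculateCost(1000000, 100));        // 0
```

Note that the free tier is subtracted in GB-seconds before pricing, so small workloads cost exactly $0.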
Real-World Usage
Serverless is used widely across the industry:
- Vercel deploys Next.js API routes as serverless functions. Every app/api/ file becomes a function that scales automatically.
- AWS Lambda processes billions of events daily: API requests, S3 file uploads, DynamoDB stream events, and scheduled cron jobs.
- Cloudflare Workers run at the edge (closest data center to the user) with sub-millisecond cold starts, ideal for low-latency APIs.
- Webhook handlers for services like Stripe, GitHub, and Twilio are a perfect serverless use case: sporadic traffic, stateless processing.
- Image processing pipelines use serverless to resize images on upload, paying only for actual processing time instead of keeping a server running.
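To make the webhook use case concrete, here is a minimal sketch of a receiver written in the Next.js route-handler style shown earlier. The event type `payment.succeeded` is a made-up placeholder, and a real handler must verify the provider's request signature before trusting the payload:

```javascript
// Minimal webhook receiver sketch. In a real Next.js route file
// (app/api/webhook/route.js) this function would be exported as POST.
// NOTE: production handlers must verify the sender's signature first.
async function POST(request) {
  const event = await request.json();

  switch (event.type) {
    case "payment.succeeded": // placeholder event type
      // e.g. mark the corresponding order as paid
      break;
    default:
      // Acknowledge unknown events so the sender stops retrying.
      break;
  }

  return Response.json({ received: true });
}
```

Because the handler is stateless and webhook traffic is sporadic, each delivery spins up (or reuses) an instance and you pay only for the milliseconds spent handling it.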