Writing Serverless Functions
Serverless functions are small, focused pieces of code that run in response to events. They're fundamentally different from traditional long-running applications: a function starts in response to an event, handles it, and stops. Understanding this model helps you write functions that work well in the serverless environment.
Function Structure
Every serverless function follows a similar pattern: receive an event, process it, return a response. Here's a typical AWS Lambda function in Python:
import json
def handler(event, context):
    # event: the input data (HTTP request, queue message, etc.)
    # context: runtime information (time remaining, memory, etc.)
    user_id = event.get('user_id')

    # Do your work
    result = process_user(user_id)

    return {
        'statusCode': 200,
        'body': json.dumps(result)
    }
The event contains whatever triggered the function — an HTTP request body, a queue message, or file metadata. The context provides runtime information like how much time remains before timeout.
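The exact shape of the event depends on the trigger. As one sketch, a function behind an API Gateway proxy integration typically receives the HTTP body as a JSON string under the event's 'body' key; other triggers such as SQS or S3 use different layouts, so check the event format for your trigger:

import json

def handler(event, context):
    # API Gateway proxy integration: the HTTP body arrives as a string.
    # Other triggers (SQS, S3, etc.) structure the event differently.
    payload = json.loads(event.get('body') or '{}')
    user_id = payload.get('user_id')

    return {
        'statusCode': 200,
        'body': json.dumps({'user_id': user_id})
    }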
Best Practices for Serverless Functions
Keep functions small and focused. Each function should do one thing. If you're tempted to add multiple responsibilities, split into separate functions.
Minimize package size. Smaller deployment packages mean faster cold starts. Only include dependencies you actually need.
Handle errors gracefully. Unhandled exceptions cause function failures. Catch errors, log them, and return appropriate responses.
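A minimal sketch of this pattern, where process_user stands in for your own business logic: catch the exception, log it so the traceback lands in the platform's log stream, and return a structured error response instead of letting the invocation fail.

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handler(event, context):
    try:
        result = process_user(event.get('user_id'))
        return {'statusCode': 200, 'body': json.dumps(result)}
    except Exception:
        # Log the full traceback, then return a controlled error response
        # rather than raising and failing the invocation.
        logger.exception('process_user failed')
        return {'statusCode': 500, 'body': json.dumps({'error': 'internal error'})}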
Use environment variables for configuration. Never hardcode API keys, database URLs, or other configuration. Environment variables keep secrets out of code and make functions portable across environments.
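A sketch of reading configuration from the environment at module load time; the variable names here (TABLE_NAME, API_BASE_URL) are placeholders for whatever your function actually needs:

import os

# Read configuration once at module load; the values are set in the
# function's configuration, not in the source code.
TABLE_NAME = os.environ['TABLE_NAME']  # fail fast if it's missing
API_BASE_URL = os.environ.get('API_BASE_URL', 'https://api.example.com')

def handler(event, context):
    # Use TABLE_NAME / API_BASE_URL here instead of hardcoded values.
    ...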
Don't store state in the function. Functions are ephemeral — they may run on different machines each time. Store state in databases, caches, or object storage.
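For example, rather than caching results in a module-level dict and hoping it survives, write them to an external store. A sketch using DynamoDB via boto3, where the table name (USERS_TABLE environment variable) and key schema are assumptions for illustration:

import os
import time
import boto3

# The boto3 client can live at module scope for reuse, but the data itself
# lives in DynamoDB, not in the function's memory.
table = boto3.resource('dynamodb').Table(os.environ['USERS_TABLE'])

def handler(event, context):
    user_id = event.get('user_id')
    # Persist per-user state externally; the next invocation may run elsewhere.
    table.put_item(Item={'user_id': user_id, 'last_seen': int(time.time())})
    return {'statusCode': 200}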
Understanding Cold Starts
When a function hasn't run recently, the platform must initialize a new execution environment. This cold start adds latency — sometimes hundreds of milliseconds.
Mitigation strategies include:
- Keep packages small — less to load means faster starts
- Choose faster runtimes — Go and Rust start faster than Java
- Use provisioned concurrency — keeps instances warm (costs money)
- Accept it — for many use cases, occasional cold starts are fine
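One related detail worth knowing: code at module scope runs only during a cold start, so expensive setup done there (creating clients, loading configuration) is paid once and then reused by warm invocations. A sketch, assuming a boto3 S3 client as the expensive resource:

import boto3

# Runs once per cold start; warm invocations reuse the same client.
s3 = boto3.client('s3')

def handler(event, context):
    # Per-invocation work only; no expensive setup here.
    response = s3.list_buckets()
    return {'count': len(response.get('Buckets', []))}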
The Stateless Mindset
Traditional applications maintain state between requests. Serverless functions don't: each invocation is independent. This constraint actually simplifies things, because you don't have to worry about state leaking or becoming corrupted between requests.