Event Sources & Triggers (AWS & GCP)
What Are Event Sources?
Event sources are services that trigger your serverless functions. Both AWS and GCP support similar concepts with different implementations.
Simple Explanation
What it is
Event sources are the things that wake your functions up: file uploads, database changes, HTTP requests, or scheduled timers.
Why we need it
Serverless functions do not run continuously. Events tell the platform exactly when your code should execute.
Benefits
- Automatic triggers without manual scheduling.
- Easy integration across storage, messaging, and databases.
- Scales naturally with the volume of events.
Tradeoffs
- Multiple event formats across services and providers.
- Retries and duplicates require careful handling.
Real-world examples (architecture only)
- File uploaded → Function → Resize and store.
- Database record created → Function → Send notification.
File upload (S3 or Cloud Storage)
↓ (event triggers)
Lambda or Cloud Function processes
↓
Saves result
Part 1: AWS Event Sources
AWS Event Sources
| Source | Trigger Event | Use Case |
|---|---|---|
| S3 | File upload, delete | Image processing |
| API Gateway | HTTP request | REST API |
| DynamoDB Streams | Item changed | Real-time sync |
| SNS | Message published | Broadcasting |
| SQS | Message queued | Async processing |
| CloudWatch Events | Time event | Scheduled tasks |
| EventBridge | Custom events | Event-driven workflows |
| Kinesis | Stream data | Real-time data processing |
S3 Trigger Example
Setup in Console:
- S3 Bucket → Properties → Event notifications
- Event types: s3:ObjectCreated:*
- Destination: Lambda ARN (the same setup can also be scripted, as sketched below)
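A minimal programmatic sketch with boto3, assuming the bucket `my-bucket` exists and the Lambda ARN below is a placeholder; the function must already grant S3 permission to invoke it (for example via `aws lambda add-permission`), or the call is rejected:

import boto3

s3 = boto3.client("s3")

# Hypothetical ARN of the Lambda function that should receive the events
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:processImage"

# Register an ObjectCreated notification on the bucket that targets the Lambda
s3.put_bucket_notification_configuration(
    Bucket="my-bucket",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": LAMBDA_ARN,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)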
Event Structure:
{
  "Records": [{
    "s3": {
      "bucket": { "name": "my-bucket" },
      "object": { "key": "uploads/image.jpg" }
    }
  }]
}
Lambda Handler:
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"Processing {key} from {bucket}")
        obj = s3.get_object(Bucket=bucket, Key=key)
        # Process (e.g., create thumbnail); the original bytes stand in as a placeholder
        processed_image = obj["Body"].read()
        s3.put_object(
            Bucket=bucket,
            Key=f"thumbnails/{key}",
            Body=processed_image,
        )
    return {"statusCode": 200}
Part 2: Google Cloud Event Sources
Event Sources in GCP
| Source | Trigger Event | Use Case |
|---|---|---|
| Cloud Storage | File upload, delete | Image processing |
| Cloud Pub/Sub | Message published | Broadcasting |
| Cloud Firestore | Document changed | Real-time sync |
| Cloud Tasks | Task queued | Async processing |
| Cloud Scheduler | Time event | Scheduled tasks |
| Eventarc | Custom events (CloudEvents format) | Event-driven workflows |
| Bigtable | Row mutation | Stream processing |
Cloud Storage Trigger Example
Setup with gcloud:
gcloud functions deploy processImage \
  --runtime python312 \
  --trigger-resource my-bucket \
  --trigger-event google.storage.object.finalize \
  --entry-point process_image
Event Structure:
{
  "name": "uploads/image.jpg",
  "bucket": "my-bucket",
  "contentType": "image/jpeg",
  "size": "102400"
}
Cloud Function (Python):
from google.cloud import storage

storage_client = storage.Client()

def process_image(file, context):
    bucket = storage_client.bucket(file["bucket"])
    source_file = bucket.blob(file["name"])
    print(f"Processing {file['name']} from {file['bucket']}")
    data = source_file.download_as_bytes()
    # Process (e.g., create thumbnail); the original bytes stand in as a placeholder
    processed_image = data
    dest_file = bucket.blob(f"thumbnails/{file['name']}")
    dest_file.upload_from_string(processed_image)
    print("Upload complete")
Pub/Sub Trigger Example
Publish Message:
gcloud pubsub topics publish my-topic --message "Hello"
Cloud Function:
import base64

def process_pubsub(event, context):
    # Pub/Sub delivers the payload base64-encoded in event["data"]
    pubsub_message = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Received message: {pubsub_message}")
AWS vs. GCP Event Comparison
| Aspect | AWS | GCP |
|---|---|---|
| Object Storage | S3 events | Cloud Storage events |
| Pub/Sub | SNS, SQS | Cloud Pub/Sub |
| Database Streams | DynamoDB Streams | Firestore, Datastore |
| Scheduled Tasks | EventBridge, CloudWatch | Cloud Scheduler |
| Event Format | Service-specific JSON | CloudEvents standard |
| Retry Behavior | Configurable | Configurable (see GCP docs) |
| Dead Letter | SQS DLQ | Pub/Sub dead-letter topics |
| Filtering | Event Filtering rules | Attribute filtering |
Event Filtering & Error Handling
AWS DLQ Setup
# Function timeout or error occurs
# Configured DLQ captures failed event automatically
def dlq_handler(event, context):
    for message in event.get("Records", []):
        print(f"Failed event: {message.get('body')}")
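The DLQ itself is attached to the function's configuration. A sketch with boto3, assuming the SQS queue already exists and its ARN below is a placeholder:

import boto3

lambda_client = boto3.client("lambda")

# Hypothetical ARN of the SQS queue that receives failed async events
DLQ_ARN = "arn:aws:sqs:us-east-1:123456789012:failed-events"

# Attach the queue as the function's dead-letter target
lambda_client.update_function_configuration(
    FunctionName="processImage",
    DeadLetterConfig={"TargetArn": DLQ_ARN},
)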
GCP Retry & Dead Letter
import base64

def dead_letter_handler(event, context):
    data = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Failed event: {data}")
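On GCP, dead-lettering is configured on the Pub/Sub subscription. A minimal sketch with google-cloud-pubsub, assuming placeholder project and topic names and that the Pub/Sub service account has been granted publish rights on the dead-letter topic:

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()

project = "my-project"
subscription_path = subscriber.subscription_path(project, "my-subscription")
topic_path = f"projects/{project}/topics/my-topic"
dead_letter_topic = f"projects/{project}/topics/my-dead-letter-topic"

# Messages that fail delivery 5 times are forwarded to the dead-letter topic
with subscriber:
    subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "dead_letter_policy": {
                "dead_letter_topic": dead_letter_topic,
                "max_delivery_attempts": 5,
            },
        }
    )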
Event Mapping (Handle Multiple Sources)
CloudWatch Scheduled Trigger
- Go to EventBridge Console
- Click Create rule
- Schedule expression: cron(0 9 * * ? *) (daily at 9:00 UTC)
- Target: Lambda function
- Click Create (or create the rule programmatically, as sketched below)
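A sketch of the same rule with boto3, assuming a hypothetical rule name and Lambda ARN; the function also needs a resource policy allowing events.amazonaws.com to invoke it:

import boto3

events = boto3.client("events")

LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:dailyReport"

# Create (or update) a rule that fires daily at 9:00 UTC
events.put_rule(
    Name="daily-9am",
    ScheduleExpression="cron(0 9 * * ? *)",
    State="ENABLED",
)

# Point the rule at the Lambda function
events.put_targets(
    Rule="daily-9am",
    Targets=[{"Id": "daily-report-lambda", "Arn": LAMBDA_ARN}],
)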
DynamoDB Streams Trigger
- Go to DynamoDB Console
- Select table
- Go to Exports and streams
- Enable DynamoDB Streams
- Go to Lambda > Add trigger
- Select DynamoDB
- Click Add (or create the event source mapping programmatically, as sketched below)
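The console trigger corresponds to a Lambda event source mapping. A sketch with boto3, assuming a placeholder stream ARN (shown on the table's Exports and streams tab once streams are enabled):

import boto3

lambda_client = boto3.client("lambda")

# Hypothetical ARN of the table's stream
STREAM_ARN = (
    "arn:aws:dynamodb:us-east-1:123456789012:table/orders/stream/2024-01-01T00:00:00.000"
)

# Poll the stream and invoke the function with batches of records
lambda_client.create_event_source_mapping(
    EventSourceArn=STREAM_ARN,
    FunctionName="processOrders",
    StartingPosition="LATEST",
    BatchSize=100,
)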
Handling Different Events
S3 Trigger Example
def handler(event, context):
    record = event.get("Records", [])[0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    print(f"Processing {key} from {bucket}")
    return {"statusCode": 200, "message": "Processed"}
Scheduled Event (CloudWatch)
def handler(event, context):
    print("Scheduled event fired")
    return {"statusCode": 200}
DynamoDB Stream Example
def handler(event, context):
    for record in event.get("Records", []):
        if record.get("eventName") == "INSERT":
            new_item = record["dynamodb"]["NewImage"]
            print("New item:", new_item)
Event Filtering
Reduce Lambda invocations by filtering events:
S3 Filter Example
In AWS Console:
- Event: s3:ObjectCreated:*
- Filter rules:
  - Prefix: uploads/
  - Suffix: .jpg

Only JPG files in uploads/ trigger Lambda.
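The same prefix and suffix rules can be added to the notification configuration sketched earlier; the Filter block below uses the same placeholder Lambda ARN:

import boto3

s3 = boto3.client("s3")
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:processImage"

s3.put_bucket_notification_configuration(
    Bucket="my-bucket",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": LAMBDA_ARN,
                "Events": ["s3:ObjectCreated:*"],
                # Only keys like uploads/photo.jpg produce an invocation
                "Filter": {
                    "Key": {
                        "FilterRules": [
                            {"Name": "prefix", "Value": "uploads/"},
                            {"Name": "suffix", "Value": ".jpg"},
                        ]
                    }
                },
            }
        ]
    },
)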
Error Handling in Event-Driven Systems
def handler(event, context):
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    try:
        process_image(bucket, key)
    except Exception as exc:
        print(f"Error processing {key}: {exc}")
        raise  # re-raise so the platform retries or routes the event to the DLQ
Dead Letter Queues (DLQ)
If Lambda fails repeatedly, send to DLQ:
- Go to Lambda function
- Go to Configuration → Asynchronous invocation
- Set Failure destination to SQS queue
- Failed events accumulate in the queue for investigation (see the sketch below)
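The failure destination from step 3 can also be set with boto3. A sketch, assuming a placeholder queue ARN:

import boto3

lambda_client = boto3.client("lambda")

# Hypothetical ARN of the SQS queue used as the failure destination
FAILURE_QUEUE_ARN = "arn:aws:sqs:us-east-1:123456789012:image-failures"

# Send the event to the queue after all async retries are exhausted
lambda_client.put_function_event_invoke_config(
    FunctionName="processImage",
    MaximumRetryAttempts=2,
    DestinationConfig={"OnFailure": {"Destination": FAILURE_QUEUE_ARN}},
)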
AWS Event Routing
def handler(event, context):
    records = event.get("Records", [])
    if records and records[0].get("s3"):
        return handle_s3(event)
    if records and records[0].get("dynamodb"):
        return handle_dynamodb(event)
    if event.get("httpMethod") == "POST":
        return handle_api(event)
    raise ValueError("Unrecognized event source")
GCP Cloud Function Routing
def http_function(request):
    if request.method == "POST":
        return handle_http(request)
    return ("Method not allowed", 405)

def pubsub_function(event, context):
    return handle_pubsub(event)

def storage_function(event, context):
    return handle_storage(event)
Best Practices
- Keep event handlers focused — One responsibility per function
- Use DLQs — Catch failed async events
- Filter events early — Reduce Lambda invocations
- Idempotent processing — Handle duplicate events gracefully (see the sketch after this list)
- Log event details — Debug easier with full event context
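For the idempotency point, one common pattern is a conditional write keyed on a unique event ID, so a duplicate delivery is detected and skipped. A minimal sketch with DynamoDB, assuming a hypothetical processed-events table:

import boto3
from botocore.exceptions import ClientError

dynamodb = boto3.client("dynamodb")

def process_once(event_id, process):
    """Run process() only if this event ID has not been seen before."""
    try:
        # The conditional put fails if the event ID already exists
        dynamodb.put_item(
            TableName="processed-events",  # hypothetical table name
            Item={"event_id": {"S": event_id}},
            ConditionExpression="attribute_not_exists(event_id)",
        )
    except ClientError as exc:
        if exc.response["Error"]["Code"] == "ConditionalCheckFailedException":
            print(f"Duplicate event {event_id}, skipping")
            return
        raise
    process()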
Hands-On: Image Thumbnail Generator
Build a Lambda that:
- Triggers on S3 image upload
- Resizes image to thumbnail
- Saves thumbnail back to S3
- Updates DynamoDB with metadata
Steps:
- Create S3 bucket
- Create DynamoDB table for metadata
- Write Lambda handler (a sketch follows the steps below)
- Connect S3 trigger
- Test with image upload
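A sketch of the handler for step 3, assuming Pillow is packaged with the function (for example as a Lambda layer) and that the metadata table name `image-metadata` is a placeholder:

import io
import boto3
from PIL import Image

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("image-metadata")  # hypothetical table

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Skip thumbnails we wrote ourselves, so the trigger does not loop
        if key.startswith("thumbnails/"):
            continue

        # Download the original image and resize it in memory
        data = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        img = Image.open(io.BytesIO(data))
        img.thumbnail((128, 128))
        buf = io.BytesIO()
        img.save(buf, format="JPEG")

        # Save the thumbnail and record metadata
        thumb_key = f"thumbnails/{key}"
        s3.put_object(Bucket=bucket, Key=thumb_key, Body=buf.getvalue())
        table.put_item(Item={"key": key, "thumbnail": thumb_key, "size": len(data)})

    return {"statusCode": 200}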
Key Takeaway
Event sources decouple your system. Services communicate through events, not direct calls. This is the foundation of serverless architecture.
Project (Cloud-Agnostic)
Build an event-driven flow where a file upload triggers processing and a notification.
Deliverables:
- Describe the event source and trigger.
- Map the storage and compute to AWS or GCP.
- Explain how failures are handled (retries, DLQ).
If you want feedback, email your write-up to [email protected].
References
- AWS S3 Events: https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventNotifications.html
- AWS EventBridge: https://docs.aws.amazon.com/eventbridge/
- AWS Lambda Developer Guide: https://docs.aws.amazon.com/lambda/latest/dg/welcome.html
- Google Cloud Functions: https://cloud.google.com/functions/docs
- Google Cloud Storage Events: https://cloud.google.com/storage/docs/pubsub-notifications
- Google Cloud Pub/Sub: https://cloud.google.com/pubsub/docs/overview