
Event Sources & Triggers (AWS & GCP)

What Are Event Sources?

Event sources are services that trigger your serverless functions. Both AWS and GCP support similar concepts with different implementations.


Simple Explanation

What it is

Event sources are the things that wake your functions up: file uploads, database changes, HTTP requests, or scheduled timers.

Why we need it

Serverless does not run all the time. Events tell the platform exactly when your code should execute.

Benefits

  • Automatic triggers without manual scheduling.
  • Easy integration across storage, messaging, and databases.
  • Scales naturally with the volume of events.

Tradeoffs

  • Multiple event formats across services and providers.
  • Retries and duplicates require careful handling.

Real-world examples (architecture only)

  • File uploaded → Function → Resize and store.
  • Database record created → Function → Send notification.

File upload (S3 or Cloud Storage)
  ↓ (event triggers)
Lambda or Cloud Function processes
  ↓
Saves result

Part 1: AWS Event Sources

AWS Event Sources

Source | Trigger Event | Use Case
S3 | File upload, delete | Image processing
API Gateway | HTTP request | REST API
DynamoDB Streams | Item changed | Real-time sync
SNS | Message published | Broadcasting
SQS | Message queued | Async processing
CloudWatch Events | Time event | Scheduled tasks
EventBridge | Custom events | Event-driven workflows
Kinesis | Stream data | Real-time data processing

S3 Trigger Example

Setup in Console:

  1. S3 Bucket → Properties → Event notifications
  2. Event types: s3:ObjectCreated:*
  3. Destination: Lambda ARN

Event Structure:

{
  "Records": [{
    "s3": {
      "bucket": { "name": "my-bucket" },
      "object": { "key": "uploads/image.jpg" }
    }
  }]
}

Lambda Handler:

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        print(f"Processing {key} from {bucket}")

        obj = s3.get_object(Bucket=bucket, Key=key)

        # Process (e.g., create thumbnail)
        processed_image = obj["Body"].read()  # placeholder: real resizing goes here

        # Caution: writing back to the same bucket can re-trigger this function.
        # Scope the trigger with a prefix filter (e.g., uploads/) or use a separate output bucket.
        s3.put_object(
            Bucket=bucket,
            Key=f"thumbnails/{key}",
            Body=processed_image,
        )

    return {"statusCode": 200}

Part 2: Google Cloud Event Sources

Event Sources in GCP

Source | Trigger Event | Use Case
Cloud Storage | File upload, delete | Image processing
Cloud Pub/Sub | Message published | Broadcasting
Cloud Firestore | Document changed | Real-time sync
Cloud Tasks | Task queued | Async processing
Cloud Scheduler | Time event | Scheduled tasks
Eventarc | Custom events | Event-driven workflows
Bigtable | Row mutation | Stream processing

Cloud Storage Trigger Example

Setup with gcloud:

gcloud functions deploy processImage \
  --runtime python312 \
  --entry-point process_image \
  --trigger-resource my-bucket \
  --trigger-event google.storage.object.finalize

Event Structure:

{
  "name": "uploads/image.jpg",
  "bucket": "my-bucket",
  "contentType": "image/jpeg",
  "size": "102400"
}

Cloud Function (Python):

from google.cloud import storage

storage_client = storage.Client()

def process_image(file, context):
    bucket = storage_client.bucket(file["bucket"])
    source_file = bucket.blob(file["name"])

    print(f"Processing {file['name']} from {file['bucket']}")

    data = source_file.download_as_bytes()

    # Process (e.g., create thumbnail)
    processed_image = data  # placeholder: real resizing goes here

    # Caution: writing back to the same bucket re-triggers the finalize event.
    # Use a separate output bucket or skip objects under thumbnails/.
    dest_file = bucket.blob(f"thumbnails/{file['name']}")
    dest_file.upload_from_string(processed_image)

    print("Upload complete")

Pub/Sub Trigger Example

Publish Message:

gcloud pubsub topics publish my-topic --message "Hello"

Cloud Function:

import base64

def process_pubsub(event, context):
    # The message payload arrives base64-encoded in event["data"].
    # Background functions do not return an HTTP response.
    pubsub_message = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Received message: {pubsub_message}")
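
Messages can also be published from Python rather than the CLI. Below is a minimal sketch using the google-cloud-pubsub publisher client; the project and topic names are placeholders.

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()

# Placeholder project and topic: replace with your own.
topic_path = publisher.topic_path("my-project", "my-topic")

# Data must be bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b"Hello", origin="python-script")
print(f"Published message ID: {future.result()}")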

AWS vs. GCP Event Comparison

Aspect | AWS | GCP
Object Storage | S3 events | Cloud Storage events
Pub/Sub | SNS, SQS | Cloud Pub/Sub
Database Streams | DynamoDB Streams | Firestore, Datastore
Scheduled Tasks | EventBridge, CloudWatch | Cloud Scheduler
Event Format | Service-specific JSON | CloudEvents standard
Retry Behavior | Configurable | Configurable (see GCP docs)
Dead Letter | SQS DLQ | Pub/Sub dead-letter topics
Filtering | Event filtering rules | Attribute filtering
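
As one concrete example of the Filtering row: a Pub/Sub subscription can filter on message attributes so the subscriber only receives matching messages. Below is a minimal sketch with the google-cloud-pubsub client; the project, topic, and subscription names are placeholders.

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()

# Placeholder names: replace with your project, topic, and subscription.
topic_path = subscriber.topic_path("my-project", "my-topic")
subscription_path = subscriber.subscription_path("my-project", "image-events-sub")

# Only messages published with the attribute type="image" are delivered.
subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "filter": 'attributes.type = "image"',
    }
)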

Event Filtering & Error Handling

AWS DLQ Setup

# Function timeout or error occurs
# The configured DLQ captures the failed event automatically

def dlq_handler(event, context):
    for message in event.get("Records", []):
        print(f"Failed event: {message.get('body')}")

GCP Retry & Dead Letter

import base64

def dead_letter_handler(event, context):
    data = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Failed event: {data}")

Event Mapping (Handle Multiple Sources)


EventBridge Scheduled Trigger (formerly CloudWatch Events)

  1. Go to EventBridge Console
  2. Click Create rule
  3. Schedule expression: cron(0 9 * * ? *) (daily at 9 AM UTC)
  4. Target: Lambda function
  5. Click Create (a boto3 sketch of the same setup follows)
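
The same schedule can be created programmatically. Below is a minimal boto3 sketch; the function ARN, rule name, and account details are placeholders.

import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Placeholder values: replace with your own function and account details.
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:daily-report"
RULE_NAME = "daily-9am-trigger"

# Create (or update) a scheduled rule: daily at 9 AM UTC.
rule = events.put_rule(
    Name=RULE_NAME,
    ScheduleExpression="cron(0 9 * * ? *)",
    State="ENABLED",
)

# Point the rule at the Lambda function.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{"Id": "1", "Arn": FUNCTION_ARN}],
)

# Allow EventBridge to invoke the function.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="allow-eventbridge-daily-9am",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)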

DynamoDB Streams Trigger

  1. Go to DynamoDB Console
  2. Select table
  3. Go to Exports and streams
  4. Enable DynamoDB Streams
  5. Go to Lambda > Add trigger
  6. Select DynamoDB
  7. Click Add (a boto3 sketch of the same mapping follows)
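
The same trigger can be created with code. Below is a minimal boto3 sketch, assuming the table's stream is already enabled; the stream ARN and function name are placeholders.

import boto3

lambda_client = boto3.client("lambda")

# Placeholder values: replace with your stream ARN and function name.
STREAM_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/orders/stream/2024-01-01T00:00:00.000"
FUNCTION_NAME = "process-order-changes"

# Create the event source mapping that polls the stream and invokes the function.
lambda_client.create_event_source_mapping(
    EventSourceArn=STREAM_ARN,
    FunctionName=FUNCTION_NAME,
    StartingPosition="LATEST",   # only new changes
    BatchSize=100,               # records per invocation
)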

Handling Different Events

S3 Trigger Example

def handler(event, context):
    record = event.get("Records", [])[0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    print(f"Processing {key} from {bucket}")
    return {"statusCode": 200, "message": "Processed"}

Scheduled Event (CloudWatch)

def handler(event, context):
    print("Scheduled event fired")
    return {"statusCode": 200}

DynamoDB Stream Example

def handler(event, context):
    for record in event.get("Records", []):
        if record.get("eventName") == "INSERT":
            new_item = record["dynamodb"]["NewImage"]
            print("New item:", new_item)

Event Filtering

Reduce Lambda invocations by filtering events:

S3 Filter Example

In AWS Console:

  • Event: s3:ObjectCreated:*
  • Filter rules:
    • Prefix: uploads/
    • Suffix: .jpg

Only JPG files in uploads/ trigger Lambda.
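
The same prefix/suffix filter can be applied when the notification is configured with code. Below is a minimal boto3 sketch, assuming the function already grants S3 permission to invoke it; the bucket name and function ARN are placeholders.

import boto3

s3 = boto3.client("s3")

# Placeholder values: replace with your bucket and function ARN.
BUCKET = "my-bucket"
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:make-thumbnail"

# Note: this call replaces the bucket's existing notification configuration.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": FUNCTION_ARN,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {
                        "FilterRules": [
                            {"Name": "prefix", "Value": "uploads/"},
                            {"Name": "suffix", "Value": ".jpg"},
                        ]
                    }
                },
            }
        ]
    },
)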

Error Handling in Event-Driven Systems

def handler(event, context):
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    try:
        process_image(bucket, key)  # your own processing logic
    except Exception as exc:
        print(f"Error processing {key}: {exc}")
        raise  # re-raise so the platform can retry or dead-letter the event

Dead Letter Queues (DLQ)

If Lambda fails repeatedly, send to DLQ:

  1. Go to Lambda function
  2. Go to Configuration → Asynchronous invocation
  3. Set Failure destination to SQS queue
  4. Failed events accumulate in the queue for investigation (a boto3 sketch follows)
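
The failure destination can also be set with boto3. Below is a minimal sketch; the function name and queue ARN are placeholders.

import boto3

lambda_client = boto3.client("lambda")

# Placeholder values: replace with your function and queue.
FUNCTION_NAME = "make-thumbnail"
DLQ_ARN = "arn:aws:sqs:us-east-1:123456789012:make-thumbnail-dlq"

# After 2 retries, failed async events are sent to the SQS queue.
lambda_client.put_function_event_invoke_config(
    FunctionName=FUNCTION_NAME,
    MaximumRetryAttempts=2,
    DestinationConfig={"OnFailure": {"Destination": DLQ_ARN}},
)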

AWS Event Routing

def handler(event, context):
    # handle_s3, handle_dynamodb, and handle_api are your own handler functions.
    records = event.get("Records", [])
    if records and records[0].get("s3"):
        return handle_s3(event)
    if records and records[0].get("dynamodb"):
        return handle_dynamodb(event)
    if event.get("httpMethod") == "POST":
        return handle_api(event)

GCP Cloud Function Routing

def http_function(request):
    if request.method == "POST":
        return handle_http(request)

def pubsub_function(event, context):
    return handle_pubsub(event)

def storage_function(event, context):
    return handle_storage(event)

Best Practices

  1. Keep event handlers focused: one responsibility per function
  2. Use DLQs: catch failed async events
  3. Filter events early: reduce Lambda invocations
  4. Idempotent processing: handle duplicate events gracefully (see the sketch after this list)
  5. Log event details: full event context makes debugging easier
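
For practice 4, a common pattern is to record each event ID with a conditional write and skip anything already seen. Below is a minimal boto3 sketch, assuming a hypothetical DynamoDB table named processed_events with partition key event_id.

import boto3

dynamodb = boto3.client("dynamodb")

# Placeholder table: partition key "event_id" (string).
TABLE = "processed_events"

def process_once(event_id, process):
    """Run process() only if this event_id has not been seen before."""
    try:
        dynamodb.put_item(
            TableName=TABLE,
            Item={"event_id": {"S": event_id}},
            ConditionExpression="attribute_not_exists(event_id)",
        )
    except dynamodb.exceptions.ConditionalCheckFailedException:
        print(f"Duplicate event {event_id}, skipping")
        return

    # In production you might record the ID only after successful processing,
    # or add a TTL so the table does not grow forever.
    process()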

Hands-On: Image Thumbnail Generator

Build a Lambda that:

  1. Triggers on S3 image upload
  2. Resizes image to thumbnail
  3. Saves thumbnail back to S3
  4. Updates DynamoDB with metadata

Steps:

  • Create S3 bucket
  • Create DynamoDB table for metadata
  • Write Lambda handler
  • Connect S3 trigger
  • Test with image upload (a starting-point sketch follows)
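
Below is a minimal starting point for the handler, assuming Pillow is packaged (for example as a Lambda layer); the table name is a placeholder, and thumbnails are written under a thumbnails/ prefix that the handler skips to avoid re-triggering itself.

from io import BytesIO

import boto3
from PIL import Image

s3 = boto3.client("s3")
dynamodb = boto3.client("dynamodb")

# Placeholder table name: partition key "object_key" (string).
METADATA_TABLE = "thumbnail-metadata"

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Skip objects we wrote ourselves to avoid an infinite trigger loop.
        if key.startswith("thumbnails/"):
            continue

        # Download and resize the image.
        original = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        image = Image.open(BytesIO(original)).convert("RGB")
        image.thumbnail((128, 128))

        buffer = BytesIO()
        image.save(buffer, format="JPEG")

        # Save the thumbnail back to S3.
        thumb_key = f"thumbnails/{key}"
        s3.put_object(Bucket=bucket, Key=thumb_key, Body=buffer.getvalue())

        # Record metadata in DynamoDB.
        dynamodb.put_item(
            TableName=METADATA_TABLE,
            Item={
                "object_key": {"S": key},
                "thumbnail_key": {"S": thumb_key},
                "width": {"N": str(image.width)},
                "height": {"N": str(image.height)},
            },
        )

    return {"statusCode": 200}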

Key Takeaway

Event sources decouple your system. Services communicate through events, not direct calls. This is the foundation of serverless architecture.


Project (Cloud-Agnostic)

Build an event-driven flow where a file upload triggers processing and a notification.

Deliverables:

  1. Describe the event source and trigger.
  2. Map the storage and compute to AWS or GCP.
  3. Explain how failures are handled (retries, DLQ).

If you want feedback, email your write-up to [email protected].

