Serverless infrastructure that builds AWS Lambda Python layers on-demand using EC2 Spot instances and Docker, with a GitHub Pages frontend.
```
┌─────────────────────────────────────────────────────────────────────┐
│                    GitHub Pages (docs/index.html)                   │
│ ┌────────────────────────────────────────────────────────────┐      │
│ │ requirements.txt │ Python version │ Architecture │ Submit  │      │
│ └──────────────────────────┬─────────────────────────────────┘      │
└────────────────────────────┼────────────────────────────────────────┘
                             │ POST /builds
                             ▼
┌─────────────────────────────────────────────────────────────────────┐
│                        API Gateway (HTTP API)                       │
│   POST /builds      → submit_build Lambda                           │
│   GET  /builds/{id} → check_status Lambda                           │
└───────────┬──────────────────────────────────────┬──────────────────┘
            │                                      │
            ▼                                      ▼
  ┌───────────────────┐                ┌───────────────────────┐
  │  submit_build λ   │                │    check_status λ     │
  │ • Validates input │                │ • Reads DynamoDB      │
  │ • Creates record  │                │ • Generates presigned │
  │ • Sends to SQS    │                │   S3 download URLs    │
  └─────────┬─────────┘                └───────────┬───────────┘
            │                                      │
            ▼                                      ▼
  ┌───────────────────┐                ┌───────────────────────┐
  │  SQS Build Queue  │                │       DynamoDB        │
  │    (with DLQ)     │                │   buildId | status    │
  └─────────┬─────────┘                │   s3_keys | TTL       │
            │                          └───────────────────────┘
            ▼                                      ▲
  ┌───────────────────┐                            │
  │  process_build λ  │                            │
  │ • Launches EC2    │                            │
  │   Spot instance   │                            │
  └─────────┬─────────┘                            │
            │                                      │
            ▼                                      │
┌──────────────────────────────────────────────────┼──────────────────┐
│  EC2 Spot Instance                               │                  │
│  ┌─────────────────────────────────┐             │                  │
│  │ 1. Install Docker               │             │                  │
│  │ 2. Pull/build Docker image      │             │                  │
│  │ 3. Run container to build       │             │                  │
│  │    Lambda layer zip files       │             │                  │
│  │ 4. Upload zips to S3 ───────────┼──┐          │                  │
│  │ 5. Update DynamoDB status ──────┼──┼──────────┘                  │
│  │ 6. Self-terminate               │  │                             │
│  └─────────────────────────────────┘  │                             │
└───────────────────────────────────────┼─────────────────────────────┘
                                        │
                                        ▼
                              ┌───────────────────┐
                              │   S3 Artifacts    │
                              │ builds/{id}/*.zip │
                              │  Lifecycle: 24h   │
                              └───────────────────┘
```
- User opens GitHub Pages, enters `requirements.txt` content, selects Python version & architecture
- API Gateway routes `POST /builds` to the `submit_build` Lambda
- `submit_build` validates input, creates a DynamoDB record (`QUEUED`), sends an SQS message
- SQS triggers the `process_build` Lambda
- `process_build` launches an EC2 Spot instance with a user-data script
- The EC2 instance installs Docker, pulls pre-built images from GHCR (or builds from the Dockerfile), runs the build, uploads zips to S3, updates DynamoDB (`COMPLETED`), and self-terminates
- The frontend polls `GET /builds/{id}`, which returns the status plus presigned S3 download URLs
- Artifacts auto-expire from S3 after a configurable TTL (default 24h)
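The validation step in `submit_build` can be sketched as follows. This is illustrative only: the accepted Python versions and error messages are assumptions, not the deployed Lambda's actual code.

```python
# Sketch of submit_build's input validation (illustrative; the deployed
# Lambda's accepted versions and error strings may differ).
SUPPORTED_PYTHONS = {"3.11", "3.12", "3.13"}   # assumed supported set
SUPPORTED_ARCHS = {"x86_64", "arm64"}


def validate_request(body: dict) -> list:
    """Return a list of validation errors; an empty list means accepted."""
    errors = []
    if not body.get("requirements", "").strip():
        errors.append("requirements must be non-empty requirements.txt content")
    if body.get("python_version") not in SUPPORTED_PYTHONS:
        errors.append("unsupported python_version")
    archs = body.get("architectures") or []
    if not archs or not set(archs) <= SUPPORTED_ARCHS:
        errors.append("architectures must be a non-empty subset of arm64/x86_64")
    return errors
```

On success the record is written to DynamoDB as `QUEUED` and a message is sent to the build queue.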
| Component | Cost | Notes |
|---|---|---|
| EC2 Spot (c5.xlarge) | ~$0.04/hr | ~$0.01 per build (15 min avg) |
| S3 | ~$0.023/GB/month | Artifacts auto-expire |
| Lambda | ~$0.20/1M requests | Minimal usage |
| API Gateway | $1.00/1M requests | HTTP API pricing |
| DynamoDB | Pay-per-request | ~$0.00 for low volume |
| SQS | $0.40/1M messages | Negligible |
| Total (idle) | ~$0/month | No running infrastructure |
| Per build | ~$0.01-0.03 | Spot instance + S3 |
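The per-build figure follows directly from the Spot rate and the average build time:

```python
# Rough per-build EC2 cost from the table's numbers (excludes S3/request fees).
spot_hourly = 0.04          # c5.xlarge Spot, approx. $/hr
build_minutes = 15          # average build duration
ec2_cost_per_build = spot_hourly * build_minutes / 60
print(f"~${ec2_cost_per_build:.2f} per build")
```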
- AWS account with permissions to create VPC, EC2, Lambda, S3, SQS, DynamoDB, API Gateway, IAM
- Terraform >= 1.5.0
- AWS CLI configured (`aws configure`)
```bash
cd infrastructure/terraform

# Copy and customize configuration
cp terraform.tfvars.example terraform.tfvars
# Edit terraform.tfvars with your preferences

# Initialize and deploy
terraform init
terraform plan
terraform apply
```

After deployment, note the `api_url` output:

```
Outputs:

api_url = "https://xxxxxxxxxx.execute-api.eu-central-1.amazonaws.com"
```
- In your GitHub repository: Settings → Pages → Source: Deploy from a branch
- Select Branch: `main`, Folder: `/docs`
- Open your GitHub Pages URL
- Click ⚙ API Settings and paste the `api_url` from the Terraform output
- Start building layers!
| Variable | Default | Description |
|---|---|---|
| `aws_region` | `eu-central-1` | AWS region |
| `environment` | `prod` | Environment name |
| `artifact_ttl_hours` | `24` | Hours to keep artifacts in S3 |
| `ec2_instance_type` | `c5.xlarge` | Spot instance type |
| `ec2_volume_size` | `50` | EBS volume size (GB) |
| `ec2_max_build_time_minutes` | `30` | Safety timeout per build |
| `allowed_origins` | `["*"]` | CORS origins |
| `docker_image_prefix` | `ghcr.io/fok666/lambda-python-layer` | Pre-built image registry |
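A `terraform.tfvars` wiring these variables together might look like the following; the values are illustrative, and the `allowed_origins` entry is a placeholder for your own GitHub Pages origin:

```hcl
# terraform.tfvars — illustrative values only
aws_region                 = "eu-central-1"
environment                = "prod"
artifact_ttl_hours         = 24
ec2_instance_type          = "c5.xlarge"
ec2_volume_size            = 50
ec2_max_build_time_minutes = 30
allowed_origins            = ["https://example.github.io"]   # placeholder origin
docker_image_prefix        = "ghcr.io/fok666/lambda-python-layer"
```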
Submit a new build request.

```json
{
  "requirements": "numpy==1.26.4\nrequests==2.32.4",
  "python_version": "3.13",
  "architectures": ["x86_64", "arm64"],
  "single_file": true
}
```

Response:

```json
{
  "build_id": "a1b2c3d4-...",
  "status": "QUEUED",
  "expires_at": 1709398800
}
```

Check build status. Returns presigned download URLs when completed.
Response (completed):

```json
{
  "build_id": "a1b2c3d4-...",
  "status": "COMPLETED",
  "python_version": "3.13",
  "architectures": ["x86_64", "arm64"],
  "files": [
    {
      "filename": "combined-python3.13-x86_64.zip",
      "download_url": "https://s3.amazonaws.com/...",
      "architecture": "x86_64"
    },
    {
      "filename": "combined-python3.13-aarch64.zip",
      "download_url": "https://s3.amazonaws.com/...",
      "architecture": "arm64"
    }
  ]
}
```

- S3 bucket: Private, no public access. Downloads via presigned URLs only
- EC2 instances: No SSH, no inbound ports. Egress-only security group
- IMDSv2: Enforced on all EC2 instances
- EBS encryption: Enabled by default
- IAM: Least-privilege policies per component
- DynamoDB TTL: Automatic cleanup of old records
- S3 lifecycle: Automatic deletion of old artifacts
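The submit-then-poll flow exposed by the API above can be driven from any HTTP client. A minimal Python sketch follows; the endpoint path and field names come from the API examples, while the `FAILED` terminal status and the polling intervals are assumptions:

```python
import json
import time
import urllib.request


def fetch_status(api_url: str, build_id: str) -> dict:
    """GET /builds/{id} and decode the JSON body."""
    with urllib.request.urlopen(f"{api_url}/builds/{build_id}") as resp:
        return json.load(resp)


def wait_for_build(api_url: str, build_id: str, *, interval: float = 10.0,
                   timeout: float = 1800.0, fetch=fetch_status) -> dict:
    """Poll until the build reaches a terminal state and return the record.

    `fetch` is injectable for testing; a FAILED terminal status is assumed.
    """
    deadline = time.monotonic() + timeout
    while True:
        record = fetch(api_url, build_id)
        if record["status"] in ("COMPLETED", "FAILED"):
            return record
        if time.monotonic() >= deadline:
            raise TimeoutError(f"build {build_id} still {record['status']}")
        time.sleep(interval)
```

On completion, each entry in `record["files"]` carries a presigned `download_url` valid until the artifact expires.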
```bash
cd infrastructure/terraform
terraform destroy
```

Note: S3 bucket must be empty before destruction. Terraform will fail if artifacts exist. Wait for lifecycle expiration or manually empty the bucket.
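If you need to empty the bucket by hand, a boto3-style helper could look like this. It is a convenience sketch, not part of the repo: the S3 client is passed in (so the function is testable), and the bucket name placeholder in the usage note is whatever name Terraform created.

```python
def empty_bucket(s3, bucket: str) -> int:
    """Delete every object in `bucket` so `terraform destroy` can proceed.

    `s3` is a boto3 S3 client (injected for testability);
    returns the number of objects deleted.
    """
    deleted = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        keys = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
        if keys:
            s3.delete_objects(Bucket=bucket, Delete={"Objects": keys})
            deleted += len(keys)
    return deleted
```

Usage: `empty_bucket(boto3.client("s3"), "<artifact-bucket-name>")`, then re-run `terraform destroy`.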