Developing with AWS comes with its own set of challenges. If your organization has strict policies on cloud resources, prototyping with AWS services can become a hassle. LocalStack is a container-based technology that brings a comprehensive set of AWS services to your local machine. It plays well with the official AWS CLI and SDKs. In this guide, I’ll talk about how to set up LocalStack and use it with the AWS CLI.
The examples in this post use
- Docker Engine 20.10.17
- AWS CLI 2.7.14
Configuring a local AWS account
LocalStack works with a local AWS account that you can configure with the AWS CLI. Launch the aws configure command as follows.
aws configure
AWS Access Key ID [None]: gwen
AWS Secret Access Key [None]: stacy
Default region name [None]: us-east-1
Default output format [None]: json
You can put a fake AWS Access Key ID and AWS Secret Access Key here; although LocalStack requires credentials to be configured, it doesn’t validate them.
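Alternatively, if you’d rather not touch your default profile, you can export the standard AWS environment variables instead; this keeps the fake credentials scoped to the current shell session. The values below simply mirror the ones above.

```shell
# Fake credentials, scoped to the current shell session only
export AWS_ACCESS_KEY_ID=gwen
export AWS_SECRET_ACCESS_KEY=stacy
export AWS_DEFAULT_REGION=us-east-1
```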
Launching the LocalStack container
Pull the latest LocalStack image from Docker Hub.
docker pull localstack/localstack:latest
Create a Compose file (say, localstack.yml) as follows.
version: '3'

services:
  aws:
    image: localstack/localstack:latest
    environment:
      DEBUG: 1
      LAMBDA_DOCKER_NETWORK: my-local-aws-network
      LAMBDA_REMOTE_DOCKER: 0
      SERVICES: s3,sqs,secretsmanager
    ports:
      - 4566:4566
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

networks:
  default:
    name: my-local-aws-network
You can now launch the container with the following command.
docker compose -f localstack.yml up -d
Once the container is up and running, open a terminal and ping the healthcheck endpoint. If things are working, you will see each of the running services reported as “available”.
curl localhost:4566/health
{
  "features": {
    "initScripts": "initialized"
  },
  "services": {
    "apigateway": "available",
    "cloudformation": "available",
    "cloudwatch": "available",
    "config": "available",
    "dynamodb": "available",
    "dynamodbstreams": "available",
    "ec2": "available",
    "es": "available",
    "events": "available",
    "firehose": "available",
    "iam": "available",
    "kinesis": "available",
    "kms": "available",
    "lambda": "available",
    "logs": "available",
    "opensearch": "available",
    "redshift": "available",
    "resource-groups": "available",
    "resourcegroupstaggingapi": "available",
    "route53": "available",
    "route53resolver": "available",
    "s3": "available",
    "s3control": "available",
    "secretsmanager": "available",
    "ses": "available",
    "sns": "available",
    "sqs": "available",
    "ssm": "available",
    "stepfunctions": "available",
    "sts": "available",
    "support": "available",
    "swf": "available"
  },
  "version": "1.0.0.dev"
}
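If you script against LocalStack (for example in CI), it helps to wait until a service is actually available before firing commands at it. Here’s a small sketch of a helper that greps the healthcheck JSON for a given service; the function name is my own, not part of LocalStack.

```shell
# Succeeds when the health JSON ($1) reports service $2 as "available"
service_available() {
  echo "$1" | grep -q "\"$2\": \"available\""
}

# Example polling loop (assumes curl and the container above):
# until service_available "$(curl -s localhost:4566/health)" s3; do sleep 2; done
```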
Working with AWS services
You can now use AWS services (such as S3, SNS, SQS, Secrets Manager, etc.) through port 4566. You can find the list of core AWS services available on LocalStack in the official documentation. Let’s explore some services with the AWS CLI now.
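Typing --endpoint-url on every command gets old quickly. A small wrapper function keeps things shorter; I’ve named it after LocalStack’s own awslocal helper, which you can also install separately. The rest of this post spells the flag out for clarity.

```shell
# Forward all arguments to the AWS CLI, pointed at LocalStack
awslocal() {
  aws --endpoint-url http://localhost:4566 "$@"
}

# Usage: awslocal s3api list-buckets
```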
Saving objects on S3
Create a sample file named sample.json as follows, to save it on S3.
{
  "name": "Madame Uppercut",
  "age": 39,
  "secretIdentity": "Jane Wilson",
  "powers": [
    "Million tonne punch",
    "Damage resistance",
    "Superhuman reflexes"
  ]
}
Let’s create a bucket, say my-bucket, as follows.
aws --endpoint-url http://localhost:4566 s3api create-bucket --bucket my-bucket --region us-east-1
{
  "Location": "/my-bucket"
}
You can list all the buckets with the following command.
aws --endpoint-url http://localhost:4566 s3api list-buckets
{
  "Buckets": [
    {
      "Name": "my-bucket",
      "CreationDate": "2022-07-12T13:44:44+00:00"
    }
  ],
  "Owner": {
    "DisplayName": "webfile",
    "ID": "bcaf1ffd86f41161ca5fb16fd081034f"
  }
}
Now, you can upload the sample.json file to the new bucket.

aws --endpoint-url http://localhost:4566 s3 cp sample.json s3://my-bucket/inner/sample.json --content-type 'application/json'
upload: ./sample.json to s3://my-bucket/inner/sample.json
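The --content-type flag matters here because s3 cp otherwise guesses the MIME type. If you upload many files in a script, a tiny helper that maps extensions to content types saves repetition; this is a sketch of my own, covering only a couple of types.

```shell
# Map a filename to a Content-Type by extension (small, hypothetical mapping)
content_type_for() {
  case "$1" in
    *.json) echo "application/json" ;;
    *.txt)  echo "text/plain" ;;
    *)      echo "application/octet-stream" ;;
  esac
}

# Usage:
# aws --endpoint-url http://localhost:4566 s3 cp sample.json \
#   s3://my-bucket/inner/sample.json --content-type "$(content_type_for sample.json)"
```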
You can download the existing file from the S3 bucket as follows.
aws --endpoint-url http://localhost:4566 s3 cp s3://my-bucket/inner/sample.json sample2.json --content-type 'application/json'
download: s3://my-bucket/inner/sample.json to ./sample2.json
To delete the file, you can use the following command.
aws --endpoint-url http://localhost:4566 s3 rm s3://my-bucket/inner/sample.json
delete: s3://my-bucket/inner/sample.json
Finally, you can delete the bucket as follows.
aws --endpoint-url http://localhost:4566 s3api delete-bucket --bucket my-bucket
Refer to the s3 and s3api docs for more operations to try with LocalStack.
Publishing messages with SQS
You can use the following command to create a queue called my-queue.
aws --endpoint-url http://localhost:4566 sqs create-queue --queue-name my-queue
{
  "QueueUrl": "http://localhost:4566/000000000000/my-queue"
}
To verify that the queue is available, list all the queues as follows.
aws --endpoint-url http://localhost:4566 sqs list-queues
{
  "QueueUrls": [
    "http://localhost:4566/000000000000/my-queue"
  ]
}
Let’s publish a message using the send-message command.

aws --endpoint-url http://localhost:4566 sqs send-message --queue-url http://localhost:4566/000000000000/my-queue --message-body "Gwen"
{
  "MD5OfMessageBody": "030997f386c4663f2c3e9594308c60b4",
  "MessageId": "8c6257d2-84c8-4689-a6a1-1a37b1faa3ec"
}
You can read the published messages through the receive-message command.
aws --endpoint-url http://localhost:4566 sqs receive-message --queue-url http://localhost:4566/000000000000/my-queue
{
  "Messages": [
    {
      "MessageId": "8c6257d2-84c8-4689-a6a1-1a37b1faa3ec",
      "ReceiptHandle": "ZDYzMmRjMmUtNWY2Yi00NzRmLWI1ZjQtYTYwNGJiZGRkMGFjIGFybjphd3M6c3FzOnVzLWVhc3QtMTowMDAwMDAwMDAwMDA6bXktcXVldWUgOGM2MjU3ZDItODRjOC00Njg5LWE2YTEtMWEzN2IxZmFhM2VjIDE2NTc2MzQwMDIuNzE3MDIyNA==",
      "MD5OfBody": "030997f386c4663f2c3e9594308c60b4",
      "Body": "Gwen"
    }
  ]
}
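The delete-message call below needs the ReceiptHandle from the receive-message output. Without jq installed, a quick (and admittedly fragile) sed extraction works on the pretty-printed JSON shown above; the helper name is my own.

```shell
# Extract the ReceiptHandle value from receive-message JSON on stdin
# (naive: assumes the pretty-printed "key": "value" layout shown above)
receipt_handle() {
  sed -n 's/.*"ReceiptHandle": "\([^"]*\)".*/\1/p'
}

# Usage:
# handle=$(aws --endpoint-url http://localhost:4566 sqs receive-message \
#   --queue-url http://localhost:4566/000000000000/my-queue | receipt_handle)
```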
Finally, to delete a message, you can use the delete-message command as follows. To delete the queue, use the delete-queue command.
aws --endpoint-url http://localhost:4566 sqs delete-message --queue-url http://localhost:4566/000000000000/my-queue --receipt-handle ZDYzMmRjMmUtNWY2Yi00NzRmLWI1ZjQtYTYwNGJiZGRkMGFjIGFybjphd3M6c3FzOnVzLWVhc3QtMTowMDAwMDAwMDAwMDA6bXktcXVldWUgOGM2MjU3ZDItODRjOC00Njg5LWE2YTEtMWEzN2IxZmFhM2VjIDE2NTc2MzQwMDIuNzE3MDIyNA==
aws --endpoint-url http://localhost:4566 sqs delete-queue --queue-url http://localhost:4566/000000000000/my-queue
For more operations, check the sqs docs.
Creating secrets with Secrets Manager
To create a secret, you can use the create-secret command as follows.

aws --endpoint-url http://localhost:4566 secretsmanager create-secret --name my-secret --secret-string '{"PG_PASSWORD":"stacy"}'
{
  "ARN": "arn:aws:secretsmanager:us-east-1:000000000000:secret:my-secret-b3dd81",
  "Name": "my-secret",
  "VersionId": "33395f3b-6f75-4c48-9424-33c730538063"
}
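If you build the --secret-string JSON from shell variables, quoting gets fiddly. A tiny formatter helps; it is naive and assumes the key and value contain no quotes or backslashes.

```shell
# Format a single key/value pair as the JSON expected by --secret-string
# (naive: no escaping of quotes or backslashes in the inputs)
secret_json() {
  printf '{"%s":"%s"}' "$1" "$2"
}

# Usage:
# aws --endpoint-url http://localhost:4566 secretsmanager create-secret \
#   --name my-secret --secret-string "$(secret_json PG_PASSWORD stacy)"
```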
You can also list all the secrets available on the Secrets Manager.
aws --endpoint-url http://localhost:4566 secretsmanager list-secrets
{
  "SecretList": [
    {
      "ARN": "arn:aws:secretsmanager:us-east-1:000000000000:secret:my-secret-b3dd81",
      "Name": "my-secret",
      "LastChangedDate": "2022-07-12T19:27:05.440032+05:30",
      "Tags": [],
      "SecretVersionsToStages": {
        "33395f3b-6f75-4c48-9424-33c730538063": [
          "AWSCURRENT"
        ]
      },
      "CreatedDate": "2022-07-12T19:27:05.440032+05:30"
    }
  ]
}
You can read a secret with the get-secret-value command.
aws --endpoint-url http://localhost:4566 secretsmanager get-secret-value --secret-id my-secret
{
  "ARN": "arn:aws:secretsmanager:us-east-1:000000000000:secret:my-secret-b3dd81",
  "Name": "my-secret",
  "VersionId": "33395f3b-6f75-4c48-9424-33c730538063",
  "SecretString": "{\"PG_PASSWORD\":\"stacy\"}",
  "VersionStages": [
    "AWSCURRENT"
  ],
  "CreatedDate": "2022-07-12T19:27:05.440032+05:30"
}
Finally, you can delete a secret with its ARN.
aws --endpoint-url http://localhost:4566 secretsmanager delete-secret --secret-id arn:aws:secretsmanager:us-east-1:000000000000:secret:my-secret-b3dd81
{
  "ARN": "arn:aws:secretsmanager:us-east-1:000000000000:secret:my-secret-b3dd81",
  "Name": "my-secret",
  "DeletionDate": "2022-08-11T19:29:39.904093+05:30"
}
For more operations, check out the secretsmanager docs.
Conclusion
- LocalStack is geared toward CLI- and SDK-driven workflows. If you need a desktop app, you can check out Commandeer, or the paid LocalStack subscriptions, which offer a Web UI.
- Support for some AWS services (ElastiCache, ECS, EKS, and others) requires a subscription.