AWS S3 LocalStack
This example demonstrates how to copy local files to LocalStack, a local AWS cloud service emulator that mimics S3 for development and testing.
What LocalStack does
LocalStack allows you to run AWS services locally in Docker containers, eliminating the need to connect to actual AWS infrastructure during development. This is particularly useful for testing S3 operations without incurring costs.
Setup
Bring up the containers
docker compose up
Check the LocalStack container is running
docker ps
...
data-workspace-frontend-data-workspace-localstack
...
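Before running any S3 commands, it can help to confirm the emulator is actually ready. Recent LocalStack versions expose a health endpoint on the edge port (older versions use /health instead); this assumes the default 4566 port mapping from docker-compose.yml:

```shell
# Query LocalStack's health endpoint on the edge port (4566).
# The JSON response includes a "services" map; s3 should be listed
# as "available" or "running".
curl -s http://localhost:4566/_localstack/health
```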
Set local environment variables
There's no need to set an AWS profile or an AWS region when running LocalStack; it's simpler to set two environment variables:
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
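LocalStack accepts any non-empty credentials, so the values above are arbitrary. A quick way to confirm the CLI can reach the emulator is an STS identity call; if your CLI complains about a missing region, a region can also be exported (LocalStack ignores its value):

```shell
# Any region satisfies the CLI's region requirement; LocalStack ignores it.
export AWS_DEFAULT_REGION=us-east-1

# Should return a dummy identity from LocalStack rather than a real
# AWS account, confirming the endpoint and credentials are wired up.
aws --endpoint-url=http://localhost:4566 sts get-caller-identity
```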
Configuration
Files React Configuration
Open files-react.html and print the YOURFILES_CONFIG
object to the browser console.
{
"region": "aws-region",
"rootUrl": "/files/",
"bucketName": "notebooks.dataworkspace.local",
"teamsPrefix": "teams/",
"rootPrefix": "user/federated/4f54003249593a7eccd6e9aa732d14b9eb53d8f57829020da8b2cdad6f35bbc2/",
"initialPrefix": "user/federated/4f54003249593a7eccd6e9aa732d14b9eb53d8f57829020da8b2cdad6f35bbc2/",
"bigdataPrefix": "bigdata/",
"credentialsUrl": "/api/v1/aws_credentials",
"endpointUrl": "http://data-workspace-localstack:4566",
"createTableUrl": "/files/create-table/confirm",
"teamsPrefixes": [],
"s3Path": ""
}
List buckets
aws --endpoint-url=http://localhost:4566 s3 ls
2025-09-05 16:06:33 notebooks.dataworkspace.local
2025-09-05 16:06:36 uploads.dataworkspace.local
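Typing --endpoint-url on every command gets repetitive. If you install the optional awslocal wrapper (the awscli-local package on PyPI), it injects the LocalStack endpoint automatically:

```shell
# awslocal is a thin wrapper around the aws CLI that targets LocalStack.
pip install awscli-local

# Equivalent to: aws --endpoint-url=http://localhost:4566 s3 ls
awslocal s3 ls
```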
Endpoint URLs Explained
From host machine (CLI): localhost:4566
From Docker containers: data-workspace-localstack:4566
The Docker port mapping 4563-4599:4563-4599 (see data-workspace-localstack in docker-compose.yml) makes LocalStack accessible via localhost from your host, while containers use the container name to communicate within the Docker network.
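To see the in-network endpoint in action, the same bucket listing can be run from inside one of the compose containers. The container name used here (data-workspace-frontend) is an assumption — check docker ps for the actual name — and the aws CLI must be installed inside that container:

```shell
# Exec into a container on the same Docker network and reach LocalStack
# by its container name instead of localhost.
docker exec -it data-workspace-frontend \
  aws --endpoint-url=http://data-workspace-localstack:4566 s3 ls
```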
Working with Files
Create test files to upload
echo 'root' > root.txt && \
echo 'team' > team.txt && \
echo 'bigdata' > bigdata.txt && \
echo 'mydataset' > mydataset.txt
Full S3 URL pattern
s3://{bucketName}/{rootPrefix}{additional-path}
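For example, substituting the bucketName and rootPrefix from the config dump above, the destination URL for a file under team/ expands as follows (plain shell string substitution, no AWS calls involved):

```shell
# Values taken from the YOURFILES_CONFIG dump above.
BUCKET=notebooks.dataworkspace.local
ROOT_PREFIX=user/federated/4f54003249593a7eccd6e9aa732d14b9eb53d8f57829020da8b2cdad6f35bbc2/

# {additional-path} here is team/team.txt
URL="s3://${BUCKET}/${ROOT_PREFIX}team/team.txt"
echo "$URL"
```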
Copy a local file to an S3 bucket in LocalStack
The following commands use the rootPrefix
from your config. Replace {user-federated-id}
with your actual federated user ID:
4f54003249593a7eccd6e9aa732d14b9eb53d8f57829020da8b2cdad6f35bbc2
Copy root.txt to the root of your user prefix
aws --endpoint-url=http://localhost:4566 s3 cp root.txt s3://notebooks.dataworkspace.local/user/federated/{user-federated-id}/
Copy team.txt to team/
aws --endpoint-url=http://localhost:4566 s3 cp team.txt s3://notebooks.dataworkspace.local/user/federated/{user-federated-id}/team/
Copy bigdata.txt to bigdata/
aws --endpoint-url=http://localhost:4566 s3 cp bigdata.txt s3://notebooks.dataworkspace.local/user/federated/{user-federated-id}/bigdata/
Copy mydataset.txt to bigdata/datasets/
aws --endpoint-url=http://localhost:4566 s3 cp mydataset.txt s3://notebooks.dataworkspace.local/user/federated/{user-federated-id}/bigdata/datasets/
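The four copies above can also be scripted in one pass. This is a sketch using the same bucket and federated user ID as the individual commands; each list entry is a file:destination-suffix pair:

```shell
ID=4f54003249593a7eccd6e9aa732d14b9eb53d8f57829020da8b2cdad6f35bbc2
BASE=s3://notebooks.dataworkspace.local/user/federated/${ID}

# file:destination-suffix pairs matching the individual commands above.
# An empty suffix (root.txt:) copies to the root of the user prefix.
for pair in root.txt: team.txt:team/ bigdata.txt:bigdata/ mydataset.txt:bigdata/datasets/; do
  file=${pair%%:*}   # text before the first colon
  dest=${pair#*:}    # text after the first colon (may be empty)
  aws --endpoint-url=http://localhost:4566 s3 cp "$file" "${BASE}/${dest}"
done
```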
View test files
aws --endpoint-url=http://localhost:4566 s3 ls s3://notebooks.dataworkspace.local --recursive
2025-09-06 17:05:55 8 user/federated/{user-federated-id}/bigdata/bigdata.txt
2025-09-06 17:06:03 10 user/federated/{user-federated-id}/bigdata/datasets/mydataset.txt
2025-09-06 16:59:27 5 user/federated/{user-federated-id}/root.txt
2025-09-06 17:05:43 5 user/federated/{user-federated-id}/team/team.txt
Cleanup
Unset the local environment variables
unset AWS_ACCESS_KEY_ID
unset AWS_SECRET_ACCESS_KEY
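If you're finished with the environment entirely, the containers brought up earlier can be stopped as well:

```shell
# Stop and remove the compose containers and their network.
docker compose down
```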