Skill by ara.so — Daily 2026 Skills collection.
Install via pip:

```bash
pip install ministack
```

Or run with Docker:

```bash
docker run -p 4566:4566 nahuelnucera/ministack
```

Or from source:

```bash
git clone https://github.com/Nahuel990/ministack
cd ministack
docker compose up -d
```

Verify it is running:

```bash
curl http://localhost:4566/_localstack/health
```

| Environment Variable | Default | Description |
|---|---|---|
| `GATEWAY_PORT` | `4566` | Port to listen on |
| | | Set to |
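The same health check can be scripted. A minimal sketch using only the standard library; the `health_url` helper and its `port` argument are ours, the path mirrors the `curl` call above:

```python
import json
import urllib.request

def health_url(port: int = 4566) -> str:
    # MiniStack exposes the LocalStack-compatible health path
    return f"http://localhost:{port}/_localstack/health"

if __name__ == "__main__":
    # Requires a MiniStack container listening on the default port
    with urllib.request.urlopen(health_url()) as resp:
        print(json.dumps(json.load(resp), indent=2))
```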
Make the repo's `awslocal` wrapper executable and use it like the AWS CLI:

```bash
chmod +x bin/awslocal
./bin/awslocal s3 ls
./bin/awslocal dynamodb list-tables
```

From Python, point `boto3` at MiniStack with an explicit `endpoint_url`; the dummy `test` credentials are accepted:

```python
import boto3

ENDPOINT = "http://localhost:4566"

def aws_client(service: str):
    return boto3.client(
        service,
        endpoint_url=ENDPOINT,
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )

def aws_resource(service: str):
    return boto3.resource(
        service,
        endpoint_url=ENDPOINT,
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
```

**S3**

```python
s3 = aws_client("s3")
```

**SQS**

```python
sqs = aws_client("sqs")
```

**DynamoDB**

```python
import json

ddb = aws_client("dynamodb")
```
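The DynamoDB client can then create a table and round-trip an item. A sketch against a running MiniStack; the table name, key names, and `item_for` helper are illustrative:

```python
def item_for(user_id: str, name: str) -> dict:
    # DynamoDB's low-level API uses typed attribute values
    return {"pk": {"S": user_id}, "name": {"S": name}}

if __name__ == "__main__":
    import boto3  # only needed when actually talking to MiniStack

    ddb = boto3.client(
        "dynamodb",
        endpoint_url="http://localhost:4566",
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
    ddb.create_table(
        TableName="users",
        KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",
    )
    ddb.put_item(TableName="users", Item=item_for("u1", "Ada"))
    print(ddb.get_item(TableName="users", Key={"pk": {"S": "u1"}})["Item"])
```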
**SNS fanout to SQS**

```python
sns = aws_client("sns")
sqs = aws_client("sqs")

topic = sns.create_topic(Name="my-topic")
topic_arn = topic["TopicArn"]

queue = sqs.create_queue(QueueName="fan-queue")
queue_attrs = sqs.get_queue_attributes(
    QueueUrl=queue["QueueUrl"], AttributeNames=["QueueArn"]
)
queue_arn = queue_attrs["Attributes"]["QueueArn"]

sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)
```

**Lambda**

```python
import zipfile, io
```
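Only the `zipfile`/`io` imports of the Lambda example survive above; they point at in-memory packaging. A hedged sketch of deploying a one-file handler that way — the function name, role ARN, and handler source are placeholders, not MiniStack requirements:

```python
import io
import zipfile

# Placeholder handler source for a single-file deployment
HANDLER_SRC = (
    "def handler(event, context):\n"
    "    return {\"ok\": True}\n"
)

def build_zip(source: str) -> bytes:
    # Package the handler into an in-memory deployment zip
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("handler.py", source)
    return buf.getvalue()

if __name__ == "__main__":
    import boto3  # only needed when actually talking to MiniStack

    lam = boto3.client(
        "lambda",
        endpoint_url="http://localhost:4566",
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
    lam.create_function(
        FunctionName="demo-fn",
        Runtime="python3.11",
        Role="arn:aws:iam::000000000000:role/demo",  # placeholder role ARN
        Handler="handler.handler",
        Code={"ZipFile": build_zip(HANDLER_SRC)},
    )
```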
**Secrets Manager**

```python
sm = aws_client("secretsmanager")

sm.create_secret(Name="db-password", SecretString='{"password":"s3cr3t"}')

secret = sm.get_secret_value(SecretId="db-password")
print(secret["SecretString"])  # {"password":"s3cr3t"}

sm.update_secret(SecretId="db-password", SecretString='{"password":"newpass"}')
sm.delete_secret(SecretId="db-password", ForceDeleteWithoutRecovery=True)
```
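Since the `SecretString` above holds JSON, a small helper keeps the parsing in one place; `secret_dict` is our name, not a boto3 API:

```python
import json

def secret_dict(response: dict) -> dict:
    # Parse the SecretString payload of a get_secret_value response
    return json.loads(response["SecretString"])

# e.g. secret_dict(sm.get_secret_value(SecretId="db-password"))["password"]
```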
**SSM Parameter Store**

```python
ssm = aws_client("ssm")

ssm.put_parameter(Name="/app/db/host", Value="localhost", Type="String")
ssm.put_parameter(Name="/app/db/password", Value="secret", Type="SecureString")

param = ssm.get_parameter(Name="/app/db/host")
print(param["Parameter"]["Value"])  # localhost
```
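Path queries (listed in the service table) fetch everything under a prefix in one call. A sketch assuming a running MiniStack; `params_to_dict` is our helper:

```python
def params_to_dict(parameters: list) -> dict:
    # Flatten a get_parameters_by_path response into {name: value}
    return {p["Name"]: p["Value"] for p in parameters}

if __name__ == "__main__":
    import boto3  # only needed when actually talking to MiniStack

    ssm = boto3.client(
        "ssm",
        endpoint_url="http://localhost:4566",
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
    resp = ssm.get_parameters_by_path(
        Path="/app/db", Recursive=True, WithDecryption=True
    )
    print(params_to_dict(resp["Parameters"]))
```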
**Kinesis**

```python
import base64

kin = aws_client("kinesis")
kin.create_stream(StreamName="events", ShardCount=1)
kin.put_record(StreamName="events", Data=b'{"event":"click"}', PartitionKey="user1")
```

**EventBridge**

```python
eb = aws_client("events")
```
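The bare `events` client above can register a rule and publish custom events. A sketch against a running MiniStack; the rule, source, and detail-type names are illustrative:

```python
import json

def order_event(order_id: str) -> dict:
    # One entry in the shape put_events expects
    return {
        "Source": "app.orders",
        "DetailType": "OrderPlaced",
        "Detail": json.dumps({"order_id": order_id}),
    }

if __name__ == "__main__":
    import boto3  # only needed when actually talking to MiniStack

    eb = boto3.client(
        "events",
        endpoint_url="http://localhost:4566",
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
    # Match all events from our source on the default bus
    eb.put_rule(
        Name="orders-rule",
        EventPattern=json.dumps({"source": ["app.orders"]}),
    )
    eb.put_events(Entries=[order_event("o-1")])
```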
**CloudWatch Logs**

```python
import time

logs = aws_client("logs")

logs.create_log_group(logGroupName="/app/service")
logs.create_log_stream(logGroupName="/app/service", logStreamName="stream-1")

logs.put_log_events(
    logGroupName="/app/service",
    logStreamName="stream-1",
    logEvents=[
        {"timestamp": int(time.time() * 1000), "message": "App started"},
        {"timestamp": int(time.time() * 1000), "message": "Request received"},
    ],
)

events = logs.get_log_events(
    logGroupName="/app/service",
    logStreamName="stream-1",
)
for e in events["events"]:
    print(e["message"])
```
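CloudWatch Metrics (see the service table) works the same way. A sketch publishing one custom datum against a running MiniStack; the namespace, metric name, and `datum` helper are our own:

```python
def datum(name: str, value: float) -> dict:
    # One MetricData entry for put_metric_data
    return {"MetricName": name, "Value": value, "Unit": "Count"}

if __name__ == "__main__":
    import boto3  # only needed when actually talking to MiniStack

    cw = boto3.client(
        "cloudwatch",
        endpoint_url="http://localhost:4566",
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )
    cw.put_metric_data(Namespace="App/Service", MetricData=[datum("Requests", 1)])
```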
---

Using MiniStack from pytest:

```python
import pytest
import boto3

MINISTACK_ENDPOINT = "http://localhost:4566"

@pytest.fixture(scope="session")
def aws_endpoint():
    return MINISTACK_ENDPOINT

@pytest.fixture
def s3_client(aws_endpoint):
    return boto3.client(
        "s3",
        endpoint_url=aws_endpoint,
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )

@pytest.fixture
def test_bucket(s3_client):
    bucket = "test-bucket"
    s3_client.create_bucket(Bucket=bucket)
    yield bucket
    # Cleanup: empty the bucket before deleting it
    objs = s3_client.list_objects_v2(Bucket=bucket).get("Contents", [])
    for obj in objs:
        s3_client.delete_object(Bucket=bucket, Key=obj["Key"])
    s3_client.delete_bucket(Bucket=bucket)

def test_upload_download(s3_client, test_bucket):
    s3_client.put_object(Bucket=test_bucket, Key="test.txt", Body=b"hello")
    resp = s3_client.get_object(Bucket=test_bucket, Key="test.txt")
    assert resp["Body"].read() == b"hello"
```
```python
import os
import boto3
```
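These imports suggest environment-driven configuration: switch between MiniStack and real AWS with an optional endpoint variable. `AWS_ENDPOINT_URL` and `endpoint_from_env` here are our convention, not something MiniStack requires (recent boto3 versions also honor `AWS_ENDPOINT_URL` natively):

```python
import os

def endpoint_from_env(env=None):
    # Returns the override endpoint, or None to target real AWS
    env = os.environ if env is None else env
    return env.get("AWS_ENDPOINT_URL")

# Usage with the aws_client helper pattern from earlier:
#   boto3.client("s3", endpoint_url=endpoint_from_env())
# AWS_ENDPOINT_URL=http://localhost:4566 python app.py  -> talks to MiniStack
```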
---

| Service | Key Operations |
|---|---|
| S3 | CRUD, multipart, versioning, encryption, lifecycle, CORS, ACL, notifications |
| SQS | Standard & FIFO queues, DLQ, batch ops |
| SNS | Topics, subscriptions, fanout to SQS/Lambda, platform endpoints |
| DynamoDB | Tables, CRUD, Query, Scan, TTL, transactions, batch ops |
| Lambda | Python runtimes, invoke, SQS event sources, Function URLs |
| IAM | Users, roles, policies, groups, instance profiles, OIDC |
| STS | GetCallerIdentity, AssumeRole, GetSessionToken |
| SecretsManager | Full CRUD, rotation, versioning |
| SSM Parameter Store | String, SecureString, StringList, path queries |
| EventBridge | Buses, rules, targets, Lambda dispatch |
| Kinesis | Streams, shards, records, iterators |
| CloudWatch Metrics | PutMetricData, alarms, dashboards, CBOR protocol |
| CloudWatch Logs | Log groups/streams, filter with globs, metric filters |
| SES | Send email, templates, configuration sets |
| Step Functions | State machine CRUD |
| RDS | Spins up real Postgres/MySQL containers |
| ElastiCache | Spins up real Redis containers |
| Athena | Real SQL via DuckDB |
| ECS | Real Docker containers |
**`NoCredentialsError` from boto3**
```bash
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_DEFAULT_REGION=us-east-1
```
**`InvalidSignatureException`**
- This is usually a region mismatch. Ensure `region_name="us-east-1"` matches across all clients.

**Lambda function not found after create**
- MiniStack executes Python runtimes with a warm worker pool. Wait briefly or invoke with `InvocationType="Event"` for async.

**S3 data lost on restart**
**Port conflict**
```bash
GATEWAY_PORT=5000 ministack
```
**Migrating from LocalStack**
- Keep all `http://localhost:4566` endpoint URLs; they stay the same.
- Remove `LOCALSTACK_AUTH_TOKEN` / `LOCALSTACK_API_KEY` env vars (not needed).
- Replace the `localstack/localstack` Docker image with `nahuelnucera/ministack`.
- All `boto3` client code works without modification.