# OSS Scheduled Local Sync

Plan and validate scheduled local-folder-to-OSS uploads with an aliyun-CLI-first workflow.
## Scenario Description

This skill covers the case where a local folder changes over time and must be uploaded to OSS on a recurring schedule.

Architecture:

Local folder + aliyun CLI (integrated ossutil) + cron/Task Scheduler + OSS Bucket

Capability split:

- aliyun CLI: installation checks, profile verification, command discovery, OSS-side verification, and integrated upload/list commands.
- Host OS scheduler: cron or Windows Task Scheduler configuration.
- Manual/Console: RAM policy attachment and optional visual verification in the OSS Console.
## Installation

Pre-check: aliyun CLI >= 3.3.3 required.

Run `aliyun version` to verify. If the CLI is not installed or the version is too low, see references/cli-installation-guide.md for installation instructions.

Then run:

```bash
aliyun configure set --auto-plugin-install true
```

Finally, enable AI safety mode to prevent dangerous operations:

```bash
aliyun configure ai-mode enable
```
Required local tools:

| Tool | Required | Purpose | Verify |
|---|---|---|---|
| aliyun CLI | Yes | Credential gate, command discovery, and integrated upload/list surface | `aliyun version` and `aliyun configure list` |
| cron or Task Scheduler | Yes | Local recurring execution | `crontab -l` or `schtasks /Query /TN "OSS Scheduled Sync"` |
Use references/cli-installation-guide.md only for CLI installation and plugin setup. For this skill, use the integrated `aliyun ossutil` command surface; do not require standalone `ossutil` installation or bare `ossutil` commands.
## Environment Variables

No extra cloud-specific environment variables are required beyond an already configured Alibaba Cloud profile.

Optional local variables used in examples:

| Variable | Required/Optional | Description | Default Value |
|---|---|---|---|
| `ALIBABA_CLOUD_PROFILE` | Optional | Select a preconfigured Alibaba Cloud CLI profile | CLI current profile |
| `ALIYUN_BIN` | Optional | Absolute path to `aliyun` if it is not already in `PATH` | `aliyun` |
| `OSS_SYNC_LOG` | Optional | Log file path for scheduled execution | OS-specific local path |
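Cron jobs run with a minimal environment, so these variables can be exported in the job's environment before the script runs. A minimal sketch; the install and log locations below are illustrative, not required paths:

```shell
# Illustrative: pin the CLI binary and log path for cron's minimal environment.
# Both paths are hypothetical examples; use your actual install locations.
export ALIYUN_BIN=/usr/local/bin/aliyun
export OSS_SYNC_LOG=/var/log/oss-sync.log
echo "$ALIYUN_BIN"
```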
## Parameter Confirmation

Parameter Extraction — Extract all user-customizable parameters directly from the user's request. When the user's message already specifies values (such as region, bucket name, paths, schedule, or MaxAge), use those values directly without asking for re-confirmation. Only ask the user for clarification when a required parameter is genuinely missing from their request and cannot be reasonably inferred from context.
| Parameter Name | Required/Optional | Description | Validation Pattern | Default Value |
|---|---|---|---|---|
| RegionId | Required | OSS region such as `cn-hangzhou` | `^[a-z]{2}-[a-z]+(-[0-9]+)?$` | None |
| BucketName | Required | Target OSS bucket name | `^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$` | None |
| TargetOssPrefix | Required | Bucket-relative target OSS prefix such as `backup/` (confirm without a leading `/`) | `^[A-Za-z0-9/_.-]*$` (no leading `/`) | None |
| LocalSourcePath | Required | Local folder to upload | Absolute path; no `;`, `\|`, backtick, or `$` | None |
| ScheduleExpression | Required | Cron expression or Windows schedule time/frequency | Standard 5-field cron or `HH:MM` time | None |
| MaxAge | Required | Incremental-upload window such as `7d` or `24h` | `^[0-9]+[dhm]$` | None |
| OsType | Required | `linux`, `macos`, or `windows` | `^(linux\|macos\|windows)$` | None |
| BucketExists | Required | Whether the target bucket already exists | `^(yes\|no)$` | None |
| ALIYUN_BIN | Optional | Absolute path to `aliyun` for scheduler use | Absolute path; no `;`, backtick, or `$` | `aliyun` |
| LogFilePath | Optional | Local log path for the scheduled job | Absolute path; no `;`, backtick, or `$` | OS-specific local path |
Input Validation — All parameters must be validated before use.

Treat all inputs (including values extracted from user messages) as untrusted. Before substituting any parameter into a shell command:

- Validate the value against the Validation Pattern column above. Reject values that do not match.
- `BucketName` must contain only lowercase letters, digits, and hyphens (`[a-z0-9-]`), be 3–63 characters, and must not start or end with a hyphen.
- `RegionId` must match the Alibaba Cloud region format (e.g., `cn-hangzhou`, `cn-beijing`, `us-east-1`).
- `MaxAge` must be a positive integer followed by `d` (days), `h` (hours), or `m` (minutes).
- `LocalSourcePath`, `ALIYUN_BIN`, and `LogFilePath` must be absolute paths and must not contain shell metacharacters (`;`, `|`, `&`, `$`, backtick, `(`, `)`, `<`, `>`).
- `TargetOssPrefix` must contain only alphanumeric characters, `/`, `_`, `.`, and `-`, and must not start with `/`.
- If any parameter fails validation, stop and report the error to the user. Do not attempt to sanitize or escape invalid values — reject them outright.
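The validation rules above can be expressed as small shell predicates that fail fast before any value reaches a command line. A minimal sketch using the patterns from the table (the sample values are illustrative):

```shell
# Minimal validation helpers mirroring the patterns in the parameter table.
valid_bucket()  { [[ "$1" =~ ^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$ ]]; }
valid_region()  { [[ "$1" =~ ^[a-z]{2}-[a-z]+(-[0-9]+)?$ ]]; }
valid_max_age() { [[ "$1" =~ ^[0-9]+[dhm]$ ]]; }

valid_bucket "my-backup-bucket" && echo "bucket ok"      # passes
valid_region "cn-hangzhou"      && echo "region ok"      # passes
valid_max_age "7d"              && echo "maxage ok"      # passes
valid_bucket "Bad_Bucket"       || echo "bucket rejected" # uppercase/underscore: rejected
```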
## Authentication

Pre-check: Alibaba Cloud credentials required.

Security Rules:

- NEVER read, echo, or print AK/SK values (e.g., `echo $ALIBABA_CLOUD_ACCESS_KEY_ID` is FORBIDDEN)
- NEVER read or `cat` credential files such as `~/.aliyun/config.json`, `~/.alibabacloud/credentials`, or any file that may contain secrets
- NEVER ask the user to input AK/SK directly in the conversation or command line
- NEVER use `aliyun configure set` with literal credential values
- ONLY use `aliyun configure list` to check credential status

Check the output for a valid profile (AK, STS, or OAuth identity). If no valid profile exists, STOP here and ask the user to:

- Obtain credentials from the Alibaba Cloud Console
- Configure credentials outside of this session (via `aliyun configure` in a terminal, or environment variables in a shell profile)
- Return and re-run after `aliyun configure list` shows a valid profile

Reuse the active CLI profile for all `aliyun` commands. Do not print or hardcode secrets. Do not replace this gate with interactive secret entry or any other secret-entry flow.
## RAM Policy

The default workflow needs least-privilege access for bucket discovery, bucket metadata verification, and object upload under the confirmed prefix.

| Scope | Required Actions |
|---|---|
| Account-level verification | `oss:ListBuckets` |
| Target bucket verification | `oss:GetBucketInfo` |
| Incremental upload to target prefix | `oss:PutObject`, `oss:GetObject`, `oss:ListObjects` |
| Optional test cleanup | `oss:DeleteObject` |

Use references/ram-policies.md for the policy JSON and the prefix-scoped resource examples.

Do not redefine the default minimum set around multipart-cleanup actions. In this skill, the default least-privilege path stays anchored on bucket discovery, bucket metadata verification, and prefix-scoped upload/list/read actions. Multipart-cleanup permissions are follow-up work only when the user explicitly asks for them.
## Core Workflow

Execute, don't just document. Run each step's commands directly in the environment. Do not only write solution documents or scripts — actually execute `aliyun ossutil stat`, `aliyun ossutil cp`, `aliyun ossutil ls`, etc. against the live environment.
Extract `RegionId`, `BucketName`, `TargetOssPrefix`, `LocalSourcePath`, `ScheduleExpression`, `MaxAge`, `OsType`, and `BucketExists` from the user's request. Only ask the user if a required parameter is genuinely missing.
### Step 1: Verify CLI and credentials

```bash
aliyun version
aliyun configure list
aliyun configure ai-mode enable
```

Verify that:

- the CLI version is >= 3.3.3
- at least one valid profile is present
- AI safety mode is enabled (dangerous operations will be blocked)

If the version is too low or `aliyun` is missing, see references/cli-installation-guide.md. Do not work around a missing CLI by switching to standalone `ossutil` or `ossutil64` binaries.
### Step 2: Verify or create the bucket prerequisite

Always start by checking the candidate bucket inventory:

```bash
aliyun ossutil api list-buckets --output-format json \
  --read-timeout 60 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```

If `BucketExists=yes`, verify the selected bucket explicitly:

```bash
aliyun ossutil stat "oss://${BucketName}" --region "${RegionId}" --output-format json \
  --read-timeout 60 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```
Cross-region note: When the active CLI profile's region (shown by `aliyun configure list`) differs from the target bucket's `RegionId`, you must add `--region "${RegionId}"` to `stat`, `cp`, and `ls` commands. Using `--endpoint` alone is insufficient because the request signing region must also match. The `--region` flag overrides both the endpoint and the signing region in a single step.
What to confirm:

- the bucket name is present in the account inventory
- the bucket region matches `RegionId`
- the bucket is reachable with the active profile
- if multiple existing buckets can satisfy the same backup target, you can remind the user that a bucket with versioning enabled is preferable for backup safety, but this is only a recommendation and does not block using the confirmed existing bucket
If `BucketExists=no`, use the check-then-act idempotent pattern:

- First run `list-buckets` (above) to confirm the bucket truly does not exist in the account — if it already exists, skip creation and go directly to verification.
- Only if the bucket is confirmed absent, create it by following the existing creation flow of this skill.
- After creation, immediately re-run `stat` to verify:

```bash
aliyun ossutil stat "oss://${BucketName}" --region "${RegionId}" --output-format json \
  --read-timeout 60 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```
Optional recommendation for recurring backup scenarios:
- if multiple candidate buckets exist and one already has versioning enabled, mention that it is preferable for backup rollback safety
- if the confirmed existing bucket does not have versioning enabled, it can still be used for this workflow; enabling versioning is only an optional hardening suggestion, not a prerequisite
Keep `aliyun ossutil` as the canonical surface for upload and verification commands such as `cp`, `ls`, and `stat`. For bucket creation, follow the existing creation flow already documented by this skill instead of inventing a new command family here. Do not fabricate success, extra deployment files, or fake local artifacts just to cover a missing prerequisite.
### Step 3: Run the canonical incremental upload test [aliyun CLI / integrated ossutil]

Use the official data-plane command family for the actual scheduled upload job through `aliyun ossutil cp`:

```bash
aliyun ossutil cp "${LocalSourcePath}" "oss://${BucketName}/${TargetOssPrefix}" \
  -r -u \
  --max-age "${MaxAge}" \
  --region "${RegionId}" \
  --read-timeout 300 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```
Key rules for this command:

- `-u` is mandatory: uploads only when the target object is missing or the source file is newer than the existing OSS object
- `-r -u` must stay together as the canonical flag set
- `--region "${RegionId}"` ensures both endpoint and signing region are correct
- `--read-timeout 300 --connect-timeout 30` prevents the command from hanging indefinitely; adjust upward for very large files if needed
- Add `-f` only for unattended runs (cron, Task Scheduler, CI)
- Use absolute paths for `LocalSourcePath` (never `~`)
- Normalize `TargetOssPrefix` without a leading `/`
- Do not substitute `-u` with bare `cp`, full re-uploads, or metadata rewrites
If `TargetOssPrefix` is empty, use `oss://${BucketName}/` (with trailing slash). Otherwise use `oss://${BucketName}/${TargetOssPrefix}` after prefix normalization.

If `LocalSourcePath` does not exist in the current environment (e.g., container or CI runner), create it under the current working directory with a small test file, then run the upload command against it and verify with `aliyun ossutil ls`. This proves the upload path works end-to-end. Do not skip the upload test just because the directory is absent — create it and validate connectivity, permissions, and command correctness:
```bash
mkdir -p "${LocalSourcePath}" && echo "test" > "${LocalSourcePath}/test.txt"
aliyun ossutil cp "${LocalSourcePath}" "oss://${BucketName}/${TargetOssPrefix}" \
  -r -u --max-age "${MaxAge}" --region "${RegionId}" \
  --read-timeout 300 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
aliyun ossutil ls "oss://${BucketName}/${TargetOssPrefix}" --region "${RegionId}" \
  --read-timeout 60 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```
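The empty-prefix and leading-slash rules above can be sketched as a small normalization step. The sample bucket and prefix values are placeholders:

```shell
# Sketch: normalize the prefix and build the target OSS URI per the rules above.
BucketName="my-backup-bucket"     # placeholder value
TargetOssPrefix="/backup/daily"   # placeholder value (note the leading slash)

prefix="${TargetOssPrefix#/}"     # strip a single leading slash if present
if [ -z "$prefix" ]; then
  target="oss://${BucketName}/"   # empty prefix: keep the trailing slash
else
  target="oss://${BucketName}/${prefix}"
fi
echo "$target"                    # prints: oss://my-backup-bucket/backup/daily
```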
### Step 4: Wrap the upload in a local script

Minimal script template:

```bash
#!/usr/bin/env bash
set -euo pipefail

ALIYUN_BIN="${ALIYUN_BIN:-aliyun}"
LOCAL_SOURCE_PATH="${LocalSourcePath}"   # MUST be an absolute path, never use ~
BUCKET_NAME="${BucketName}"
TARGET_OSS_PREFIX="${TargetOssPrefix#/}"
MAX_AGE="${MaxAge}"
REGION_ID="${RegionId}"
LOG_FILE="${OSS_SYNC_LOG:-$HOME/oss-sync.log}"
READ_TIMEOUT="${READ_TIMEOUT:-600}"
CONNECT_TIMEOUT="${CONNECT_TIMEOUT:-30}"

# --- Input validation ---
[[ "${BUCKET_NAME}" =~ ^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$ ]] || { echo "ERROR: Invalid BucketName: ${BUCKET_NAME}" >&2; exit 1; }
[[ "${REGION_ID}" =~ ^[a-z]{2}-[a-z]+(|-[0-9]+)$ ]] || { echo "ERROR: Invalid RegionId: ${REGION_ID}" >&2; exit 1; }
[[ "${MAX_AGE}" =~ ^[0-9]+[dhm]$ ]] || { echo "ERROR: Invalid MaxAge: ${MAX_AGE}" >&2; exit 1; }
[[ "${TARGET_OSS_PREFIX}" =~ ^[A-Za-z0-9/_.-]*$ ]] || { echo "ERROR: Invalid TargetOssPrefix: ${TARGET_OSS_PREFIX}" >&2; exit 1; }
[[ "${LOCAL_SOURCE_PATH}" == /* ]] || { echo "ERROR: LocalSourcePath must be absolute: ${LOCAL_SOURCE_PATH}" >&2; exit 1; }

TARGET_URI="oss://${BUCKET_NAME}/"
if [ -n "${TARGET_OSS_PREFIX}" ]; then
  TARGET_URI="oss://${BUCKET_NAME}/${TARGET_OSS_PREFIX}"
fi

"${ALIYUN_BIN}" ossutil cp "${LOCAL_SOURCE_PATH}" "${TARGET_URI}" \
  -r -u -f \
  --max-age "${MAX_AGE}" \
  --region "${REGION_ID}" \
  --read-timeout "${READ_TIMEOUT}" --connect-timeout "${CONNECT_TIMEOUT}" \
  --user-agent AlibabaCloud-Agent-Skills >> "${LOG_FILE}" 2>&1
```
Note: The `-f` flag is included in the script template because the script is intended for unattended cron/Task Scheduler execution where interactive prompts must not block the job. The `--region` flag is preferred over `--endpoint` because it sets both the endpoint and signing region correctly, which is required when the CLI profile's default region differs from the target bucket's region.
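Before handing the script to the scheduler, run it once manually and confirm it executes. A self-contained smoke-test sketch; the stub script and temp-dir paths below are illustrative stand-ins for the real template:

```shell
# Illustrative smoke test: install a stub of the sync script into a temp dir,
# mark it executable, and run it once manually before scheduling the real one.
tmp=$(mktemp -d)
printf '#!/usr/bin/env bash\necho sync-ok\n' > "$tmp/oss-sync-upload.sh"
chmod 0755 "$tmp/oss-sync-upload.sh"
"$tmp/oss-sync-upload.sh"    # prints: sync-ok
```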
### Step 5: Configure the scheduler

Linux/macOS cron:

For the default Linux/macOS path in this skill, keep `cron`/`crontab` as the documented scheduler surface. Do not silently swap the answer to `launchd` unless the user explicitly asks for a launchd-specific variant.
If `crontab` is not found: In container or minimal environments, `cron` may not be pre-installed. Install the cron package first:

- CentOS/Alibaba Cloud Linux/RHEL: `yum install -y cronie`
- Debian/Ubuntu: `apt-get install -y cron`

If `systemctl start crond` fails (e.g., no systemd in containers), you can still add cron entries via `crontab` — the cron daemon is not strictly required for entry registration, only for actual execution. In such cases, document the cron entry for the user to deploy on their production host, and do not let the missing daemon block the rest of the workflow.
Example entry (use `crontab -` piping rather than interactive `crontab -e` for non-interactive installation):

```cron
0 3 * * * /usr/local/bin/oss-sync-upload.sh >> /var/log/oss-sync-cron.log 2>&1
```
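The non-interactive registration can be made idempotent by rebuilding the crontab payload rather than appending blindly. In this sketch the `existing` variable stands in for `crontab -l` output; on the real host, pipe `$payload` into `crontab -` to install it:

```shell
# Sketch: build an idempotent crontab payload (no crontab binary needed here).
ENTRY='0 3 * * * /usr/local/bin/oss-sync-upload.sh >> /var/log/oss-sync-cron.log 2>&1'
existing='30 2 * * * /usr/local/bin/other-job.sh'   # stand-in for `crontab -l`

# Drop any prior copy of our entry, then append the fresh one exactly once.
payload=$( { printf '%s\n' "$existing" | grep -vF 'oss-sync-upload.sh'; printf '%s\n' "$ENTRY"; } )
printf '%s\n' "$payload"
# On the production host: printf '%s\n' "$payload" | crontab -
```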
Windows Task Scheduler via local CLI:

```bat
schtasks /Create /SC DAILY /ST 03:00 /TN "OSS Scheduled Sync" /TR "C:\tools\oss-sync-upload.bat"
```
Label this step clearly as OS-local. It is not an Alibaba Cloud API action. Keep the scheduler output minimal and directly actionable; do not explode this step into extra README files, XML exports, PowerShell wrappers, demo payloads, or other auxiliary artifacts unless the user explicitly asks for them.
### Step 6: Verify the upload target [aliyun CLI / integrated ossutil]

Always run this verification after any upload (including test uploads from Step 3):

```bash
aliyun ossutil ls "oss://${BucketName}/${TargetOssPrefix}" --region "${RegionId}" \
  --read-timeout 60 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```
Confirm that the expected objects appear under the target prefix. Do not skip this step — it proves end-to-end connectivity and permissions.
If the user wants a manual visual check, label it clearly as a manual step and confirm the target prefix in the OSS Console.
### Step 7: State the capability boundary clearly
Always state these limitations when relevant:
- The actual incremental sync step runs through `aliyun ossutil cp`. This skill stays on the `aliyun` CLI surface and does not require a separate standalone `ossutil` installation.
- Scheduler setup is OS-local. Cron and Task Scheduler are configured on the host OS, not through Alibaba Cloud APIs.
- RAM policy attachment is typically manual or follows the user's existing IAM workflow.
- Bucket creation should happen before scheduled upload when the target bucket is missing. Follow the existing creation flow of this skill for that prerequisite.
- If multiple equivalent existing buckets are available, it is fine to remind the user that a versioning-enabled bucket is preferable for backup safety. If no versioned bucket is available, continue with the confirmed existing bucket instead of blocking the workflow.
- Optional OSS Console checks are manual.
- Do not simulate success. When a prerequisite is missing, say so plainly instead of creating fake local test data, pretend execution logs, or extra packaging artifacts.
## Success Verification Method

Use references/verification-method.md as the authoritative checklist.

Minimum pass conditions:

- `aliyun configure list` shows a valid profile.
- `aliyun ossutil stat "oss://${BucketName}"` succeeds.
- the canonical `aliyun ossutil cp ... -r -u --max-age ... --region ...` command completes without permission or endpoint errors.
- `aliyun ossutil ls ... --region ...` shows the expected uploaded objects under the confirmed prefix.
- the upload command keeps `-u`, meaning it uploads only when the target object is missing or the local source file is newer than the existing OSS object.
- the local scheduler entry is visible through `crontab -l` or Task Scheduler history/query, or is documented for the user when crontab is not available in the current environment.
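Beyond the checklist, the scheduled job's log gives a quick pass/fail signal between runs. A minimal sketch; here a temp file stands in for the real log path so the snippet is self-contained:

```shell
# Sketch: scan the sync log for errors (a temp file stands in for the real log).
LOG_FILE=$(mktemp)          # real jobs append to $OSS_SYNC_LOG instead
if grep -qi 'error' "$LOG_FILE"; then
  result="check log"
else
  result="no errors logged"
fi
echo "$result"              # prints: no errors logged
```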
## Cleanup

Cleanup is optional because this skill is intended for recurring sync, but test artifacts and scheduler entries can be removed safely.

- remove the cron line with `crontab -e` (or by piping an edited list into `crontab -`)
- delete the local script and log file only if the user explicitly wants rollback

Windows:

```bat
schtasks /Delete /TN "OSS Scheduled Sync" /F
```

Optional OSS test cleanup [aliyun CLI / integrated ossutil]:

```bash
aliyun ossutil rm "oss://${BucketName}/${TargetOssPrefix}test-object.txt" --region "${RegionId}" \
  --read-timeout 60 --connect-timeout 30 \
  --user-agent AlibabaCloud-Agent-Skills
```
Do not delete the bucket or production objects unless the user explicitly asks for that cleanup scope.
After all tasks are completed, disable AI safety mode to restore normal CLI behavior:

```bash
aliyun configure ai-mode disable
```
## API and Command Tables

See references/related-apis.md for the command inventory, OSS capability notes, and validation notes. That file is reference metadata only.
## Best Practices

- Keep the `aliyun` CLI for pre-checks, command discovery, and bucket verification, and integrated `aliyun ossutil cp` for the actual scheduled upload.
- Use `--region "${RegionId}"` on all commands (`stat`, `cp`, `ls`, `rm`) to ensure both endpoint and signing region are correct. This is especially important when the CLI profile's default region differs from the target bucket's region. Do not rely on `--endpoint` alone, as it does not override the signing region and will fail with "Invalid signing region in Authorization header" errors when using STS tokens across regions.
- Keep scheduler steps labeled as OS-local so the user understands they are outside Alibaba Cloud APIs.
- Use the narrowest RAM policy possible: bucket inventory at account scope, bucket info on the target bucket, and object upload only on the confirmed prefix.
- Run the upload test and the scheduler verification on the target machine before live execution.
- Never print AK/SK values, never hardcode them in scripts, never read credential files like `~/.aliyun/config.json`, and never replace the credential gate with inline secret handling.
- If the bucket does not exist, create it first before configuring scheduled upload. If multiple existing buckets can satisfy the same backup target, you may remind the user that a versioning-enabled bucket is preferable for backup safety, but if no such bucket exists, continue with the confirmed existing bucket.
- Always use absolute paths for `LocalSourcePath` in commands and scripts. Do not use `~` (tilde) because it may not expand inside quoted strings, causing "not a directory" errors.
- In generated scripts intended for cron or Task Scheduler, include the `-f` flag to prevent interactive confirmation prompts from blocking unattended execution.
## Reference Links

| Reference | Description |
|---|---|
| references/cli-installation-guide.md | Required CLI installation guide copied from the creator skill asset |
| references/verification-method.md | Pre-check, upload, scheduler, and manual verification checklist |
| references/related-apis.md | `aliyun` and integrated `ossutil` command inventory with OSS API mapping |
| references/ram-policies.md | Least-privilege RAM policy guidance for verification and upload |
| references/acceptance-criteria.md | Correct and incorrect command patterns for this scenario |