# Alva

## What is Alva

Alva is an agentic finance platform. It provides unified access to 250+
financial data sources spanning crypto, equities, ETFs, macroeconomic
indicators, on-chain analytics, and social sentiment -- including spot and
futures OHLCV, funding rates, company fundamentals, price targets, insider and
senator trades, earnings estimates, CPI, GDP, Treasury rates, exchange flows,
DeFi metrics, news feeds, social media, and more.
## What the Alva Skill Enables

The Alva skill connects any AI agent or IDE to the full Alva platform. With
it you can:
- Access financial data -- query any of Alva's 250+ data SDKs
programmatically, or bring your own data via HTTP API or direct upload.
- Run cloud-side analytics -- write JavaScript that executes on Alva Cloud
in a secure runtime. No local compute, no dependencies, no infrastructure to
manage.
- Build agentic playbooks -- create data pipelines, trading strategies, and
scheduled automations that run continuously on Alva Cloud.
- Deploy trading strategies -- backtest with the Altra trading engine and
run continuous live paper trading.
- Release and share -- turn your work into a hosted playbook web app at
`https://yourusername.playbook.alva.ai/playbook-name/version/index.html` and
share it with the world.
In short: turn your ideas into a forever-running finance agent that gets things
done for you.
## Capabilities & Common Workflows

### 1. ALFS (Alva FileSystem)

The foundation of the platform. ALFS is a globally shared filesystem with
built-in authorization. Every user has a home directory; permissions control who
can read and write each path. Scripts, data feeds, playbook assets, and shared
libraries all live on ALFS.

Key operations: read, write, mkdir, stat, readdir, remove, rename, copy,
symlink, chmod, grant, revoke.
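A minimal write-then-read round trip, in the HTTP notation used throughout
this skill (`~/data/hello.txt` is a hypothetical example path; the endpoints
are documented in the Quick API Reference):

POST /api/v1/fs/write?path=~/data/hello.txt&mkdir_parents=true
Content-Type: application/octet-stream
Body: hello world

GET /api/v1/fs/read?path=~/data/hello.txt
→ hello world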
### 2. JS Runtime

Run JavaScript on Alva Cloud in a secure V8 isolate. The runtime has access to
ALFS, all 250+ SDKs, HTTP networking, LLM access, and the Feed SDK. Everything
executes server-side -- nothing runs on your local machine.
### 3. SDKHub

250+ built-in financial data SDKs. To find the right SDK for a task, use the
two-step retrieval flow:

- Pick a partition from the index below.
- Call `GET /api/v1/sdk/partitions/:partition/summary` to see module
summaries, then load the full doc for the chosen module via
`GET /api/v1/sdk/doc?name={module_name}`.
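As a concrete sketch of the flow (partition and module names are taken from
the index and examples in this document; response bodies omitted):

GET /api/v1/sdk/partitions/spot_market_price_and_volume/summary
GET /api/v1/sdk/doc?name=@arrays/crypto/ohlcv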
#### SDK Partition Index

| Partition | Description |
|---|---|
| spot_market_price_and_volume | Spot OHLCV for crypto and equities. Price bars, volume, historical candles. |
| | Perpetual futures: OHLCV, funding rates, open interest, long/short ratio. |
| | Crypto technical & on-chain indicators: MA, EMA, RSI, MACD, Bollinger, MVRV, SOPR, NUPL, whale ratio, market cap, FDV, etc. (20 modules) |
| | Exchange inflow/outflow data for crypto assets. |
| | Crypto market fundamentals: circulating supply, max supply, market dominance. |
| | Screen crypto assets by technical metrics over custom time ranges. |
| | Public companies' crypto token holdings (e.g. MicroStrategy BTC). |
| | Stock fundamentals: income statements, balance sheets, cash flow, margins, PE, PB, ROE, ROA, EPS, market cap, dividend yield, enterprise value, etc. (31 modules) |
| equity_estimates_and_targets | Analyst price targets, consensus estimates, earnings guidance. |
| | Dividend calendar, stock split calendar. |
| equity_ownership_and_flow | Institutional holdings, insider trades, senator trading activity. |
| | Screen stocks by sector, industry, country, exchange, IPO date, earnings date, financial & technical metrics. (9 modules) |
| | Stock technical indicators: beta, volatility, Bollinger, EMA, MA, MACD, RSI-14, VWAP, avg daily dollar volume. |
| | ETF holdings breakdown. |
| | CPI, GDP, unemployment, federal funds rate, Treasury rates, PPI, consumer sentiment, VIX, TIPS, nonfarm payroll, retail sales, recession probability, etc. (20 modules) |
| technical_indicator_calculation_helpers | 50+ pure calculation helpers: RSI, MACD, Bollinger Bands, ATR, VWAP, Ichimoku, Parabolic SAR, KDJ, OBV, etc. Input your own price arrays. |
| | Social & news data feeds: news, Twitter/X, YouTube, Reddit, podcasts, web search (Brave, Grok). |
| | General news and market articles. |
You can also bring your own data by uploading files to ALFS or fetching from
external HTTP APIs within the runtime.
### 4. Altra (Alva Trading Engine)

A feed-based, event-driven backtesting engine for quantitative trading
strategies. A trading strategy IS a feed: all output data (targets, portfolio,
orders, equity, metrics) lives under a single feed's ALFS path. Altra supports
historical backtesting and continuous live paper trading, with custom
indicators, portfolio simulation, and performance analytics.
### 5. Deploy on Alva Cloud

Once your data analytics scripts and feeds are ready, deploy them as scheduled
cronjobs on Alva Cloud. They run continuously on your chosen schedule (e.g.
every hour, every day). Grant public access so anyone -- or any playbook page --
can read the data.
### 6. Build the Playbook Web App

After your data pipelines are deployed and producing data, build the playbook's
web interface. Create HTML5 pages that read from Alva's data gateway and
visualize the results. Follow the Alva Design System for styling, layout, and
component guidelines.
### 7. Release

Three phases:

- Write HTML to ALFS: write the playbook HTML to
`~/playbooks/{name}/index.html`.
- Call the release API: `POST /api/v1/release/playbook` creates DB records
and uploads the HTML to the CDN. Returns a numeric `playbook_id`.
- Write ALFS files: using the returned numeric `playbook_id`, write the
release files, draft files, and `playbook.json` to ALFS. See
api-reference.md for details.

The `playbook.json` must include a `type` field (`"dashboard"` or
`"strategy"`) and a `draft` object. Omitting `type` causes wrong frontend
routing; omitting `draft` causes the dashboard iframe to never load.

Once released, the playbook is accessible at
`https://yourusername.playbook.alva.ai/playbook-name/version/index.html` --
ready to share with the world.
Detailed sub-documents (read these for in-depth reference):

| Document | Contents |
|---|---|
| api-reference.md | Full REST API reference (filesystem, run, deploy, user info, time series paths) |
| jagent-runtime.md | Writing jagent scripts: module system, built-in modules, async model, constraints |
| feed-sdk.md | Feed SDK guide: creating data feeds, time series, upstreams, state management |
| altra-trading.md | Altra backtesting engine: strategies, features, signals, testing, debugging |
| deployment.md | Deploying scripts as cronjobs for scheduled execution |
| design-system.md | Alva Design System: design tokens, colors, typography, font rules |
| design-widgets.md | Widget design: chart cards, KPI cards, table cards, feed cards, layout grid |
| design-components.md | Base component templates: dropdown, button, switch, modal, select, markdown |
| design-playbook-trading-strategy.md | Trading strategy playbook guideline |
| adk.md | Agent Development Kit: API, tool calling, ReAct loop, examples |
## Setup

All configuration is done via environment variables.

| Variable | Required | Description |
|---|---|---|
| ALVA_API_KEY | yes | Your API key (create and manage at alva.ai) |
| ALVA_ENDPOINT | no | Alva API base URL. Defaults to https://api-llm.prd.alva.ai if not set |
## Making API Requests

All API examples in this skill use HTTP notation (method and path, as in
`GET /api/v1/me`). Every request requires the `X-Alva-Api-Key` header unless
marked public (no auth).

Curl templates for reference:

```bash
# Authenticated
curl -s -H "X-Alva-Api-Key: $ALVA_API_KEY" "$ALVA_ENDPOINT{path}"

# Authenticated + JSON body
curl -s -H "X-Alva-Api-Key: $ALVA_API_KEY" -H "Content-Type: application/json" \
  "$ALVA_ENDPOINT{path}" -d '{body}'

# Public read (no API key, absolute path)
curl -s "$ALVA_ENDPOINT{path}"
```
### Discovering User Info

GET /api/v1/me
→ {"id":1,"username":"alice"}
## Quick API Reference

See api-reference.md for full details.

### Filesystem (/api/v1/fs/*)

| Method | Endpoint | Description |
|---|---|---|
| GET | /api/v1/fs/read?path={path} | Read file content (raw bytes) or time series data |
| POST | /api/v1/fs/write | Write file (raw body or JSON with a data field) |
| GET | /api/v1/fs/stat?path={path} | Get file/directory metadata |
| GET | /api/v1/fs/readdir?path={path} | List directory entries |
| POST | | Create directory (recursive) |
| DELETE | /api/v1/fs/remove?path={path} | Remove file or directory |
| POST | | Rename / move |
| POST | | Copy file |
| POST | | Create symlink |
| GET | /api/v1/fs/readlink?path={path} | Read symlink target |
| POST | | Change permissions |
| POST | | Grant read/write access to a path |
| POST | | Revoke access |

Paths: `~/...` (home-relative) or `/alva/home/<username>/...` (absolute).
Public reads use absolute paths without an API key.
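For example, an unauthenticated read of the feed output path used in the Feed
SDK example later in this document (`alice` is a hypothetical username):

GET /api/v1/fs/read?path=/alva/home/alice/feeds/btc-ema/v1/data/metrics/prices/@last/100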
### Run (/api/v1/run)

| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/run | Execute JavaScript (inline code or a script on the filesystem) |
### Deploy (/api/v1/deploy/*)

| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/deploy/cronjob | Create a cronjob |
| GET | | List cronjobs (paginated) |
| GET | /api/v1/deploy/cronjob/:id | Get cronjob details |
| PATCH | /api/v1/deploy/cronjob/:id | Update cronjob (name, cron, args) |
| DELETE | /api/v1/deploy/cronjob/:id | Delete cronjob |
| POST | /api/v1/deploy/cronjob/:id/pause | Pause cronjob |
| POST | /api/v1/deploy/cronjob/:id/resume | Resume cronjob |
### Release (/api/v1/release/*)

| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/release/feed | Register feed (DB + link to cronjob task). Call after deploying the cronjob. |
| POST | /api/v1/release/playbook | Release playbook for public hosting. Call after writing the playbook HTML. |

Name uniqueness: the `name` in both releaseFeed and releasePlaybook must be
unique within your user space. Use `GET /api/v1/fs/readdir?path=~/feeds` or
`GET /api/v1/fs/readdir?path=~/playbooks` to check existing names before
releasing.
### SDK Documentation (/api/v1/sdk/*)

| Method | Endpoint | Description |
|---|---|---|
| GET | /api/v1/sdk/doc?name={module_name} | Get full doc for a specific SDK module |
| GET | | List all SDK partitions |
| GET | /api/v1/sdk/partitions/:partition/summary | Get one-line summaries of all modules in a partition |

SDK retrieval flow: pick a partition from the index above → call
/api/v1/sdk/partitions/:partition/summary to see module summaries → call
/api/v1/sdk/doc?name={module_name} to load the full doc for the chosen module.
### Trading Pair Search (/api/v1/trading-pairs/search)

| Method | Endpoint | Description |
|---|---|---|
| GET | /api/v1/trading-pairs/search?q={q} | Search trading pairs by base asset (fuzzy match) |

Search before writing code to check which symbols/exchanges Alva supports.
Supports exact match plus prefix fuzzy search by base asset or alias. Use
comma-separated queries for multiple searches.

GET /api/v1/trading-pairs/search?q=BTC,ETH
→ {"trading_pairs":[{"base":"BTC","quote":"USDT","symbol":"BINANCE_PERP_BTC_USDT","exchange":"binance","type":"crypto-perp","fee_rate":0.001,...},...]}
### User Info (/api/v1/me)

| Method | Endpoint | Description |
|---|---|---|
| GET | /api/v1/me | Get authenticated user's id and username |
## Runtime Modules Quick Reference

Scripts executed via POST /api/v1/run run in a V8 isolate. See
jagent-runtime.md for full details.

| Module | require() | Description |
|---|---|---|
| alfs | | Filesystem (uses absolute paths /alva/home/<username>/...) |
| env | | Values passed in from the request |
| net/http | | Async HTTP requests |
| @alva/algorithm | require("@alva/algorithm") | Statistics and indicator helpers (e.g. indicators.ema) |
| @alva/feed | require("@alva/feed") | Feed SDK for persistent data pipelines + FeedAltra trading engine |
| @alva/adk | require("@alva/adk") | Agent SDK for LLM requests -- for LLM agents with tool calling |
| @test/suite | require("@test/suite") | Jest-style test framework |

SDKHub: 250+ data modules are available via
require("@arrays/crypto/ohlcv:v1.0.0") etc. The version suffix is optional (a
default version is used if omitted). To discover function signatures and
response shapes, use the SDK doc API (GET /api/v1/sdk/doc?name=...).

Key constraints: no top-level `await` (wrap the script in an async IIFE). No
Node.js builtins. Module exports are frozen.
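A minimal script skeleton under these constraints (pure-JS sketch; `load`
stands in for any async SDK or HTTP call):

```javascript
// The runtime rejects top-level await, so async work is wrapped in an IIFE.
function compute(x) {
  return x * 2; // plain synchronous logic
}

async function load() {
  return 21; // placeholder for an async SDK/HTTP call (hypothetical)
}

(async () => {
  const value = compute(await load());
  console.log(value);
})();
```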
## Feed SDK Quick Reference

See feed-sdk.md for full details.

Feeds are persistent data pipelines that store time series data, readable via
filesystem paths.
```javascript
const { Feed, feedPath, makeDoc, num } = require("@alva/feed");
const { getCryptoKline } = require("@arrays/crypto/ohlcv:v1.0.0");
const { indicators } = require("@alva/algorithm");

const feed = new Feed({ path: feedPath("btc-ema") });
feed.def("metrics", {
  prices: makeDoc("BTC Prices", "Close + EMA10", [num("close"), num("ema10")]),
});

(async () => {
  await feed.run(async (ctx) => {
    const raw = await ctx.kv.load("lastDate");
    const lastDateMs = raw ? Number(raw) : 0;
    const now = Math.floor(Date.now() / 1000);
    const start =
      lastDateMs > 0 ? Math.floor(lastDateMs / 1000) : now - 30 * 86400;
    const bars = getCryptoKline({
      symbol: "BTCUSDT",
      start_time: start,
      end_time: now,
      interval: "1h",
    })
      .response.data.slice()
      .reverse();
    const closes = bars.map((b) => b.close);
    const ema10 = indicators.ema(closes, { period: 10 });
    const records = bars
      .map((b, i) => ({
        date: b.date,
        close: b.close,
        ema10: ema10[i] || null,
      }))
      .filter((r) => r.date > lastDateMs);
    if (records.length > 0) {
      await ctx.self.ts("metrics", "prices").append(records);
      await ctx.kv.put("lastDate", String(records[records.length - 1].date));
    }
  });
})();
```
Feed output is readable at:
~/feeds/btc-ema/v1/data/metrics/prices/@last/100
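Reading it back through the filesystem API returns flat JSON records (sketch;
field values elided):

GET /api/v1/fs/read?path=~/feeds/btc-ema/v1/data/metrics/prices/@last/100
→ [{"date":...,"close":...,"ema10":...}]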
### Data Modeling Patterns

All data produced by a feed should be stored as time series via
ts(...).append(...). Do not write feed output as raw files.

Pattern A -- Snapshot (latest-wins): for data that represents current state
(company detail, ratings, price target consensus). Use start-of-day as the date
so re-runs overwrite.
```javascript
const today = new Date();
today.setHours(0, 0, 0, 0);
await ctx.self
  .ts("info", "company")
  .append([
    { date: today.getTime(), name: company.name, sector: company.sector },
  ]);
```

Read @last/1 for the current snapshot, @last/30 for a 30-day history.
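The start-of-day stamp can be factored into a small helper; a pure-JS sketch
(UTC normalization is an assumption here -- the fragment above normalizes in
local server time):

```javascript
// Truncate a millisecond timestamp to the start of its UTC day.
// A stable day key means re-runs overwrite the same snapshot record.
function startOfDayUTC(ms) {
  const d = new Date(ms);
  return Date.UTC(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate());
}
```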
Pattern B -- Event log: For timestamped events (insider trades, news,
senator trades). Each event uses its natural date. Same-date records are
auto-grouped.
```javascript
const records = trades.map((t) => ({
  date: new Date(t.transactionDate).getTime(),
  name: t.name,
  type: t.type,
  shares: t.shares,
}));
await ctx.self.ts("activity", "insiderTrades").append(records);
```
Pattern C -- Tabular (versioned batch): For data where the whole set
refreshes each run (top holders, EPS estimates). Stamp all records with the same
run timestamp; same-date grouping stores them as a batch.
```javascript
const now = Date.now();
const records = holdings.map((h, i) => ({
  date: now,
  rank: i + 1,
  name: h.name,
  marketValue: h.value,
}));
await ctx.self.ts("research", "institutions").append(records);
```
| Data Type | Pattern | Date Strategy | Read Query |
|---|---|---|---|
| OHLCV, indicators | Time series (standard) | Bar timestamp | |
| Company detail, ratings | Snapshot (A) | Start of day | |
| Insider trades, news | Event log (B) | Event timestamp | |
| Holdings, estimates | Tabular (C) | Run timestamp | |
See feed-sdk.md for detailed data modeling examples
and deduplication behavior.
### Deploying Feeds

Every feed follows a 6-step lifecycle:

- Write -- define the schema + incremental logic with the Feed SDK
- Upload -- write the script to ~/feeds/<name>/v1/src/index.js
- Test -- run it via POST /api/v1/run to verify output
- Grant -- make the feed public via the grant API
- Deploy -- POST /api/v1/deploy/cronjob for scheduled execution
- Release -- POST /api/v1/release/feed to register the feed in the
database (requires the task_id from the deploy step)
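Steps 5 and 6 in HTTP notation (values match the deployment walkthrough later
in this document):

POST /api/v1/deploy/cronjob
{"path":"~/feeds/btc-ema/v1/src/index.js","cron_expression":"0 */4 * * *","name":"BTC EMA Update"}

POST /api/v1/release/feed
{"name":"btc-ema","version":"1.0.0","task_id":42}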
| Data Type | Recommended Schedule | Rationale |
|---|---|---|
| Stock OHLCV + technicals | 0 */4 * * * (every 4h) | Markets update during trading hours |
| Company detail, price targets | 0 8 * * * (daily 8am) | Changes infrequently |
| Insider/senator trades | 0 8 * * * (daily 8am) | SEC filings are daily |
| Earnings estimates | 0 8 * * * (daily 8am) | Updated periodically |
See deployment.md for the full deployment guide and
API reference.
### Debugging Feeds

#### Resetting Feed Data (development only)

During development, use the REST API to clear stale or incorrect data. Do not
use this in production.

# Clear a specific time series output
DELETE /api/v1/fs/remove?path=~/feeds/my-feed/v1/data/market/ohlcv&recursive=true

# Clear an entire group (all outputs under "market")
DELETE /api/v1/fs/remove?path=~/feeds/my-feed/v1/data/market&recursive=true

# Full reset: clear ALL data + KV state (removes the data mount, re-created on next run)
DELETE /api/v1/fs/remove?path=~/feeds/my-feed/v1/data&recursive=true
#### Inline Debug Snippets

Test SDK response shapes before building a full feed:

POST /api/v1/run
{"code":"const { getCryptoKline } = require(\"@arrays/crypto/ohlcv:v1.0.0\"); JSON.stringify(Object.keys(getCryptoKline({ symbol: \"BTCUSDT\", start_time: 0, end_time: 0, interval: \"1h\" })));"}
## Altra Trading Engine Quick Reference

See altra-trading.md for full details.

Altra is a feed-based, event-driven backtesting engine. A trading strategy IS a
feed: all output data lives under a single ALFS path. Decisions execute at bar
CLOSE.
```javascript
const { createOHLCVProvider } = require("@arrays/data/ohlcv-provider:v1.0.0");
const { FeedAltraModule } = require("@alva/feed");
const { FeedAltra, e, Amount } = FeedAltraModule;

const altra = new FeedAltra(
  {
    path: "~/feeds/my-strategy/v1",
    startDate: Date.parse("2025-01-01T00:00:00Z"),
    portfolioOptions: { initialCash: 1_000_000 },
    simOptions: { simTick: "1min", feeRate: 0.001 },
    perfOptions: { timezone: "UTC", marketType: "crypto" },
  },
  createOHLCVProvider(),
);

const dg = altra.getDataGraph();
dg.registerOhlcv("BINANCE_SPOT_BTC_USDT", "1d");
dg.registerFeature({ name: "rsi" /* ... */ });

altra.setStrategy(strategyFn, {
  trigger: { type: "events", expr: e.ohlcv("BINANCE_SPOT_BTC_USDT", "1d") },
  inputConfig: {
    ohlcvs: [{ id: { pair: "BINANCE_SPOT_BTC_USDT", interval: "1d" } }],
    features: [{ id: "rsi" }],
  },
  initialState: {},
});

(async () => {
  await altra.run(Date.now());
})();
```
## Deployment Quick Reference

See deployment.md for full details.

Deploy feed scripts or tasks as cronjobs for scheduled execution:

POST /api/v1/deploy/cronjob
{"path":"~/feeds/btc-ema/v1/src/index.js","cron_expression":"0 */4 * * *","name":"BTC EMA Update"}

Cronjobs execute the script via the same jagent runtime as POST /api/v1/run.
Max 20 cronjobs per user. Min interval: 1 minute.

After deploying a cronjob, register the feed and release the playbook for
public hosting. The playbook HTML must already be written to ALFS at
~/playbooks/{name}/index.html via POST /api/v1/fs/write before releasing.

Important: feed names and playbook names must be unique within your user
space. Before creating a new feed or playbook, use
GET /api/v1/fs/readdir?path=~/feeds or GET /api/v1/fs/readdir?path=~/playbooks
to check for existing names and avoid conflicts.
# 1. Release feed (register in DB, link to cronjob)
POST /api/v1/release/feed
{"name":"btc-ema","version":"1.0.0","task_id":42}
→ {"feed_id":100,"name":"btc-ema","feed_major":1}
# 2. Release playbook (uploads HTML to CDN, returns numeric playbook_id)
POST /api/v1/release/playbook
{"name":"btc-dashboard","version":"v1.0.0","description":"BTC market dashboard with price and technicals","feeds":[{"feed_id":100}]}
→ {"playbook_id":99,"version":"v1.0.0"}
# 3. Write release layout.html (CDN URL, using numeric playbook_id from step 2)
POST /api/v1/fs/write?path=~/playbooks/99/releases/v1.0.0/layout.html&mkdir_parents=true
Content-Type: application/octet-stream
Body: https://alice.playbook.alva.ai/btc-dashboard/v1.0.0/index.html
# 4. Write draft layout.html (required for frontend dashboard iframe rendering)
POST /api/v1/fs/write?path=~/playbooks/99/draft/layout.html&mkdir_parents=true
Content-Type: application/octet-stream
Body: https://alice.playbook.alva.ai/btc-dashboard/v1.0.0/index.html
# 5. Write playbook.json (must include "type" and "draft" fields)
POST /api/v1/fs/write
Content-Type: application/json
{"path":"~/playbooks/99/playbook.json","data":"{\"playbook_id\":99,\"owner_uid\":\"1\",\"type\":\"dashboard\",\"name\":\"btc-dashboard\",\"created_at\":\"2026-03-12T00:00:00Z\",\"updated_at\":\"2026-03-12T00:00:00Z\",\"draft\":{\"playbook_version_id\":0,\"updated_at\":\"2026-03-12T00:00:00Z\",\"layout_path\":\"./draft/layout.html\",\"feeds_dir\":\"./draft/feeds/\",\"feeds\":[{\"feed_id\":100,\"feed_major\":1}]},\"releases\":[{\"version\":\"v1.0.0\",\"playbook_version_id\":0,\"created_at\":\"2026-03-12T00:00:00Z\",\"layout_path\":\"./releases/v1.0.0/layout.html\",\"feeds_dir\":\"./releases/v1.0.0/feeds/\",\"feeds\":[{\"feed_id\":100,\"feed_major\":1}]}],\"latest_release\":{\"version\":\"v1.0.0\",\"playbook_version_id\":0,\"created_at\":\"2026-03-12T00:00:00Z\",\"layout_path\":\"./releases/v1.0.0/layout.html\",\"feeds_dir\":\"./releases/v1.0.0/feeds/\",\"feeds\":[{\"feed_id\":100,\"feed_major\":1}]}}","mkdir_parents":true}
The playbook will be accessible at
https://alice.playbook.alva.ai/btc-dashboard/v1.0.0/index.html.
## Alva Design System

All Alva playbook pages, dashboards, and widgets must follow the Alva Design
System. The system defines design tokens (colors, spacing, shadows), typography
rules, and component/widget templates.

Key rules:

- Font: Delight (Regular 400, Medium 500). No Semibold/Bold. Font files:
Delight-Regular.ttf, Delight-Medium.ttf
- Page background: use the token defined in design-system.md
- Semantic colors: bullish (green), bearish (red), Alva theme (teal); token
names are in design-system.md
- Charts: use ECharts. Select colors from the chart palette in
design-system.md. Grey only when >= 3 series.
- Widgets: no borders on widget cards. Chart cards use a dotted background;
the table card has no background; other cards use the standard card background
from design-system.md.
- Grid: 8-column grid (web), 4-column grid (mobile). Column spans must sum
to 8 per row.
Reference documents (read for detailed specs when building playbook web apps):

| When | Read |
|---|---|
| Design tokens, typography, font rules, general guidelines | design-system.md |
| Widget types, chart/KPI/table/feed cards, grid layout | design-widgets.md |
| Component templates (button, dropdown, modal, select, switch, markdown) | design-components.md |
| Trading strategy playbook layout, sections, and content guidelines | design-playbook-trading-strategy.md |
## Filesystem Layout Convention

| Path | Purpose |
|---|---|
| | Task source code |
| ~/feeds/<name>/v1/src/ | Feed script source code |
| ~/feeds/<name>/v1/data/ | Feed synth mount (auto-created by Feed SDK) |
| ~/playbooks/<name>/ | Playbook web app assets |
| | General data storage |
| | Shared code modules |

Prefer the Feed SDK for all data organization, including point-in-time
snapshots. Store snapshots as single-record time series rather than raw JSON
files. This keeps all data queryable through a single consistent read pattern
(@last/N, etc.).
## Common Pitfalls

- Time series reads return chronological (oldest-first) order. No manual
sorting needed.
- Time series reads return flat JSON records. @last/N paths return JSON arrays
of flat records like [{"date":...,"close":...,"ema10":...}]. Regular paths
return file content with Content-Type: application/octet-stream.
- @last/N limits unique timestamps, not records. When multiple records share a
timestamp (same-date grouping), auto-flatten may return more than N individual
records.
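A toy re-implementation of that read semantic makes the pitfall concrete
(illustrative only -- this is not the server code):

```javascript
// @last/N: keep the last N unique timestamps, then flatten their records.
function lastN(records, n) {
  const byDate = new Map();
  for (const r of records) {
    if (!byDate.has(r.date)) byDate.set(r.date, []);
    byDate.get(r.date).push(r);
  }
  const dates = [...byDate.keys()].sort((a, b) => a - b).slice(-n);
  return dates.flatMap((d) => byDate.get(d));
}

// Four records across two unique timestamps: @last/2 returns all four.
console.log(lastN([{ date: 1 }, { date: 2 }, { date: 2 }, { date: 2 }], 2).length); // 4
```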
- The data segment in feed paths is the synth mount. The Feed SDK auto-creates
and mounts feed storage there (see the filesystem layout above); avoid reusing
it as a group name.
- Public reads require absolute paths. Unauthenticated reads must use
/alva/home/<username>/... (not ~/...). Discover your username via
GET /api/v1/me.
- Top-level await is not supported. Wrap async code in an async IIFE.
- alfs uses absolute paths. Inside the V8 runtime, alfs needs full paths like
/alva/home/<username>/.... Discover your username first (e.g. via
GET /api/v1/me).
- No Node.js builtins. Use alfs for files and net/http for HTTP.
- Altra is async. altra.run() returns a Promise. Always await it:
const result = await altra.run(endDate);
- Altra decisions happen at bar CLOSE. Feature timestamps must use the bar's
close time, not its open time; stamping at the open introduces look-ahead
bias.
- Altra lookback: feature vs strategy. Feature lookback controls how many
bars the feature computation sees. Strategy lookback controls how many feature
outputs the strategy function sees. They are independent.
- Cronjob path must point to an existing script. The deploy API validates
that the entry_path exists via a filesystem stat before creating the cronjob.
- playbook.json must include type and draft; the draft ALFS files are
required. Omitting type defaults to "strategy" (wrong routing for
dashboards). Omitting draft or the draft layout.html file causes the
dashboard iframe to never load.
## Resource Limits

| Resource | Limit |
|---|---|
| Write payload | 10 MB max per request |
| HTTP response body | 128 MB max |
| Max cronjobs per user | 20 |
| Min cron interval | 1 minute |
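A client-side guard for the write-payload limit (sketch; the constant mirrors
the table above, and the helper name is mine):

```javascript
// Reject payloads over the documented 10 MB per-request write limit
// before sending, instead of waiting for the server to do it.
const MAX_WRITE_BYTES = 10 * 1024 * 1024;

function canWrite(payload) {
  // Byte length, not string length: multi-byte characters count fully.
  return new TextEncoder().encode(payload).length <= MAX_WRITE_BYTES;
}
```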
## Error Responses

All errors return:

{"error":{"code":"...","message":"..."}}

| HTTP Status | Code | Meaning |
|---|---|---|
| 400 | INVALID_ARGUMENT | Bad request or invalid path |
| 401 | UNAUTHENTICATED | Missing or invalid API key |
| 403 | PERMISSION_DENIED | Access denied |
| 404 | NOT_FOUND | File/directory not found |
| 429 | RATE_LIMITED | Rate limit / runner pool exhausted |
| 500 | INTERNAL | Server error |
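A minimal client-side helper for this envelope (sketch; the helper name is
hypothetical):

```javascript
// Extract "CODE: message" from the documented error envelope, or
// return null when the body is not an error response.
function errorMessage(body) {
  const parsed = JSON.parse(body);
  return parsed.error ? `${parsed.error.code}: ${parsed.error.message}` : null;
}

console.log(errorMessage('{"error":{"code":"NOT_FOUND","message":"File/directory not found"}}'));
// → NOT_FOUND: File/directory not found
```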