Arize AI Integration Skill
Concepts
- AI Integration = stored LLM provider credentials registered in Arize; used by evaluators to call a judge model and by other Arize features that need to invoke an LLM on your behalf
- Provider = the LLM service backing the integration (e.g., OpenAI, Anthropic, AWS Bedrock)
- Integration ID = a base64-encoded global identifier for an integration (e.g., `TGxtSW50ZWdyYXRpb246MTI6YUJjRA==`); required for evaluator creation and other downstream operations
- Scoping = visibility rules controlling which spaces or users can use an integration
- Auth type = how Arize authenticates with the provider: a provider API key, a proxy via custom headers, or bearer-token auth
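Integration IDs are ordinary base64 strings, so when debugging you can peek inside one (treat them as opaque in real workflows). A quick sketch using the example ID above:

```bash
# Decode the example integration ID from the Concepts list. The ID is a
# base64-encoded global identifier; do not construct or edit IDs by hand.
id="TGxtSW50ZWdyYXRpb246MTI6YUJjRA=="
decoded=$(echo "$id" | base64 -d)   # on older macOS: base64 -D
echo "$decoded"
```

The example decodes to `LlmIntegration:12:aBcD`, a typename plus internal identifiers.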
Prerequisites
Three things are needed: the `ax` CLI, an API key (env var or profile), and a space ID.
Install ax
If `ax` is not installed, not on PATH, or below the required version, see ax-setup.md.
Verify environment
Run a quick check for credentials:
macOS/Linux (bash):
```bash
ax --version && echo "--- env ---" && if [ -n "$ARIZE_API_KEY" ]; then echo "ARIZE_API_KEY: (set)"; else echo "ARIZE_API_KEY: (not set)"; fi && echo "ARIZE_SPACE_ID: ${ARIZE_SPACE_ID:-(not set)}" && echo "--- profiles ---" && ax profiles show 2>&1
```
Windows (PowerShell):
```powershell
ax --version; Write-Host "--- env ---"; Write-Host "ARIZE_API_KEY: $(if ($env:ARIZE_API_KEY) { '(set)' } else { '(not set)' })"; Write-Host "ARIZE_SPACE_ID: $env:ARIZE_SPACE_ID"; Write-Host "--- profiles ---"; ax profiles show 2>&1
```
Read the output and proceed immediately if either the env var or the profile has an API key. Only ask the user if both are missing. Resolve failures:
- No API key in env and no profile → AskQuestion: "Arize API key (https://app.arize.com/admin > API Keys)"
- Space ID unknown → list all accessible spaces and pick the right one, or AskQuestion if the user prefers to provide it directly
List AI Integrations
List all integrations accessible in a space:
```bash
ax ai-integrations list --space-id SPACE_ID
```
Filter by name (case-insensitive substring match):
```bash
ax ai-integrations list --space-id SPACE_ID --name "openai"
```
Paginate large result sets:
```bash
# Get first page
ax ai-integrations list --space-id SPACE_ID --limit 20 -o json
# Get next page using cursor from previous response
ax ai-integrations list --space-id SPACE_ID --limit 20 --cursor CURSOR_TOKEN -o json
```
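The two commands above generalize to a loop: keep requesting pages and feeding each response's cursor back in until no cursor comes back. A minimal sketch, with a stub standing in for the real `ax ai-integrations list ... -o json` call; the response shape (an `integrations` array plus a `cursor` field) is an assumption to verify against your actual JSON:

```bash
#!/usr/bin/env bash
# Drain every page by following cursors until the response has none left.
# `fetch_page` is a stand-in for:
#   ax ai-integrations list --space-id SPACE_ID --limit 20 --cursor "$1" -o json
set -eo pipefail

fetch_page() {  # $1 = cursor; empty string means "first page"
  case "$1" in
    "")    echo '{"integrations":["int-a","int-b"],"cursor":"page2"}' ;;
    page2) echo '{"integrations":["int-c"],"cursor":""}' ;;
  esac
}

cursor=""
all=""
while :; do
  page=$(fetch_page "$cursor")
  # Crude extraction that fits the mock payload; prefer jq on real output,
  # e.g. `jq -r '.integrations[]'` and `jq -r '.cursor'`.
  names=$(echo "$page" | grep -o 'int-[a-z]')
  all="$all $names"
  cursor=$(echo "$page" | sed -n 's/.*"cursor":"\([^"]*\)".*/\1/p')
  [ -z "$cursor" ] && break
done
echo $all   # unquoted on purpose: collapses whitespace between names
```

Run as-is, the stub yields `int-a int-b int-c`; swap `fetch_page` for the real command to paginate a live space.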
Key flags:
| Flag | Description |
|---|---|
| `--space-id` | Space to list integrations in |
| `--name` | Case-insensitive substring filter on integration name |
| `--limit` | Max results (1–100, default 50) |
| `--cursor` | Pagination token from a previous response |
| `-o` | Output format; use `json` for machine-readable output |
Response fields:
- Base64 integration ID — copy this for downstream commands
- Human-readable name
- LLM provider enum (see Supported Providers below)
- Whether credentials are stored
- Allowed model list; empty when all models are enabled
- Whether default models for this provider are allowed
- Whether tool/function calling is enabled
- Authentication method (API key, custom headers, or bearer token)
Get a Specific Integration
```bash
ax ai-integrations get INT_ID
ax ai-integrations get INT_ID -o json
```
Use this to inspect an integration's full configuration or to confirm its ID after creation.
Create an AI Integration
Before creating, always list integrations first — the user may already have a suitable one:
```bash
ax ai-integrations list --space-id SPACE_ID
```
If no suitable integration exists, create one. The required flags depend on the provider.
OpenAI
```bash
ax ai-integrations create \
  --name "My OpenAI Integration" \
  --provider openAI \
  --api-key "sk-..."
```
Anthropic
```bash
ax ai-integrations create \
  --name "My Anthropic Integration" \
  --provider anthropic \
  --api-key "sk-ant-..."
```
Azure OpenAI
```bash
ax ai-integrations create \
  --name "My Azure OpenAI Integration" \
  --provider azureOpenAI \
  --api-key "AZURE_API_KEY" \
  --base-url "https://my-resource.openai.azure.com/"
```
AWS Bedrock
AWS Bedrock uses IAM role-based auth instead of an API key. Provide the ARN of the role Arize should assume:
```bash
ax ai-integrations create \
  --name "My Bedrock Integration" \
  --provider awsBedrock \
  --role-arn "arn:aws:iam::123456789012:role/ArizeBedrockRole"
```
Vertex AI
Vertex AI uses GCP service account credentials. Provide the GCP project and region:
```bash
ax ai-integrations create \
  --name "My Vertex AI Integration" \
  --provider vertexAI \
  --project-id "my-gcp-project" \
  --location "us-central1"
```
Gemini
```bash
ax ai-integrations create \
  --name "My Gemini Integration" \
  --provider gemini \
  --api-key "AIza..."
```
NVIDIA NIM
```bash
ax ai-integrations create \
  --name "My NVIDIA NIM Integration" \
  --provider nvidiaNim \
  --api-key "nvapi-..." \
  --base-url "https://integrate.api.nvidia.com/v1"
```
Custom (OpenAI-compatible endpoint)
```bash
ax ai-integrations create \
  --name "My Custom Integration" \
  --provider custom \
  --base-url "https://my-llm-proxy.example.com/v1" \
  --api-key "optional-key-if-needed"
```
Supported Providers
| Provider | Required extra flags |
|---|---|
| `openAI` | `--api-key` |
| `anthropic` | `--api-key` |
| `azureOpenAI` | `--api-key`, `--base-url <azure-endpoint>` |
| `awsBedrock` | `--role-arn` |
| `vertexAI` | `--project-id <gcp-project>`, `--location` |
| `gemini` | `--api-key` |
| `nvidiaNim` | `--api-key`, `--base-url <nim-endpoint>` |
| `custom` | `--base-url` (`--api-key` optional) |
Optional flags for any provider
- `--model-names` — comma-separated list of allowed model names; omit to allow all models
- An enable/disable flag pair for the provider's default model list
- An enable/disable flag pair for tool/function calling support
After creation
Capture the returned integration ID (e.g., `TGxtSW50ZWdyYXRpb246MTI6YUJjRA==`) — it is needed for evaluator creation and other downstream commands. If you missed it, retrieve it:
```bash
ax ai-integrations list --space-id SPACE_ID -o json
# or, if you know the ID:
ax ai-integrations get INT_ID
```
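When scripting the recovery step, the ID can be pulled out of the JSON by integration name. A sketch against a canned response — the field names (`id`, `name`) and the top-level array shape are assumptions about the output, so verify them before relying on this:

```bash
# Recover an integration ID from `list -o json` output by integration name.
# With jq installed, the equivalent lookup would be roughly:
#   ax ai-integrations list --space-id SPACE_ID -o json \
#     | jq -r '.[] | select(.name == "My OpenAI Integration") | .id'
# (field names and array shape assumed -- check your real output)
resp='[{"id":"TGxtSW50ZWdyYXRpb246MTI6YUJjRA==","name":"My OpenAI Integration"}]'
id=$(echo "$resp" | sed -n 's/.*"id":"\([^"]*\)".*"name":"My OpenAI Integration".*/\1/p')
echo "$id"
```

Printing `$id` here yields the example ID from the canned response.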
Update an AI Integration
`update` is a partial update — only the flags you provide are changed. Omitted fields stay as-is.
```bash
# Rename
ax ai-integrations update INT_ID --name "New Name"

# Rotate the API key
ax ai-integrations update INT_ID --api-key "sk-new-key..."

# Change the model list
ax ai-integrations update INT_ID --model-names "gpt-4o,gpt-4o-mini"

# Update base URL (for Azure, custom, or NIM)
ax ai-integrations update INT_ID --base-url "https://new-endpoint.example.com/v1"
```
Any flag accepted by `create` can be passed to `update`.
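The merge behavior can be pictured as a patch applied over the stored config: supplied flags overwrite, everything else survives. A toy illustration of the semantics (not how `ax` stores anything):

```bash
# Toy model of partial-update semantics: a supplied flag overwrites the
# stored value; an omitted flag leaves the stored value untouched.
stored_name="Old Name"
stored_base_url="https://old.example.com/v1"

# Imagine `ax ai-integrations update INT_ID --name "New Name"` -- only the
# name is in the patch, so patch_base_url stays unset.
patch_name="New Name"

stored_name=${patch_name:-$stored_name}
stored_base_url=${patch_base_url:-$stored_base_url}

echo "name:     $stored_name"
echo "base_url: $stored_base_url"
```

After the merge the name is replaced while the base URL keeps its old value, matching the "omitted fields stay as-is" rule above.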
Delete an AI Integration
Warning: Deletion is permanent. Evaluators that reference this integration will no longer be able to run.
```bash
ax ai-integrations delete INT_ID --force
```
Omit `--force` to get a confirmation prompt instead of deleting immediately.
Troubleshooting
| Problem | Solution |
|---|---|
| `ax` not installed or not on PATH | See ax-setup.md |
| Permission or access errors | API key may not have access to this space. Verify the key and space ID at https://app.arize.com/admin > API Keys |
| API key not found | Run `ax profiles show --expand`; set the env var or save a profile (see ax-profiles.md) |
| Integration not found | Verify the ID with `ax ai-integrations list --space-id SPACE_ID` |
| Missing credentials after create | Credentials were not saved — re-run create with the correct credential flag (`--api-key` or `--role-arn`) |
| Evaluator runs fail with LLM errors | Check integration credentials with `ax ai-integrations get INT_ID`; rotate the API key if needed |
| Provider mismatch | The provider cannot be changed after creation — delete and recreate with the correct provider |
Related Skills
- arize-evaluator: Create LLM-as-judge evaluators that use an AI integration
- arize-experiment: Run experiments that use evaluators backed by an AI integration
Save Credentials for Future Use
At the end of the session, if the user manually provided any credentials during this conversation and those values were NOT already loaded from a saved profile or environment variable, offer to save them.
Skip this entirely if:
- The API key was already loaded from an existing profile or env var
- The space ID was already set via env var
How to offer: Use AskQuestion: "Would you like to save your Arize credentials so you don't have to enter them next time?" with Yes/No options.
If the user says yes:
- API key — see ax-profiles.md. Run `ax profiles show` to check the current state, then use the profile commands described there with the appropriate flags to save the key (and region if relevant).
- Space ID — see ax-profiles.md (Space ID section) to persist it as an environment variable.