# instrument

Add Opik tracing to an existing codebase. Detects language (Python/TypeScript), identifies LLM frameworks, adds appropriate decorators and integrations, marks entrypoints, and wires up environment config. Use for "instrument my code", "add opik tracing", "add observability", or "trace my agent".

Source: comet-ml/opik-skills

Install:

```
npx skill4agent add comet-ml/opik-skills instrument
```

# Instrument — Add Opik Tracing to a Codebase
You are instrumenting an existing codebase with Opik observability. Follow these steps precisely.
## Step 1 — Scope

If `$ARGUMENTS` is provided, scope your work to those files or directories. Otherwise, discover the project root and instrument the main application code.

## Step 2 — Detect Language & Frameworks
Scan the codebase to determine:
- Language: Python (look for `*.py`, `pyproject.toml`, `requirements.txt`) or TypeScript (look for `*.ts`, `*.tsx`, `package.json`)
- LLM frameworks in use — search imports for these patterns:
| Import pattern | Framework | Integration |
|---|---|---|
| `openai` | OpenAI | `track_openai` |
| `anthropic` | Anthropic | `track_anthropic` |
| `langchain` | LangChain | `OpikTracer` |
| `langgraph` | LangGraph | `OpikTracer` |
| `crewai` | CrewAI | see `references/integrations.md` |
| `dspy` | DSPy | see `references/integrations.md` |
| `google.genai` | Google Gemini | see `references/integrations.md` |
| `boto3` (bedrock-runtime) | AWS Bedrock | see `references/integrations.md` |
| `llama_index` | LlamaIndex | see `references/integrations.md` |
| `litellm` | LiteLLM | `OpikLogger` |
| `pydantic_ai` | Pydantic AI | Logfire OTLP bridge |
| `google.adk` | Google ADK | see `references/integrations.md` |
| `ollama` | Ollama | see `references/integrations.md` |
| `agents` | OpenAI Agents SDK | see `references/integrations.md` |
| `haystack` | Haystack | see `references/integrations.md` |
| `openai` (TS) | OpenAI (TS) | `trackOpenAI` (`opik-openai`) |
| `ai` | Vercel AI SDK | `OpikExporter` (`opik-vercel`) |
| `langchain` (TS) | LangChain.js | `opik-langchain` |
| `@google/genai` | Gemini (TS) | `opik-gemini` |
- Existing Opik usage — check whether `opik` is already imported or `@opik.track` is already applied. If so, audit rather than re-instrument.
## Step 3 — Identify the Call Graph

Find:
- Entrypoint: the top-level function that kicks off the agent (e.g., `main`, `run`, `agent`, `handle_message`, a route handler, or whatever the user's main orchestration function is)
- LLM call sites: functions that call an LLM provider directly
- Tool functions: retrieval, search, API calls, or other tool-like operations
- Existing config classes: dataclasses, Pydantic models, or plain classes holding model names, temperatures, prompts, or other tunable parameters
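The roles above typically map onto a shape like the following. All names here are hypothetical, and the LLM call is stubbed; tracing is added in Steps 4-5:

```python
# Hypothetical agent illustrating the three roles from Step 3.
# Untraced on purpose — this is the "before" picture.

def search_docs(query: str) -> list[str]:
    """Tool function: retrieval / search / API call."""
    return [f"doc about {query}"]

def call_llm(prompt: str) -> str:
    """LLM call site (stubbed here; normally calls a provider)."""
    return f"answer to: {prompt}"

def handle_message(session_id: str, message: str) -> str:
    """Entrypoint: top-level orchestration the user invokes."""
    context = search_docs(message)
    return call_llm(f"{message} | context: {context}")
```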
## Step 4 — Add Framework Integrations

For each detected framework, add the appropriate integration at the module level. See the integration table above and `references/integrations.md` for the exact patterns.

Python examples:

```python
# OpenAI
from opik.integrations.openai import track_openai
client = track_openai(OpenAI())  # wrap existing client

# Anthropic
from opik.integrations.anthropic import track_anthropic
client = track_anthropic(anthropic.Anthropic())

# LangChain / LangGraph
from opik.integrations.langchain import OpikTracer
tracer = OpikTracer()
# pass config={"callbacks": [tracer]} to invoke()

# LiteLLM inside @opik.track — CRITICAL: pass span context
from opik.opik_context import get_current_span_data
# in every litellm.completion() call, add:
# metadata={"opik": {"current_span_data": get_current_span_data()}}
```

TypeScript examples:
```typescript
// OpenAI
import { trackOpenAI } from "opik-openai";
const trackedClient = trackOpenAI(openai);

// Vercel AI SDK
import { OpikExporter } from "opik-vercel";
// set up NodeSDK with OpikExporter
```

## Step 5 — Add `@opik.track` Decorators (Python) or Client Tracing (TypeScript)

### Python

Add `import opik` at the top of each file you instrument.

| Function role | Decorator |
|---|---|
| Entrypoint (top-level agent) | `@opik.track(entrypoint=True)` |
| LLM call | `@opik.track(type="llm")` |
| Tool / retrieval | `@opik.track(type="tool")` |
| Guardrail / validation | `@opik.track(type="guardrail")` |
| Other helper in the call chain | `@opik.track` |
- Place the decorator above any existing decorators (e.g., above `@app.route`)
- For async functions, `@opik.track` works the same way — no changes needed
- If the function is a script entrypoint (not a long-running server), add `opik.flush_tracker()` after the top-level call
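The ordering rule in the first bullet can be demonstrated without any dependencies. Here `track` and `route` are dependency-free stand-ins for `@opik.track` and a framework decorator like `@app.route` (an assumption for illustration, not Opik's API):

```python
# Decorator order demo: the tracing decorator sits ABOVE the framework
# decorator, so it wraps the handler rather than being hidden by it.
import functools

def track(fn):
    """Stand-in for @opik.track: counts calls instead of logging spans."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        wrapper.calls += 1  # pretend this records a span
        return fn(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

def route(fn):
    """Stand-in for a framework decorator such as @app.route."""
    return fn

@track   # tracing on top...
@route   # ...framework decorator below
def handler(msg: str) -> str:
    return msg.upper()
```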
### TypeScript

Use the client-based approach:

```typescript
import { Opik } from "opik";

const client = new Opik({ projectName: "<project-name>" });

// In the entrypoint function:
const trace = client.trace({ name: "<agent-name>", input: { ... } });
const span = trace.span({ name: "<operation>", type: "tool", input: { ... } });
// ... logic
span.end({ output: { ... } });
trace.end({ output: { ... } });
await client.flush();
```

For entrypoints that should be discoverable by `opik connect`:

```typescript
import { track } from "opik";

const myAgent = track(
  { name: "<agent-name>", entrypoint: true, params: [{ name: "query", type: "string" }] },
  async (query: string) => { /* ... */ }
);
```

## Step 6 — Conversational Agents: Add `thread_id`
If the agent handles multi-turn conversations (chat bots, support agents, multi-step assistants), wire up `thread_id`:

```python
@opik.track(entrypoint=True)
def handle_message(session_id: str, message: str) -> str:
    opik.update_current_trace(thread_id=session_id)
    return generate_response(session_id, message)
```

Skip this for single-shot agents or batch processing.
## Step 7 — Environment Config

Follow the setup decision tree from the main opik skill:

- If the project has `.env`/`.env.local` → append `OPIK_API_KEY`, `OPIK_WORKSPACE`, `OPIK_URL_OVERRIDE` (if missing)
- If no `.env` exists → Python: create/update `~/.opik.config`; TypeScript: create `.env` or `.env.local`
- Never introduce a second config mechanism
- Never overwrite existing values
- Update `.env.example`/`.env.sample` if one exists
- Set `project_name` in code, not in env files
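For the first branch of the decision tree, the appended block might look like this (placeholder values; treating `OPIK_URL_OVERRIDE` as self-hosted-only is an assumption here — check the main opik skill):

```
# Opik tracing (appended; existing values left untouched)
OPIK_API_KEY=<your-api-key>
OPIK_WORKSPACE=<your-workspace>
# Only for self-hosted deployments:
# OPIK_URL_OVERRIDE=<your-opik-url>/api
```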
## Step 8 — Install Dependencies

Print the install command but do NOT run it automatically. Let the user decide.

Python:

```
pip install opik
```

Plus any integration packages if needed (most are included in `opik`).

TypeScript:

```
npm install opik
```

Plus framework-specific packages as needed: `opik-openai`, `opik-vercel`, `opik-langchain`, `opik-gemini`.

## Step 9 — Verify
After instrumentation, do a quick audit:
- Every LLM call site is traced (via integration wrapper or `@opik.track`)
- Exactly one function has `entrypoint=True`
- Script entrypoints call `opik.flush_tracker()` (Python) or `await client.flush()` (TypeScript)
- LiteLLM calls inside `@opik.track` pass `current_span_data` via metadata
- No hardcoded API keys were introduced
- Existing tests still import correctly (no circular imports introduced)
## Anti-Patterns to Avoid

- Double-wrapping: Don't add `@opik.track(type="llm")` to a function that already uses a framework integration (e.g., `track_openai`). The integration handles tracing.
- Orphaned LiteLLM traces: Always pass `current_span_data` when `OpikLogger` is used inside `@opik.track` code.
- Missing entrypoint: Without `entrypoint=True`, the Local Runner (`opik connect`) won't discover the agent.
- Missing flush: Scripts that exit without flushing lose trace data.
- Overwriting config: Check before writing to `.env` or `~/.opik.config`.
## References

For detailed API signatures and advanced patterns, see:

- `../opik/references/tracing-python.md` — Python SDK reference
- `../opik/references/tracing-typescript.md` — TypeScript SDK reference
- `../opik/references/integrations.md` — All 40+ framework integrations
- `../opik/references/observability.md` — Core concepts (traces, spans, threads)