Auto-instrumentation covers supported frameworks. For everything else — custom logic, tool execution, unsupported frameworks, or fine-grained control — you create spans explicitly using the OpenTelemetry API with OpenInference Semantic Conventions.
# Ask your AI coding agent: "Wrap my custom tool functions with manual spans using OpenInference conventions"
Works with Cursor, Claude Code, Codex, and more. The skill picks the right span kinds, sets the right OpenInference attributes, and handles context propagation for you:
Set up the OpenTelemetry SDK, register a tracer provider with your Arize credentials, and create spans with OpenInference attributes.
For EU-region spaces, swap the endpoint to otlp.eu-west-1a.arize.com. Set ARIZE_SPACE_ID and ARIZE_API_KEY as environment variables — never embed credentials in source.
4. Create spans
Use your tracer to create spans — either as a context manager (to trace a specific block) or inline.
Python

```python
import openai
from opentelemetry.trace import Status, StatusCode

client = openai.OpenAI()

def run_agent(user_input: str) -> str:
    with tracer.start_as_current_span("run-agent") as span:
        span.set_attribute("openinference.span.kind", "CHAIN")
        span.set_attribute("input.value", user_input)
        response = call_llm(user_input)
        span.set_attribute("output.value", response)
        span.set_status(Status(StatusCode.OK))
        return response

def call_llm(prompt: str) -> str:
    with tracer.start_as_current_span("llm-completion") as span:
        span.set_attribute("openinference.span.kind", "LLM")
        span.set_attribute("input.value", prompt)
        span.set_attribute("llm.model_name", "gpt-4o")
        completion = client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": prompt}],
        )
        result = completion.choices[0].message.content or ""
        span.set_attribute("output.value", result)
        span.set_status(Status(StatusCode.OK))
        return result
```
Go has no first-party OpenInference helper package today, so set the attribute keys (openinference.span.kind, input.value, output.value, llm.model_name) as raw strings on the span.
Q: Do I have to use an SDK that supports OpenInference?
A: No; you can use any OpenTelemetry-compatible tracer. But if you instrument using the OpenInference schema (span kinds + attributes), you get better integration (analytics, visualization) in Arize AX.

Q: What if I'm capturing sensitive data (PII) in spans or attributes?
A: When using manual instrumentation, you must handle masking, redaction, or encryption as appropriate. See Mask and redact data.
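One lightweight approach is to scrub values before they ever reach a span attribute. The `scrub` helper and the email regex below are illustrative only; real PII masking usually needs a vetted library and a broader set of patterns:

```python
import re

# Illustrative pattern: matches common email shapes only. Production
# redaction should also cover phone numbers, names, account IDs, etc.
_EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrub(text: str) -> str:
    """Replace email addresses with a placeholder before attaching to a span."""
    return _EMAIL.sub("[REDACTED_EMAIL]", text)

# Usage with manual spans:
#   span.set_attribute("input.value", scrub(user_input))
```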