
Instructor is a Python library for getting Pydantic-typed structured output out of LLMs. Arize AX captures every Instructor extraction (the patched chat completion call, its retries, and any validation errors) via the openinference-instrumentation-instructor package, used alongside the instrumentor for the underlying LLM client, here OpenAI.

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel \
  openinference-instrumentation-instructor \
  openinference-instrumentation-openai \
  instructor openai

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="instructor-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"

Set up tracing

# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.instructor import InstructorInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name=os.environ["ARIZE_PROJECT_NAME"],
)

InstructorInstrumentor().instrument(tracer_provider=tracer_provider)
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
print("Arize AX tracing initialized for Instructor.")
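register() reads the three Arize variables above; if any is unset, os.environ[...] raises a bare KeyError. A small stdlib-only guard (the function names here are illustrative, not part of arize-otel) makes misconfiguration explicit before instrumentation starts:

```python
import os

REQUIRED = ("ARIZE_SPACE_ID", "ARIZE_API_KEY", "ARIZE_PROJECT_NAME", "OPENAI_API_KEY")


def missing_env(required=REQUIRED):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]


def assert_env(required=REQUIRED):
    """Call at the top of instrumentation.py, before register()."""
    missing = missing_env(required)
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
```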

Run Instructor

# example.py

# Importing instrumentation first ensures tracing is set up
# before `instructor` and `openai` are imported.
from instrumentation import tracer_provider

import instructor
from openai import OpenAI
from pydantic import BaseModel


class UserInfo(BaseModel):
    name: str
    age: int


client = instructor.from_openai(OpenAI())

user_info = client.chat.completions.create(
    model="gpt-5",
    response_model=UserInfo,
    messages=[
        {
            "role": "user",
            "content": "John Doe is 30 years old.",
        },
    ],
)

print(f"Name: {user_info.name}, Age: {user_info.age}")
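To see Instructor's retry spans in practice, you can tighten the response model with a validator: when validation fails, Instructor feeds the error back to the model and re-prompts, and each attempt shows up as its own child span in the trace. A sketch (the class name, validator, and age range are illustrative, not part of this guide):

```python
# A stricter response model: a failed validation makes Instructor
# re-prompt, producing an additional child span per attempt.
from pydantic import BaseModel, field_validator


class StrictUserInfo(BaseModel):
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def age_in_range(cls, v: int) -> int:
        # Illustrative constraint; the error text is what gets
        # fed back to the model on retry.
        if not 0 < v < 130:
            raise ValueError("age must be between 1 and 129")
        return v
```

Swap StrictUserInfo for UserInfo in the create(...) call above, and pass max_retries=2 (an Instructor keyword argument) to allow more than one corrective attempt.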

Expected output

Arize AX tracing initialized for Instructor.
Name: John Doe, Age: 30

Verify in Arize AX

  1. Open your Arize AX space and select project instructor-tracing-example.
  2. You should see a new trace within ~30 seconds containing an instructor.patch parent span plus nested OpenAI ChatCompletion LLM spans with the prompt, response, and token usage attached. Instructor emits multiple child spans per call when it validates and (where applicable) retries.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • Instructor spans missing but OpenAI spans present. InstructorInstrumentor().instrument(...) must run before instructor.from_openai(...). Make sure instrumentation.py is the first import in your entry point.
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap for a model your key can call.
  • Pydantic ValidationError retries. Instructor retries on validation failure (default 1); each attempt appears as a separate child span. Pass a larger max_retries= on the call to see more of them, or tighten the prompt to avoid retries altogether.

Resources

Instructor Documentation

OpenInference Instructor Instrumentor

Instructor GitHub