

DSPy is a Stanford framework for declarative LLM programs — Signature, Predict, and other modules describe the structure of an LLM call without hard-coding prompts. Arize AX captures the DSPy module hierarchy plus the underlying LLM calls (DSPy routes through LiteLLM by default) via the openinference-instrumentation-dspy and openinference-instrumentation-litellm packages — install both for full visibility.

DSPy Tracing Tutorial (Google Colab)

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel \
  openinference-instrumentation-dspy \
  openinference-instrumentation-litellm \
  dspy

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="dspy-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"
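A common failure mode is running the script in a shell where one of these variables was never exported. The sketch below is a hypothetical preflight helper (not part of the Arize tooling) that reports any of the four variables still unset in the current process:

```python
import os

# Hypothetical preflight check: report any of the variables above that
# are still unset in this process's environment.
required = ["ARIZE_SPACE_ID", "ARIZE_API_KEY", "ARIZE_PROJECT_NAME", "OPENAI_API_KEY"]
missing = [name for name in required if not os.environ.get(name)]
print("All credentials set." if not missing else f"Still unset: {', '.join(missing)}")
```

Run it once before `example.py`; an empty report means the traced run will see the same credentials you exported.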

Set up tracing

# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.dspy import DSPyInstrumentor
from openinference.instrumentation.litellm import LiteLLMInstrumentor

tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name=os.environ["ARIZE_PROJECT_NAME"],
)

DSPyInstrumentor().instrument(tracer_provider=tracer_provider)
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)
print("Arize AX tracing initialized for DSPy.")

Run DSPy

# example.py

# Importing instrumentation first ensures tracing is set up
# before `dspy` is imported.
from instrumentation import tracer_provider

import dspy

# DSPy uses LiteLLM under the hood; LiteLLM reads OPENAI_API_KEY from env.
lm = dspy.LM("openai/gpt-5")
dspy.configure(lm=lm)


class BasicQA(dspy.Signature):
    """Answer questions concisely with a one-sentence factoid."""

    question = dspy.InputField()
    answer = dspy.OutputField()


qa = dspy.Predict(BasicQA)
result = qa(question="Why is the ocean salty?")

print(result.answer)

Expected output

Arize AX tracing initialized for DSPy.
The ocean is salty because rivers carry dissolved minerals from rocks and soil into the sea, where the water evaporates but the salts remain, accumulating over millions of years.

Verify in Arize AX

  1. Open your Arize AX space and select project dspy-tracing-example.
  2. You should see a new trace within ~30 seconds containing a DSPy Predict(BasicQA).forward parent span (the BasicQA signature) wrapping LiteLLM completion child spans with the prompt, response, and token usage attached.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • DSPy spans appear but no LLM-call detail. DSPy delegates model calls to LiteLLM; install and instrument openinference-instrumentation-litellm (already in the install command above).
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap the model string in dspy.LM("openai/<model>") for one your key can call.
  • Other LLM providers. DSPy via LiteLLM supports many providers — dspy.LM("anthropic/claude-3-5-sonnet-20241022"), dspy.LM("groq/llama-3.3-70b-versatile"), etc. The same LiteLLMInstrumentor covers them.
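If the `OTEL_LOG_LEVEL` environment variable has no visible effect in your setup, the OpenTelemetry Python loggers can also be raised to DEBUG programmatically. This is a minimal sketch using only the standard `logging` module; place it before the `register(...)` call in `instrumentation.py`:

```python
import logging

# Send log records to the console, then raise only the OpenTelemetry
# loggers to DEBUG so exporter activity (batching, export results)
# becomes visible without flooding the rest of the app's logs.
logging.basicConfig(level=logging.INFO)
logging.getLogger("opentelemetry").setLevel(logging.DEBUG)
```

Watching for export errors (e.g. authentication failures) in this output usually pinpoints why traces are not reaching Arize AX.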

Resources

DSPy Documentation

OpenInference DSPy Instrumentor

OpenInference LiteLLM Instrumentor

DSPy GitHub