
LangGraph is a stateful multi-actor agent framework built on top of LangChain. Arize AX captures every LangGraph run — graph node invocations, tool calls, LLM calls, and the message state passing through them — via the openinference-instrumentation-langchain package, the same instrumentor that covers LangChain.

LangGraph Tracing Tutorial (Google Colab)

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel \
  openinference-instrumentation-langchain \
  langgraph langchain-openai

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="langgraph-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"
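
A missing or empty variable is the most common reason traces never appear, so it can save a debugging round trip to fail fast before running anything. A minimal stdlib-only sketch (the variable names come from the exports above):

```python
# check_env.py: warn early if required configuration is missing
import os

REQUIRED_VARS = (
    "ARIZE_SPACE_ID",
    "ARIZE_API_KEY",
    "ARIZE_PROJECT_NAME",
    "OPENAI_API_KEY",
)


def missing_vars(env=os.environ, required=REQUIRED_VARS):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not env.get(name)]


missing = missing_vars()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```

Run this in the same shell (and the same process environment) that will run the tutorial scripts.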

Set up tracing

# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.langchain import LangChainInstrumentor

tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name=os.environ["ARIZE_PROJECT_NAME"],
)

LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
print("Arize AX tracing initialized for LangGraph.")

Run LangGraph

# example.py

# Importing instrumentation first ensures tracing is set up
# before `langgraph` is imported.
from instrumentation import tracer_provider

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city. Returns a short string."""
    if city.lower() in ("sf", "san francisco"):
        return "It's 60 degrees and foggy in San Francisco."
    return f"It's 75 degrees and sunny in {city}."


# ChatOpenAI reads OPENAI_API_KEY from the environment.
agent = create_react_agent(
    model=ChatOpenAI(model="gpt-5"),
    tools=[get_weather],
)

result = agent.invoke({
    "messages": [("user", "What's the weather in San Francisco?")],
})

print(result["messages"][-1].content)

Expected output

Arize AX tracing initialized for LangGraph.
The current weather in San Francisco is 60 degrees and foggy.
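
Note that result["messages"] holds the full conversation state, not just the answer: the human input, the assistant's tool-call message, the tool result, and the final reply. If you want a specific message rather than just the last one, a small sketch written against plain attributes (LangChain messages expose a .type string such as "human", "ai", or "tool" and a .content value, but treat that as an assumption to verify against your installed version):

```python
def last_message(messages, msg_type="ai"):
    """Return the content of the most recent message whose .type matches,
    or None if no such message exists."""
    for msg in reversed(messages):
        if getattr(msg, "type", None) == msg_type:
            return msg.content
    return None
```

For example, last_message(result["messages"], "tool") would return the raw string produced by get_weather, while last_message(result["messages"]) returns the final answer.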

Verify in Arize AX

  1. Open your Arize AX space and select project langgraph-tracing-example.
  2. Within about 30 seconds you should see a new trace containing a LangGraph parent span that wraps the agent's reasoning loop: agent node spans (the ChatOpenAI calls), the tools node span (the get_weather invocation), and the final answer.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • Graph ran but no spans appear. LangChainInstrumentor().instrument(...) must run before any langgraph or langchain import. Make sure instrumentation.py is the first import in your entry point.
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap for a model your key can call.
  • Other LLM providers. Install the matching langchain-<provider> package (e.g. langchain-anthropic) and pass that chat model to create_react_agent. The same LangChainInstrumentor covers every provider.
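
A related failure mode: spans are typically exported in background batches, and a script that exits immediately after agent.invoke can drop its final batch. One common pattern is to flush the tracer provider on exit. A sketch of that pattern, with a stub standing in for the real provider so the snippet runs anywhere; the real object returned by arize.otel.register is an OpenTelemetry SDK TracerProvider whose force_flush method takes a timeout_millis argument and blocks until queued spans are exported:

```python
import atexit


# Stand-in for the provider returned by arize.otel.register(); shown here
# only so the snippet is self-contained. In your code, use the real
# tracer_provider from instrumentation.py instead.
class StubTracerProvider:
    def force_flush(self, timeout_millis=30_000):
        # The real force_flush returns True once all queued spans are sent.
        return True


tracer_provider = StubTracerProvider()

# Flushing on exit keeps short-lived scripts from dropping their last spans.
atexit.register(tracer_provider.force_flush)
```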

Resources

LangGraph Documentation

OpenInference LangChain Instrumentor (used for LangGraph)

LangChain Tracing Guide