

MCP Servers

Arize AX offers two Model Context Protocol (MCP) servers that enhance your development workflow by integrating Arize AX support, documentation, and instrumentation guidance directly into your IDE or coding agent.

✨ Start Here

Arize AX Tracing Assistant

Best for: Instrumentation guidance and implementation support
Installation →
The Arize Tracing Assistant provides hands-on help with adding tracing to your applications:
  • Instrumentation Guides: Step-by-step guides for adding Arize AX tracing to your codebase
  • Curated Examples: Framework-specific tracing examples and best practices
  • Direct Support Access: Natural language queries to Arize support directly from your development environment
  • Code Analysis: Analyzes your codebase to identify instrumentation needs and propose implementation plans

Arize AX Docs MCP Server

Best for: Comprehensive documentation access and reference
The AX Docs MCP Server provides searchable access to the entire Arize AX documentation knowledge base:
  • Full Documentation Access: Search across all Arize AX documentation, guides, and references
  • Code Examples: Find relevant code examples and implementation patterns
  • API References: Access API documentation and endpoint details
  • Feature Guides: Understand how features work and locate implementation details
Best Experience: For the most comprehensive development workflow, use both MCP servers together.

Usage

Using the AX Tracing Assistant

Once the Tracing Assistant MCP server is running, you can ask your IDE or LLM natural-language questions like:
  • “Instrument this app using Arize AX”
  • “Can you use manual instrumentation so that I have more control over my traces?”
  • “How can I redact sensitive information from my spans?”
  • “Can you make sure the context of this trace is propagated across these tool calls?”
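To make the redaction question above concrete: scrubbing span attributes typically means masking the values of sensitive keys before spans are exported. The sketch below is framework-agnostic and illustrative only; the attribute key names are assumptions, not a fixed Arize AX schema.

```python
# Illustrative set of attribute keys to scrub; adjust to your span schema.
SENSITIVE_KEYS = {"input.value", "llm.prompts", "user.email"}

def redact_attributes(attributes: dict) -> dict:
    """Return a copy of span attributes with sensitive values masked."""
    return {
        key: "[REDACTED]" if key in SENSITIVE_KEYS else value
        for key, value in attributes.items()
    }
```

In practice you would apply logic like this inside a span processor or via your instrumentation library's masking options; the Tracing Assistant can point you to the mechanism appropriate for your stack.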
For comprehensive instrumentation assistance, use this prompt in your editor’s agent mode with the MCP server enabled:
Goal: Add Arize AX tracing to my application using the following workflow:

PHASE 1 — ANALYSIS ONLY (READ-ONLY)
- Scan the codebase to identify:
  - The primary application language (Python, Java, TypeScript, or JavaScript)
  - LLM calls, agent/orchestration frameworks, and retrieval/embedding logic
  - Existing tracing, OpenTelemetry, or logging instrumentation
- Infer the stack (LLM provider, framework, runtime)
- Identify applicable OpenInference packages based on framework, platform, and/or LLM provider
  - When a framework is used, also include the OpenInference package for the LLM provider; the framework's OpenInference package does not include tracing for the LLM calls themselves
- Identify where manual instrumentation is needed (if any)
- Flag conflicts or overlapping instrumentation
- Propose a minimal, safe instrumentation plan

Rules for Phase 1:
- Do NOT modify code
- Do NOT create or write any files
- Do NOT generate markdown summary files
- Return a single concise summary + implementation plan

PHASE 2 — IMPLEMENTATION (ONLY AFTER PHASE 1)
- Use auto-instrumentation first; manual spans only if required
- Do NOT change business logic or behavior
- Instrumentation must be optional and fail gracefully if env vars are missing
- Create a single centralized instrumentation module (language-appropriate)
- Avoid creating unnecessary files or markdown docs

Deliverables (Phase 2, single response):
- Dependency install commands (pip / npm / pnpm / maven / gradle as applicable)
- Full instrumentation setup code (copy-paste ready)
- Where/how to initialize at application startup
- Required environment variables + .env example
- Minimal before/after code examples
- How to verify traces in Arize AX
- Troubleshooting tips

Behavior:
- Prefer inspection over mutation
- Keep output concise and production-focused
- Do not generate excessive documentation or summary files
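The centralized, fail-gracefully instrumentation module the prompt asks for in Phase 2 might look like the following sketch, assuming a Python app that uses LangChain with OpenAI. The `arize-otel` `register()` helper and the OpenInference instrumentor packages come from Arize's Python tooling; the module layout, env var names, and project name here are illustrative, not prescribed. The function is a no-op when configuration or packages are missing, so the app runs unchanged without tracing.

```python
import os

def setup_tracing():
    """Enable Arize AX tracing if configured; silently no-op otherwise."""
    space_id = os.environ.get("ARIZE_SPACE_ID")
    api_key = os.environ.get("ARIZE_API_KEY")
    if not (space_id and api_key):
        # Fail gracefully: missing env vars must not break the app.
        return None
    try:
        # Assumed packages: arize-otel plus the OpenInference instrumentors
        # for BOTH the framework (LangChain) and the LLM provider (OpenAI).
        from arize.otel import register
        from openinference.instrumentation.langchain import LangChainInstrumentor
        from openinference.instrumentation.openai import OpenAIInstrumentor
    except ImportError:
        # Tracing dependencies not installed; keep running untraced.
        return None
    tracer_provider = register(
        space_id=space_id,
        api_key=api_key,
        project_name="my-app",  # illustrative project name
    )
    LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
    OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
    return tracer_provider
```

Call `setup_tracing()` once at application startup, before any LLM clients or chains are constructed, so that all calls are captured.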

Using the AX Docs MCP Server

Once the AX Docs MCP server is running, you can ask your IDE or LLM to search the documentation:
  • “How do I set up evaluation for my RAG application?”
  • “Show me examples of tracing with LangChain”
  • “What are the available evaluators in Arize AX?”
  • “How do I configure offline evaluations?”
The server will search across all Arize AX documentation and return relevant content with direct links to documentation pages.

Installation

The Arize AX Tracing Assistant is distributed via uv, a fast Python package manager. The AX Docs MCP Server provides HTTP-based access to the Arize AX documentation knowledge base.

Install uv (Required for Tracing Assistant only)

📦 View Tracing Assistant on PyPI
The Tracing Assistant requires uv. The Docs MCP Server does not require any installation.
pip install uv
# or
brew install uv

Claude Code Integration

You can add the Tracing Assistant directly from the command line:
claude mcp add arize-tracing-assistant uvx arize-tracing-assistant@latest
Alternatively, to add it via JSON:
claude mcp add-json arize-tracing-assistant '{"command": "uvx", "args": ["arize-tracing-assistant@latest"]}'
To verify the server was added correctly:
claude mcp list

Gemini CLI Integration

Install it as a Gemini CLI extension (see the Gemini CLI homepage):
gemini extensions install https://github.com/Arize-ai/arize-tracing-assistant

IDE Integration

Cursor IDE

  1. Navigate to: Settings → MCP
  2. Click Add new global MCP server
  3. Insert the following into your config JSON:
    "arize-tracing-assistant": {
      "command": "uvx",
      "args": ["arize-tracing-assistant@latest"]
    }
    
Claude Desktop
  1. Open: Settings → Developer → Edit Config
  2. Add the following config:
    "mcpServers": {
      "arize-tracing-assistant": {
        "command": "uvx",
        "args": ["arize-tracing-assistant@latest"]
      }
    }
    
Manual MCP Config
Add this snippet to your mcpServers config section:
"mcpServers": {
  "arize-tracing-assistant": {
    "command": "uvx",
    "args": ["arize-tracing-assistant@latest"]
  }
}

Validate Agent Plans with Ask AI

Use the Ask AI feature in these docs to validate implementation plans generated by your LLM or coding agent.
  1. Generate a plan with your preferred agent (ChatGPT, Claude Code, Cursor, etc.)
  2. Copy the plan from your agent’s response
  3. Paste into Ask AI next to the search bar on these docs
  4. Ask for validation to check against Arize AX best practices
Combine approaches for best results. Use MCP for real-time guidance while coding, then validate your final plan with Ask AI before deploying.