OpenRouter exposes a `/v1` endpoint that mirrors OpenAI's schema, making it fully compatible with OpenAI SDKs and OpenInference auto-instrumentation.
Prerequisites
- OpenRouter account and API key
- Arize account with Space ID and API Key
Why OpenRouter Works with OpenInference
Arize's OpenInference auto-instrumentation works with OpenRouter because:
- OpenAI-compatible endpoint: OpenRouter's `/v1` endpoint mirrors OpenAI's schema
- Official OpenAI SDKs: simply point the OpenAI client's `base_url` at OpenRouter
- Automatic instrumentation: OpenInference hooks into OpenAI SDK calls seamlessly
Install
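A typical install pulls in the OpenAI SDK, the Arize OTel helper, and the OpenInference OpenAI instrumentation. The package names below are the ones published on PyPI at the time of writing; verify them against the latest Arize docs:

```shell
# OpenAI SDK + Arize OTel registration helper + OpenInference instrumentation
pip install openai arize-otel openinference-instrumentation-openai
```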
Setup
- Set your OpenRouter API key:
- Initialize Arize and instrument OpenAI:
- Configure OpenAI client for OpenRouter:
- Make traced calls:
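The four steps above can be sketched as a single script. This is a minimal sketch, assuming the packages from the install step; the environment variable names, project name, and model name are illustrative, and the `register`/`instrument` calls follow the `arize-otel` and OpenInference APIs:

```python
# Sketch of the setup steps: set keys, initialize Arize, point the OpenAI
# client at OpenRouter, and make a traced call. Env var names, project name,
# and model name are illustrative.
import os

OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"  # OpenAI-compatible endpoint


def main():
    from arize.otel import register
    from openai import OpenAI
    from openinference.instrumentation.openai import OpenAIInstrumentor

    # Initialize Arize: register an OTel tracer provider that exports to
    # Arize, then instrument the OpenAI SDK so every client call emits a span.
    tracer_provider = register(
        space_id=os.environ["ARIZE_SPACE_ID"],
        api_key=os.environ["ARIZE_API_KEY"],
        project_name="openrouter-demo",  # hypothetical project name
    )
    OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

    # Configure the OpenAI client for OpenRouter: swap base_url and use your
    # OpenRouter API key, not an OpenAI key.
    client = OpenAI(
        base_url=OPENROUTER_BASE_URL,
        api_key=os.environ["OPENROUTER_API_KEY"],
    )

    # Make a traced call using an exact model name from OpenRouter's catalog.
    response = client.chat.completions.create(
        model="openai/gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello from OpenRouter!"}],
    )
    print(response.choices[0].message.content)
```

Call `main()` with `ARIZE_SPACE_ID`, `ARIZE_API_KEY`, and `OPENROUTER_API_KEY` set in the environment; the chat call's spans will appear in your Arize project.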
What Gets Traced
All OpenRouter model calls are automatically traced and include:
- Request/response data and timing
- Model name and provider information
- Token usage and cost data (when supported)
- Error handling and debugging information
JavaScript/TypeScript Support
OpenInference also provides instrumentation for the OpenAI JS/TS SDK, which works with OpenRouter. For setup and examples, please refer to the OpenInference JS examples for OpenAI.
Common Issues
- API Key: Use your OpenRouter API key, not OpenAI’s
- Model Names: Use exact model names from OpenRouter’s documentation
- Rate Limits: Check your OpenRouter dashboard for usage limits