CrewAI 1.0+ changed how LLM calls are routed. Starting with CrewAI 1.0 (GA October 2025), the framework routes LLM calls to native provider SDKs (OpenAI, Anthropic, etc.) based on the model string prefix. LiteLLM is only used as a fallback for providers without native support. This means the LLM-level instrumentor you install must match the provider CrewAI is actually calling. This page covers both CrewAI 1.0+ and 0.x instrumentation.
Launch Arize
To get started, sign up for a free Arize account and get your Space ID and API Key.
Install
- CrewAI 1.0+
- CrewAI 0.x (Legacy)
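A minimal install for the default OpenAI routing might look like the following (package names are the standard OpenInference/Arize distributions; swap the LLM-level instrumentor for your provider):

```shell
# Arize OTel helper + CrewAI framework instrumentor + LLM-level instrumentor.
# openinference-instrumentation-openai assumes OpenAI-routed model strings;
# use the anthropic or litellm variant if your models route elsewhere.
pip install arize-otel \
    openinference-instrumentation-crewai \
    openinference-instrumentation-openai \
    crewai crewai-tools
```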
The example above uses openinference-instrumentation-openai, which covers most default configurations. If your agents use anthropic/-prefixed models, install openinference-instrumentation-anthropic instead. For providers that fall back to LiteLLM (Groq, Together, etc.), use openinference-instrumentation-litellm.
API Key Setup
Your CrewAI agents will likely require API keys for the LLMs and tools they use. Configure these as environment variables. The example below uses OpenAI and SerperDevTool.
Which Instrumentor Do I Need?
CrewAI 1.0+ routes LLM calls to different provider SDKs based on your model string. Match your instrumentor to the provider CrewAI is calling:

| Model string | Provider SDK used | Instrumentor package |
|---|---|---|
| openai/gpt-4o, gpt-4o, or no prefix | OpenAI (native) | openinference-instrumentation-openai |
| anthropic/claude-sonnet-4-6 | Anthropic (native) | openinference-instrumentation-anthropic |
| Groq, Together, or other providers without native support | LiteLLM (fallback) | openinference-instrumentation-litellm |

The CrewAI instrumentor (openinference-instrumentation-crewai) is always required regardless of provider. It captures Crew, Agent, and Task spans.
Setup Tracing
- CrewAI 1.0+
- CrewAI 0.x (Legacy)
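For CrewAI 1.0+ with OpenAI-routed models, the setup is a sketch along these lines (the project name and environment variable names are illustrative; swap OpenAIInstrumentor for your provider's instrumentor per the table above):

```python
import os

from arize.otel import register
from openinference.instrumentation.crewai import CrewAIInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

# register() configures and sets the global TracerProvider for Arize.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="crewai-tracing",  # hypothetical project name
)

# Both instrument() calls pick up the global TracerProvider set above.
# These MUST run before `from crewai import ...` anywhere in your app.
CrewAIInstrumentor().instrument()
OpenAIInstrumentor().instrument()
```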
The tracer_provider= kwarg on each .instrument() call is optional when register() has already set the global TracerProvider. Omitting it is cleaner and avoids bugs where different instrumentors accidentally receive different provider instances.
Run CrewAI Example
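A minimal crew might look like the sketch below; the role, goal, and task strings and the openai/gpt-4o model choice are illustrative, and running it assumes OPENAI_API_KEY and SERPER_API_KEY are set:

```python
from crewai import Agent, Crew, Task
from crewai_tools import SerperDevTool

# A single-agent crew: one researcher with a web-search tool.
researcher = Agent(
    role="Research Analyst",
    goal="Find recent news about AI observability",
    backstory="You dig up and summarize current information.",
    tools=[SerperDevTool()],
    llm="openai/gpt-4o",  # openai/ prefix -> native OpenAI SDK routing
)

report = Task(
    description="Summarize the top three findings in a short report.",
    expected_output="A three-bullet summary.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[report])
result = crew.kickoff()
print(result)
```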
Define your agents and tasks, then kick off the crew. This example assumes instrumentation has already been configured in the Setup Tracing step above.
Observe
Traces from CrewAI operations (Crew, Agent, and Task spans) and the underlying LLM calls will be streamed to your Arize account. This provides a comprehensive view of your multi-agent system.
Troubleshooting
I see Crew/Agent/Task spans but no LLM spans
This is the most common issue with CrewAI 1.0+. It means the LLM instrumentor does not match the provider CrewAI is routing to. For example, if your model string is openai/gpt-4o, CrewAI calls the native OpenAI SDK directly and never touches LiteLLM, so openinference-instrumentation-litellm will see nothing.
Fix: Check your model string prefix and install the matching instrumentor (see Which Instrumentor Do I Need? above).
Spans are missing entirely (no Crew spans, no LLM spans)
This is almost always an import-order problem. Both the CrewAI instrumentor (which uses wrapt to wrap class methods) and LLM-level instrumentors (which use setattr to replace module-level functions) must run before the target library is imported.
Fix: Make sure all .instrument() calls execute before any from crewai import ... line.
A common anti-pattern: if your instrumentation lives in a separate module (e.g., tracing.py), make sure it is imported and executed before any module that imports crewai. For example, this breaks:
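A sketch of the anti-pattern, where tracing.py is a hypothetical module containing the register() and .instrument() calls:

```python
# main.py -- BROKEN: crewai loads before the instrumentors run.
from crewai import Agent, Crew, Task  # crewai imported first...
import tracing                        # ...so the patches applied here arrive too late

# main.py -- FIXED: importing tracing first runs .instrument()
# before crewai is ever imported.
import tracing
from crewai import Agent, Crew, Task
```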
Version check errors with new CrewAI releases
If you are using a release candidate or very new CrewAI version and the instrumentor fails a dependency check, pass skip_dep_check=True:
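For example, on the CrewAI instrumentor (skip_dep_check is handled by the underlying OpenTelemetry BaseInstrumentor and bypasses its package-version compatibility check):

```python
from openinference.instrumentation.crewai import CrewAIInstrumentor

# Skips the declared-dependency version check; use only when you have
# verified the new CrewAI release actually works with this instrumentor.
CrewAIInstrumentor().instrument(skip_dep_check=True)
```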