BeeAI Framework is a Python / TypeScript framework from IBM for building production-grade AI agents with tools, memory, multi-step reasoning, and pluggable LLM backends. Arize AX captures every BeeAI agent run via the `openinference-instrumentation-beeai` package.
Prerequisites
- Python 3.11+
- An Arize AX account (sign up)
- An `OPENAI_API_KEY` from the OpenAI Platform
Launch Arize AX
- Sign in to your Arize AX account.
- From Space Settings, copy your Space ID and API Key. You will set them as `ARIZE_SPACE_ID` and `ARIZE_API_KEY` below.
Install
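A minimal install sketch, assuming pip and the commonly used distribution names (`beeai-framework` and `arize-otel` alongside the `openinference-instrumentation-beeai` package named above); confirm the exact package set against your environment:

```bash
pip install beeai-framework openinference-instrumentation-beeai arize-otel
```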
Configure credentials
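A sketch of the environment setup; the variable names match the steps above, and the placeholder values are yours to fill in:

```bash
export ARIZE_SPACE_ID="your-space-id"
export ARIZE_API_KEY="your-arize-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```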
Setup tracing
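A minimal `instrumentation.py` sketch, assuming the `arize.otel.register` convenience helper and the OpenInference BeeAI instrumentor; the project name matches the one used in Verify in Arize AX below:

```python
# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Register an OpenTelemetry tracer provider that exports spans to Arize AX.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="beeai-tracing-example",
)

# Instrument BeeAI. This must run before any `from beeai_framework import ...`.
BeeAIInstrumentor().instrument(tracer_provider=tracer_provider)
```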
Run BeeAI
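A runnable sketch of `example.py`; the `RequirementAgent` import path, the prompt, and the attribute used to read the answer are assumptions and may differ slightly by BeeAI version:

```python
# example.py
import asyncio

# Importing instrumentation first applies BeeAIInstrumentor before beeai_framework loads.
import instrumentation  # noqa: F401

from beeai_framework.agents.experimental import RequirementAgent  # import path is an assumption
from beeai_framework.backend import ChatModel


async def main() -> None:
    # The openai:gpt-5 slug matches the model referenced in this guide.
    llm = ChatModel.from_name("openai:gpt-5")
    agent = RequirementAgent(llm=llm)

    # RequirementAgent answers through its built-in final_answer tool,
    # which appears as the tool span in the trace.
    response = await agent.run("What is the capital of France?")
    print(response.answer.text)  # attribute name may vary by version


if __name__ == "__main__":
    asyncio.run(main())
```

Run it with `python example.py`.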
Expected output
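Exact wording varies by model and run; with the illustrative prompt above you should see a single answer printed, along the lines of:

```text
The capital of France is Paris.
```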
Verify in Arize AX
- Open your Arize AX space and select the project `beeai-tracing-example`.
- You should see a new trace within ~30 seconds with this shape: a `RequirementAgent` root span (AGENT) wraps an `OpenAIChatModel` LLM child span (model `gpt-5`, with prompt, response, and token usage attached) and a `final_answer` tool span.
- If no traces appear, see Troubleshooting.
Troubleshooting
- No traces in Arize AX. Confirm `ARIZE_SPACE_ID` and `ARIZE_API_KEY` are set in the same shell that runs `example.py`. Enable OpenTelemetry debug logs with `export OTEL_LOG_LEVEL=debug` and re-run.
- BeeAI spans missing but other spans present. `BeeAIInstrumentor().instrument(...)` must run before any `from beeai_framework import ...`. Make sure `instrumentation.py` is the first import in your entry point.
- `401` from OpenAI. Verify `OPENAI_API_KEY` is set and has access to `gpt-5`. Swap the `openai:gpt-5` slug in `ChatModel.from_name(...)` for a model your key can call.
- Other LLM providers. BeeAI delegates model calls to LiteLLM, so any LiteLLM-supported provider works: `ChatModel.from_name("anthropic:claude-3-5-sonnet-20241022")`, `ChatModel.from_name("groq:llama-3.3-70b-versatile")`, `ChatModel.from_name("ollama:granite3.1-dense:8b")`, and so on. The same `BeeAIInstrumentor` covers them; see the sketch after this list.