BeeAI Framework is a TypeScript / Python framework from IBM for building production-grade AI agents — tools, memory, multi-step reasoning, and pluggable LLM backends. Arize AX captures every BeeAI agent run via the `@arizeai/openinference-instrumentation-beeai` package.
Prerequisites
- Node.js 18+
- An Arize AX account (sign up)
- An `OPENAI_API_KEY` from the OpenAI Platform
Launch Arize AX
- Sign in to your Arize AX account.
- From Space Settings, copy your Space ID and API Key. You will set them as `ARIZE_SPACE_ID` and `ARIZE_API_KEY` below.
Install
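A minimal install sketch, assuming npm; the OpenTelemetry package names in the second command are assumptions that may vary with your exporter setup, and the exact pin follows the compatibility note below:

```shell
# Pin beeai-framework exactly: the instrumentor only patches >=0.1.9 <0.1.14.
npm install beeai-framework@0.1.13 @arizeai/openinference-instrumentation-beeai @ai-sdk/openai

# OpenTelemetry SDK and gRPC exporter used by the tracing setup (assumed packages).
npm install @opentelemetry/sdk-trace-node @opentelemetry/sdk-trace-base @opentelemetry/exporter-trace-otlp-grpc @opentelemetry/resources @grpc/grpc-js
```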
`@arizeai/openinference-instrumentation-beeai` only patches `beeai-framework` versions in the range `>=0.1.9 <0.1.14` (declared as `INSTRUMENTS` in the instrumentor source); the install command above pins to `0.1.13` to stay inside that range. Newer `beeai-framework` releases install fine, but the instrumentor’s `manuallyInstrument(...)` returns silently without patching, so no spans are emitted. Pin exactly rather than with `^` or `~` — both resolve to the latest matching release, which today is `0.1.29` (out of range). The `@ai-sdk/openai` package is required because BeeAI declares it as an optional peer dependency — the example uses the OpenAI provider, so it has to be installed explicitly. Track Arize-ai/openinference for an instrumentor update that supports newer `beeai-framework` releases; once published, drop the version pin.

Configure credentials
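The credentials from the steps above can be exported as environment variables (values here are placeholders):

```shell
export ARIZE_SPACE_ID="your-space-id"   # from Space Settings
export ARIZE_API_KEY="your-api-key"     # from Space Settings
export OPENAI_API_KEY="your-openai-key" # from the OpenAI Platform
```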
Setup tracing
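A sketch of `instrumentation.ts` under stated assumptions, not a definitive implementation: the OTLP endpoint (`https://otlp.arize.com`), the `space_id`/`api_key` metadata header names, the `model_id` resource attribute, and the pre-2.0 OpenTelemetry JS API (`addSpanProcessor`) are all assumptions to adapt to your package versions. The exported `provider` and `instrumentation` names match what the rest of this guide references.

```typescript
// instrumentation.ts (sketch; endpoint, header names, and OTel API are assumptions)
import { Metadata } from "@grpc/grpc-js";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc";
import { Resource } from "@opentelemetry/resources";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BeeAIInstrumentation } from "@arizeai/openinference-instrumentation-beeai";
import * as beeaiFramework from "beeai-framework";

// Arize authenticates OTLP/gRPC requests via metadata headers (assumed names).
const metadata = new Metadata();
metadata.set("space_id", process.env.ARIZE_SPACE_ID ?? "");
metadata.set("api_key", process.env.ARIZE_API_KEY ?? "");

export const provider = new NodeTracerProvider({
  resource: new Resource({
    // model_id becomes the project name in Arize AX.
    model_id: "beeai-tracing-example",
  }),
});

provider.addSpanProcessor(
  new BatchSpanProcessor(
    new OTLPTraceExporter({ url: "https://otlp.arize.com", metadata })
  )
);
provider.register();

// Patch beeai-framework BEFORE any agent is constructed; otherwise no spans are emitted.
export const instrumentation = new BeeAIInstrumentation();
instrumentation.manuallyInstrument(beeaiFramework);
```

If you are on OpenTelemetry JS SDK 2.x, pass the span processor via the `spanProcessors` constructor option instead of `addSpanProcessor`.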
Run BeeAI
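A sketch of `example.ts`; the import paths, the `ToolCallingAgent` constructor shape, `ChatModel.fromName`, and the `response.result.text` accessor follow beeai-framework 0.1.x conventions and are assumptions to check against the version you installed:

```typescript
// example.ts: the instrumentation import MUST come first so BeeAI is patched
// before any agent or backend is constructed.
import { provider } from "./instrumentation";
import { ChatModel } from "beeai-framework/backend/chat";
import { ToolCallingAgent } from "beeai-framework/agents/toolCalling/agent";
import { UnconstrainedMemory } from "beeai-framework/memory/unconstrainedMemory";

async function main() {
  // "openai:gpt-5" selects the @ai-sdk/openai provider installed earlier.
  const llm = await ChatModel.fromName("openai:gpt-5");

  const agent = new ToolCallingAgent({
    llm,
    memory: new UnconstrainedMemory(),
    tools: [], // no external tools; the agent answers via the final-answer tool
  });

  const response = await agent.run({ prompt: "What is the capital of France?" });
  console.log(response.result.text);

  // Spans export asynchronously; flush before the process exits.
  await provider.forceFlush();
}

main().catch(console.error);
```

Run it with `npx tsx example.ts` (or your preferred TypeScript runner) in the shell where the credentials are exported.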
Expected output
Verify in Arize AX
- Open your Arize AX space and select project `beeai-tracing-example`.
- You should see a new trace within ~30 seconds with this shape: a `beeai-framework-main` root span (AGENT) wraps `agent.toolCalling.start-1`/`agent.toolCalling.success-1` (AGENT), `tool.dynamic.finalAnswer.{start,finish,success}-1` (TOOL), and `backend.openai.chat.{start,finish,success}-1` (LLM, model `gpt-5`) child spans. Prompts, responses, and token usage are attached to the LLM spans.
- If no traces appear, see Troubleshooting.
Troubleshooting
- No traces in Arize AX. Confirm `ARIZE_SPACE_ID` and `ARIZE_API_KEY` are set in the same shell that runs `example.ts`. Enable OpenTelemetry debug logs with `export OTEL_LOG_LEVEL=debug` and re-run.
- BeeAI spans missing. `instrumentation.manuallyInstrument(beeaiFramework)` must run before any agent is constructed. Make sure `import { provider } from "./instrumentation"` is the first import in your entry point.
- `401` from OpenAI. Verify `OPENAI_API_KEY` is set and has access to `gpt-5`. Swap for a model your key can call.
- Example runs but no spans appear. The instrumentor only patches `beeai-framework` versions in its declared `INSTRUMENTS` range (currently `>=0.1.9 <0.1.14`). Newer or older releases install fine but `manuallyInstrument(...)` silently no-ops. Pin to a supported version (the install command above pins to `0.1.13`), or enable OpenTelemetry diag logs (`export OTEL_LOG_LEVEL=debug`) to see the `DependencyConflict` warning the instrumentor emits at startup.
- `Cannot find package '@ai-sdk/openai'` at runtime. BeeAI declares the `@ai-sdk/*` provider packages as optional peer dependencies; you install only the ones you use. The example uses OpenAI; if you swap to Anthropic, install `@ai-sdk/anthropic` instead.
- Process exits before spans flush. Spans are exported asynchronously; always `await provider.forceFlush()` (or `provider.shutdown()`) before the process exits to avoid losing trailing spans.
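The version gate described above can be illustrated with a small hand-rolled check. This is illustrative only; the real instrumentor evaluates its `INSTRUMENTS` range with a semver library, not this code:

```typescript
// Illustrative only: mirrors the ">=0.1.9 <0.1.14" gate the instrumentor applies.
function parse(version: string): number[] {
  return version.split(".").map(Number);
}

// True when a < b under numeric major/minor/patch comparison.
function lt(a: number[], b: number[]): boolean {
  for (let i = 0; i < 3; i++) {
    if (a[i] !== b[i]) return a[i] < b[i];
  }
  return false;
}

// Versions inside the range get patched; everything else silently no-ops.
function isInstrumented(version: string): boolean {
  const v = parse(version);
  return !lt(v, [0, 1, 9]) && lt(v, [0, 1, 14]);
}

console.log(isInstrumented("0.1.13")); // true: patched, spans emitted
console.log(isInstrumented("0.1.29")); // false: manuallyInstrument() no-ops
```

This is why `^0.1.13` and `~0.1.13` are dangerous here: both can resolve to `0.1.29`, which falls on the no-op side of the gate.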