BeeAI Framework is IBM's TypeScript and Python framework for building production-grade AI agents with tools, memory, multi-step reasoning, and pluggable LLM backends. This guide covers the TypeScript framework; Arize AX captures every BeeAI agent run via the @arizeai/openinference-instrumentation-beeai package.

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

npm install "beeai-framework@0.1.13" "@ai-sdk/openai" \
  @arizeai/openinference-instrumentation-beeai \
  @arizeai/openinference-semantic-conventions \
  @opentelemetry/exporter-trace-otlp-proto \
  @opentelemetry/instrumentation \
  @opentelemetry/resources \
  @opentelemetry/sdk-trace-base \
  @opentelemetry/sdk-trace-node \
  @opentelemetry/semantic-conventions
@arizeai/openinference-instrumentation-beeai only patches beeai-framework versions in the range >=0.1.9 <0.1.14 (the INSTRUMENTS range declared in the instrumentor source), which is why the install command above pins beeai-framework to exactly 0.1.13. Newer releases install without error, but the instrumentor's manuallyInstrument(...) returns without patching anything, so no spans are emitted. Pin the exact version rather than using ^ or ~: both resolve to the latest matching 0.1.x release, currently 0.1.29, which is outside the supported range (you can confirm the installed version at startup with the sketch below). The @ai-sdk/openai package is needed because BeeAI declares its @ai-sdk/* providers as optional peer dependencies and this example uses the OpenAI provider, so it must be installed explicitly. Track Arize-ai/openinference for an instrumentor update that supports newer beeai-framework releases; once published, drop the version pin.
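
The following is an optional startup check, not part of the instrumentor: it warns when the installed beeai-framework version falls outside the instrumented range. It assumes beeai-framework exposes "./package.json" through its exports map; if it does not, npm ls beeai-framework gives the same answer from the shell.

// verify-beeai-version.ts: warn if beeai-framework is outside >=0.1.9 <0.1.14.
// Assumes "beeai-framework/package.json" is resolvable through the package's exports map.
import { createRequire } from "node:module";

const require = createRequire(import.meta.url);
const { version } = require("beeai-framework/package.json") as { version: string };

const [major, minor, patch] = version.split(".").map(Number);
if (!(major === 0 && minor === 1 && patch >= 9 && patch <= 13)) {
  console.warn(
    `beeai-framework@${version} is outside the instrumented range; BeeAI spans will not be emitted.`,
  );
}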

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="beeai-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"
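
Optionally, fail fast when one of these variables is missing instead of sending unauthenticated export requests. A minimal sketch you could place at the top of instrumentation.ts:

// Optional guard: abort early if a required environment variable is unset.
const required = ["ARIZE_SPACE_ID", "ARIZE_API_KEY", "OPENAI_API_KEY"];
const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
  throw new Error(`Missing environment variables: ${missing.join(", ")}`);
}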

Setup tracing

// instrumentation.ts
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { resourceFromAttributes } from "@opentelemetry/resources";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { ATTR_SERVICE_NAME } from "@opentelemetry/semantic-conventions";
import {
  SEMRESATTRS_PROJECT_NAME,
} from "@arizeai/openinference-semantic-conventions";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import {
  BeeAIInstrumentation,
} from "@arizeai/openinference-instrumentation-beeai";
import * as beeaiFramework from "beeai-framework";

const projectName =
  process.env.ARIZE_PROJECT_NAME ?? "beeai-tracing-example";

export const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    [ATTR_SERVICE_NAME]: projectName,
    [SEMRESATTRS_PROJECT_NAME]: projectName,
  }),
  spanProcessors: [
    new SimpleSpanProcessor(
      new OTLPTraceExporter({
        url: "https://otlp.arize.com/v1/traces",
        headers: {
          "arize-space-id": process.env.ARIZE_SPACE_ID ?? "",
          "arize-api-key": process.env.ARIZE_API_KEY ?? "",
        },
      }),
    ),
  ],
});

provider.register();

const instrumentation = new BeeAIInstrumentation();
instrumentation.manuallyInstrument(beeaiFramework);

registerInstrumentations({ instrumentations: [instrumentation] });

console.log("Arize AX tracing initialized for BeeAI.");

Run BeeAI

// example.ts

// Importing instrumentation first ensures tracing is set up before any
// BeeAI agent is created.
import { provider } from "./instrumentation";

import {
  ToolCallingAgent,
} from "beeai-framework/agents/toolCalling/agent";
import { TokenMemory } from "beeai-framework/memory/tokenMemory";
import {
  OpenAIChatModel,
} from "beeai-framework/adapters/openai/backend/chat";

// OpenAIChatModel reads OPENAI_API_KEY from the environment.
const llm = new OpenAIChatModel("gpt-5");

const agent = new ToolCallingAgent({
  llm,
  memory: new TokenMemory(),
  tools: [],
});

const response = await agent.run({
  prompt: "Why is the ocean salty? Answer in two sentences.",
});

console.log(response.result.text);

// Flush any pending spans before the process exits.
await provider.forceFlush();
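
Run the script with any TypeScript-capable runner, for example npx tsx example.ts (assuming tsx is available); if you compile with tsc first, run the emitted JavaScript with node instead. Note that the top-level await calls require an ESM context, such as "type": "module" in package.json.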

Expected output

Arize AX tracing initialized for BeeAI.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.

Verify in Arize AX

  1. Open your Arize AX space and select project beeai-tracing-example.
  2. You should see a new trace within ~30 seconds with this shape: a beeai-framework-main root span (AGENT) wraps agent.toolCalling.start-1 / agent.toolCalling.success-1 (AGENT), tool.dynamic.finalAnswer.{start,finish,success}-1 (TOOL), and backend.openai.chat.{start,finish,success}-1 (LLM, model gpt-5) child spans. Prompts, responses, and token usage are attached to the LLM spans.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.ts. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • BeeAI spans missing. instrumentation.manuallyInstrument(beeaiFramework) must run before any agent is constructed. Make sure import { provider } from "./instrumentation" is the first import in your entry point.
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap for a model your key can call.
  • Example runs but no spans appear. The instrumentor only patches beeai-framework versions in its declared INSTRUMENTS range (currently >=0.1.9 <0.1.14). Newer or older releases install fine, but manuallyInstrument(...) silently no-ops. Pin to a supported version (the install command above pins beeai-framework to exactly 0.1.13), or enable OpenTelemetry diag logs (export OTEL_LOG_LEVEL=debug, or the diag logger sketch after this list) to see the DependencyConflict warning the instrumentor emits at startup.
  • Cannot find package '@ai-sdk/openai' at runtime. BeeAI declares the @ai-sdk/* provider packages as optional peer dependencies, so you install only the ones you use. The example uses OpenAI; if you swap to Anthropic, install @ai-sdk/anthropic instead.
  • Process exits before spans flush. Spans are exported asynchronously; always await provider.forceFlush() (or provider.shutdown()) before the process exits to avoid losing trailing spans.
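
If the OTEL_LOG_LEVEL environment variable has no effect in your setup, you can enable OpenTelemetry diagnostic logging programmatically; a minimal sketch using the diag logger from @opentelemetry/api (pulled in with the other OpenTelemetry packages), placed at the very top of instrumentation.ts:

// Surface exporter and instrumentation diagnostics (including patching warnings).
import { diag, DiagConsoleLogger, DiagLogLevel } from "@opentelemetry/api";

diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);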

Resources

BeeAI Framework Documentation

OpenInference BeeAI Instrumentor (JS/TS)

BeeAI Framework GitHub