

LangChain.js is the JavaScript/TypeScript port of LangChain — a framework for composing LLM calls, tools, and retrieval into chains and agents. Arize AX captures every chain, prompt, tool call, and LLM call by manually instrumenting the @langchain/core/callbacks/manager module via the @arizeai/openinference-instrumentation-langchain package.

Prerequisites

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

npm install @langchain/core @langchain/openai \
  @arizeai/openinference-instrumentation-langchain \
  @arizeai/openinference-semantic-conventions \
  @opentelemetry/api \
  @opentelemetry/exporter-trace-otlp-proto \
  @opentelemetry/instrumentation \
  @opentelemetry/resources \
  @opentelemetry/sdk-trace-base \
  @opentelemetry/sdk-trace-node \
  @opentelemetry/semantic-conventions

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="langchain-js-tracing-example"
export OPENAI_API_KEY="<your-openai-api-key>"

Set up tracing

// instrumentation.ts
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { resourceFromAttributes } from "@opentelemetry/resources";
import { SimpleSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { ATTR_SERVICE_NAME } from "@opentelemetry/semantic-conventions";
import {
  SEMRESATTRS_PROJECT_NAME,
} from "@arizeai/openinference-semantic-conventions";
import {
  LangChainInstrumentation,
} from "@arizeai/openinference-instrumentation-langchain";
import * as CallbackManagerModule from "@langchain/core/callbacks/manager";

const projectName =
  process.env.ARIZE_PROJECT_NAME ?? "langchain-js-tracing-example";

export const provider = new NodeTracerProvider({
  resource: resourceFromAttributes({
    [ATTR_SERVICE_NAME]: projectName,
    [SEMRESATTRS_PROJECT_NAME]: projectName,
  }),
  spanProcessors: [
    new SimpleSpanProcessor(
      new OTLPTraceExporter({
        url: "https://otlp.arize.com/v1/traces",
        headers: {
          "arize-space-id": process.env.ARIZE_SPACE_ID ?? "",
          "arize-api-key": process.env.ARIZE_API_KEY ?? "",
        },
      }),
    ),
  ],
});

provider.register();

const instrumentation = new LangChainInstrumentation();
instrumentation.manuallyInstrument(CallbackManagerModule);

console.log("Arize AX tracing initialized for LangChain.js.");

Run LangChain.js

// example.ts

// Importing instrumentation first ensures tracing is set up before any
// LangChain client is created.
import { provider } from "./instrumentation";

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

// ChatOpenAI reads OPENAI_API_KEY from the environment.
const model = new ChatOpenAI({ model: "gpt-5" });
const prompt = ChatPromptTemplate.fromTemplate(
  "Answer the question concisely.\nQuestion: {question}\nAnswer:",
);
const chain = prompt.pipe(model).pipe(new StringOutputParser());

const result = await chain.invoke({
  question: "Why is the ocean salty? Answer in two sentences.",
});

console.log(result);

// Flush any pending spans before the process exits.
await provider.forceFlush();
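To run the example, any TypeScript-aware runner works; the commands below assume tsx, which is not included in the npm install step above:

```shell
# tsx runs TypeScript with ESM and top-level await support out of the box.
npm install --save-dev tsx
npx tsx example.ts
```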

Expected output

Arize AX tracing initialized for LangChain.js.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.

Verify in Arize AX

  1. Open your Arize AX space and select project langchain-js-tracing-example.
  2. You should see a new trace within ~30 seconds containing a RunnableSequence parent span (CHAIN) wrapping ChatPromptTemplate (CHAIN), ChatOpenAI (LLM, model gpt-5), and StrOutputParser (CHAIN) child spans, with the prompt, response, and token usage attached to the LLM span.
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.ts. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • LangChain spans missing but other spans present. instrumentation.manuallyInstrument(CallbackManagerModule) must run before any code creates a LangChain client. Make sure import { provider } from "./instrumentation" (or a side-effect-only import "./instrumentation") is the first import in your entry point.
  • 401 from OpenAI. Verify OPENAI_API_KEY is set and has access to gpt-5. Swap for a model your key can call.
  • Process exits before spans flush. Spans are exported asynchronously; always await provider.forceFlush() (or provider.shutdown()) before the process exits to avoid losing trailing spans.
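The flush-before-exit advice in the last bullet can be wrapped in a small helper. `TracerLike` below is a hypothetical stand-in for the `forceFlush`/`shutdown` surface of `NodeTracerProvider`, defined here only so the sketch stays self-contained:

```typescript
// Minimal sketch: flush any buffered spans, then shut the provider down.
// TracerLike mirrors the forceFlush/shutdown methods of NodeTracerProvider;
// it is not part of any OpenTelemetry or Arize package.
interface TracerLike {
  forceFlush(): Promise<void>;
  shutdown(): Promise<void>;
}

export async function flushAndShutdown(provider: TracerLike): Promise<void> {
  // forceFlush exports spans still buffered in the span processor.
  await provider.forceFlush();
  // shutdown stops the processor so trailing spans are not silently dropped.
  await provider.shutdown();
}
```

In example.ts you could call `flushAndShutdown(provider)` in place of the bare `forceFlush()`; for long-running services, registering it on `SIGTERM` gives the same guarantee on deploy restarts.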

Version compatibility

Instrumentation >=1.0.0 supports both attribute masking and context attribute propagation. The matrix below tracks instrumentor support across LangChain core releases:
Instrumentation Version   LangChain ^0.3.0   LangChain ^0.2.0   LangChain ^0.1.0
>=1.0.0                   Yes                Yes                Yes
>=0.2.0                   No                 Yes                Yes
>=0.1.0                   No                 No                 Yes
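The matrix can also be encoded as a simple lookup. The helper below is purely illustrative (it is not part of any Arize or OpenInference package) and assumes only the version ranges shown above:

```typescript
// Illustrative only: encodes the compatibility matrix above as a lookup table.
type InstrumentationVersion = ">=1.0.0" | ">=0.2.0" | ">=0.1.0";
type LangChainRange = "^0.3.0" | "^0.2.0" | "^0.1.0";

const compatibility: Record<
  InstrumentationVersion,
  Record<LangChainRange, boolean>
> = {
  ">=1.0.0": { "^0.3.0": true, "^0.2.0": true, "^0.1.0": true },
  ">=0.2.0": { "^0.3.0": false, "^0.2.0": true, "^0.1.0": true },
  ">=0.1.0": { "^0.3.0": false, "^0.2.0": false, "^0.1.0": true },
};

export function isSupported(
  instrumentation: InstrumentationVersion,
  langchain: LangChainRange,
): boolean {
  return compatibility[instrumentation][langchain];
}
```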

Resources

LangChain.js Documentation

OpenInference LangChain Instrumentor (JS/TS)

LangChain.js GitHub