Amazon Bedrock

Amazon Bedrock is AWS’s managed foundation-model service — Claude, Llama, Mistral, Titan, and others are reachable through a single boto3 client. Arize AX captures every Bedrock model call (invoke_model, converse, converse_stream) via the openinference-instrumentation-bedrock package.

Bedrock Tracing Tutorial (Google Colab)

Prerequisites

  • Python 3.10+
  • An Arize AX account (sign up)
  • An AWS account with Bedrock model access enabled for the model you want to call (the example below uses Anthropic Claude Sonnet 4.6 — request access from the Bedrock console under Model access if you haven’t already)

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install arize-otel openinference-instrumentation-bedrock boto3

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="amazon-bedrock-tracing-example"

# AWS credentials — long-lived or SSO/STS temporary.
export AWS_ACCESS_KEY_ID="<your-aws-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-aws-secret-access-key>"
export AWS_REGION="us-east-1"

# Only required when using SSO / STS / federated logins. Leave unset
# for long-lived IAM user keys.
export AWS_SESSION_TOKEN=""  # optional
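
Optionally, sanity-check the AWS side before wiring up tracing. This sketch (a hypothetical check_credentials.py, not part of the tutorial) calls STS GetCallerIdentity, which succeeds for any valid credentials regardless of Bedrock permissions, and fails fast if the Arize variables are missing.

# check_credentials.py (optional sanity check)
import os

import boto3

# STS GetCallerIdentity works with any valid AWS credentials and shows
# which IAM principal boto3 picked up from the environment.
sts = boto3.client("sts", region_name=os.environ.get("AWS_REGION", "us-east-1"))
print("Authenticated as:", sts.get_caller_identity()["Arn"])

# instrumentation.py reads these later; fail fast if they are missing.
for var in ("ARIZE_SPACE_ID", "ARIZE_API_KEY", "ARIZE_PROJECT_NAME"):
    assert os.environ.get(var), f"{var} is not set"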

Set up tracing

# instrumentation.py
import os

from arize.otel import register
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name=os.environ["ARIZE_PROJECT_NAME"],
)

BedrockInstrumentor().instrument(tracer_provider=tracer_provider)
print("Arize AX tracing initialized for Amazon Bedrock.")

Run Amazon Bedrock

# example.py

# Importing instrumentation first ensures BedrockInstrumentor patches
# boto3 before the bedrock-runtime client is created. The import is
# needed only for its side effect.
from instrumentation import tracer_provider  # noqa: F401

import os

import boto3

# boto3 reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY /
# AWS_SESSION_TOKEN (optional) / AWS_REGION from the environment.
client = boto3.client(
    "bedrock-runtime",
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
)

# Cross-region inference profile for Claude Sonnet 4.6. The `us.` prefix
# tells Bedrock to route across US regions automatically. Drop the
# prefix and use `anthropic.claude-sonnet-4-6` for region-pinned calls.
response = client.converse(
    modelId="us.anthropic.claude-sonnet-4-6",
    messages=[
        {
            "role": "user",
            "content": [
                {"text": "Why is the ocean salty? Answer in two sentences."}
            ],
        }
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.0},
)

print(response["output"]["message"]["content"][0]["text"])

Expected output

Arize AX tracing initialized for Amazon Bedrock.
The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.
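
The instrumentor also captures streaming calls. Below is a minimal converse_stream variant of the same request; it is a sketch that reuses the setup from example.py, with text deltas arriving in contentBlockDelta events per the Bedrock Converse stream format.

# streaming_example.py (reuses the setup from example.py)
from instrumentation import tracer_provider  # noqa: F401

import os

import boto3

client = boto3.client(
    "bedrock-runtime",
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
)

response = client.converse_stream(
    modelId="us.anthropic.claude-sonnet-4-6",
    messages=[
        {"role": "user", "content": [{"text": "Name three ocean currents."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.0},
)

# The stream yields events; print text deltas as they arrive.
for event in response["stream"]:
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"]["text"], end="", flush=True)
print()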

Verify in Arize AX

  1. Open your Arize AX space and select project amazon-bedrock-tracing-example.
  2. You should see a new trace within ~30 seconds containing a bedrock.converse LLM span with the prompt, response, and token usage attached. The span’s llm.model_name is the model id you called (e.g. us.anthropic.claude-sonnet-4-6).
  3. If no traces appear, see Troubleshooting; for short-lived scripts, you can also force an export before exit, as sketched below.
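
Span export is batched and asynchronous, so a very short script can exit before its spans are sent. Assuming register() returns a standard OpenTelemetry TracerProvider (which exposes force_flush), you can export explicitly before exiting:

# Add at the end of example.py.
# Block until buffered spans are exported (timeout in milliseconds).
tracer_provider.force_flush(timeout_millis=10_000)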

Troubleshooting

  • No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
  • Bedrock spans missing but other spans present. BedrockInstrumentor().instrument(...) must run before boto3.client("bedrock-runtime", ...) is called. Make sure instrumentation.py is the first import in your entry point — boto3 clients created before instrumentation aren’t patched.
  • AccessDeniedException / Could not assume role. Your IAM principal doesn’t have bedrock:InvokeModel permission, or model access isn’t enabled for the model id in the example. Enable access in the Bedrock console under Model access and confirm your IAM policy grants bedrock:InvokeModel on arn:aws:bedrock:*::foundation-model/*.
  • ValidationException: Invocation of model ID anthropic.claude-sonnet-4-6 ... isn't supported. Some Claude models on Bedrock are only available through cross-region inference profiles. Prefix the model id with your geography slug — us.anthropic.claude-sonnet-4-6 (the example uses this) or eu.anthropic.claude-sonnet-4-6.
  • ExpiredTokenException. Your AWS_SESSION_TOKEN (SSO / STS temporary credentials) has expired. Re-run the SSO login and re-export the new triple of AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_SESSION_TOKEN.
  • Meta Llama spans missing with invoke_model. The instrumentor doesn’t currently capture Llama responses via the invoke_model API; use converse (which the example above already does) for any non-Anthropic model. For comparison, an invoke_model call for an Anthropic model, which is captured, is sketched after this list.
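
For comparison, here is a minimal invoke_model sketch against the same Anthropic model, which the instrumentor does capture. The request body follows the Anthropic Messages format that Bedrock expects for Claude models; treat it as a sketch, not a replacement for converse.

# Append to example.py, reusing the same client.
import json

response = client.invoke_model(
    modelId="us.anthropic.claude-sonnet-4-6",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Why is the ocean salty?"}],
    }),
)

# invoke_model returns the provider's raw payload as a streaming body.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])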

Resources

Amazon Bedrock Documentation

OpenInference Bedrock Instrumentor

boto3 (AWS SDK for Python)

Bedrock Examples (Converse, Streaming, Tools, Agents)