

AWS Bedrock AgentCore is a managed runtime for deploying agentic applications: your agent code runs as a containerized service that AWS provisions and operates. AgentCore is framework-agnostic and works with Strands Agents, CrewAI, LangGraph, LlamaIndex, Google ADK, and the OpenAI Agents SDK. Arize AX captures every invocation once the AgentCore-deployed agent is configured to ship OTLP traces to Arize AX through the same openinference-instrumentation-strands-agents processor used in the Strands Agents SDK guide.
This is a deployment workflow, not a local-script example. The agent file is shipped to AWS via the AgentCore starter toolkit, which provisions an ECR repository, builds and pushes the agent container, creates an IAM execution role, and launches the runtime. Local invocations target the deployed runtime over the network.

Bedrock AgentCore Runtime with Strands and Arize AX Notebook

Prerequisites

  • Python 3.10+
  • An Arize AX account (sign up)
  • An AWS account with:
    • Bedrock model access enabled for the foundation model your agent uses
    • Bedrock AgentCore available in your AWS region. AgentCore is currently available in a subset of regions — check the Bedrock AgentCore docs for the current list.
    • IAM permissions to create ECR repositories, execution roles, and AgentCore runtimes (the starter toolkit’s auto_create_* options handle the provisioning, but your principal must allow it).
  • Docker running locally — the starter toolkit builds the agent container image on your machine before pushing to ECR.

Launch Arize AX

  1. Sign in to your Arize AX account.
  2. From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.

Install

pip install bedrock-agentcore bedrock-agentcore-starter-toolkit \
  openinference-instrumentation-strands-agents \
  strands-agents
Add the runtime-side dependencies to your project’s requirements.txt — the starter toolkit ships these into the AgentCore container at deploy time:
bedrock-agentcore
openinference-instrumentation-strands-agents
strands-agents
opentelemetry-api
opentelemetry-sdk
opentelemetry-exporter-otlp-proto-grpc
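A missing runtime dependency only surfaces as a failure inside the deployed container, so it can be worth checking requirements.txt locally before deploying. This is a hypothetical helper, not part of the starter toolkit:

```python
import re

# Runtime-side packages the AgentCore container needs (the list above).
REQUIRED_RUNTIME_DEPS = {
    "bedrock-agentcore",
    "openinference-instrumentation-strands-agents",
    "strands-agents",
    "opentelemetry-api",
    "opentelemetry-sdk",
    "opentelemetry-exporter-otlp-proto-grpc",
}


def missing_from_requirements(text: str) -> list[str]:
    """Return required packages absent from a requirements.txt body."""
    declared = set()
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if line:
            # Strip version pins and extras: "pkg[grpc]>=1.0" -> "pkg"
            declared.add(re.split(r"[<>=!~\[]", line, maxsplit=1)[0].strip().lower())
    return sorted(REQUIRED_RUNTIME_DEPS - declared)
```

Run it over the file contents (`missing_from_requirements(open("requirements.txt").read())`) and deploy only when it returns an empty list.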

Configure credentials

export ARIZE_SPACE_ID="<your-space-id>"
export ARIZE_API_KEY="<your-api-key>"
export ARIZE_PROJECT_NAME="bedrock-agentcore-tracing-example"

# AWS credentials — long-lived or SSO/STS temporary.
export AWS_ACCESS_KEY_ID="<your-aws-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-aws-secret-access-key>"
export AWS_DEFAULT_REGION="us-west-2"   # AgentCore region; the agent code and deploy script both read this — check service availability
export AWS_SESSION_TOKEN=""     # optional — only for SSO / STS / federated
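Deploys fail late and opaquely when a credential is missing, so a quick preflight in the shell that will run the deploy script can save a round trip. A minimal, hypothetical check (`missing_vars` is not part of any SDK):

```python
import os

REQUIRED_VARS = [
    "ARIZE_SPACE_ID",
    "ARIZE_API_KEY",
    "ARIZE_PROJECT_NAME",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
]


def missing_vars(env=None):
    """Return the names of required variables that are unset or empty.

    AWS_SESSION_TOKEN is deliberately excluded: it is only needed for
    SSO / STS / federated temporary credentials.
    """
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

If `missing_vars()` returns a non-empty list, export the listed variables before running the deploy script.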

Setup tracing

Create the agent’s runtime file — this is what AgentCore packages into the deployable container. The OTel + OpenInference setup looks the same as the Strands Agents SDK guide, with one runtime-specific quirk: AgentCore registers its own tracer provider by default. You therefore have to disable AgentCore’s built-in OTel (disable_otel=True at configure time and DISABLE_ADOT_OBSERVABILITY=true in the launch env) and explicitly call trace.set_tracer_provider(...) to install the Arize-bound provider.
# strands_claude.py — the file AgentCore deploys as your agent
import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
    OTLPSpanExporter,
)

from bedrock_agentcore.runtime import BedrockAgentCoreApp
from strands import Agent
from strands.models.bedrock import BedrockModel
from openinference.instrumentation.strands_agents import (
    StrandsAgentsToOpenInferenceProcessor,
)

# Build a TracerProvider with the Arize project name baked into the
# Resource and register it globally so Strands' internal
# `trace.get_tracer(...)` finds it.
resource = Resource.create({
    "openinference.project.name": os.environ["ARIZE_PROJECT_NAME"],
    "service.name": "bedrock-agentcore-strands-agent",
})
provider = TracerProvider(resource=resource)
provider.add_span_processor(StrandsAgentsToOpenInferenceProcessor())

# The OTLP gRPC exporter reads endpoint + headers from
# OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS — both
# set at launch() time below.
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)


# The deployed app — AgentCore calls strands_agent_bedrock for every
# /invocations POST.
app = BedrockAgentCoreApp()


@app.entrypoint
def strands_agent_bedrock(payload, context=None):
    agent = Agent(
        name="OceanAssistant",
        model=BedrockModel(
            model_id="us.anthropic.claude-sonnet-4-6",
            region_name=os.environ.get("AWS_DEFAULT_REGION", "us-west-2"),
        ),
        system_prompt="Answer concisely in two sentences.",
    )
    response = agent(payload.get("prompt", ""))
    return response.message["content"][0]["text"]


if __name__ == "__main__":
    app.run()
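The entrypoint contract is simple: AgentCore POSTs the invocation's JSON body to the container and hands the parsed dict to the decorated function. A stdlib stand-in that mimics that shape without calling Bedrock (`dispatch` and `echo_agent` are hypothetical names for illustration, not AgentCore APIs):

```python
import json


def dispatch(entrypoint, raw_body: bytes):
    """Toy model of AgentCore's /invocations handling: parse the JSON
    request body and pass the resulting dict to the entrypoint."""
    payload = json.loads(raw_body or b"{}")
    return entrypoint(payload, context=None)


def echo_agent(payload, context=None):
    # Same signature and payload-access pattern as strands_agent_bedrock,
    # minus the Bedrock call.
    return f"prompt was: {payload.get('prompt', '')}"
```

This is why the entrypoint reads `payload.get("prompt", "")` rather than positional arguments: the payload is whatever JSON object the caller sent to the runtime.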

Run Bedrock AgentCore

Configure and launch the AgentCore deployment. The toolkit auto-creates the ECR repo and the execution role on first run.
# deploy.py — runs locally; ships strands_claude.py to AgentCore
import os

from bedrock_agentcore_starter_toolkit import Runtime
from boto3.session import Session

region = Session().region_name

agentcore_runtime = Runtime()

agentcore_runtime.configure(
    entrypoint="strands_claude.py",
    auto_create_execution_role=True,
    auto_create_ecr=True,
    requirements_file="requirements.txt",
    region=region,
    agent_name="strands_agentcore_arize_observability",
    memory_mode="NO_MEMORY",
    # Disable AgentCore's built-in OTel — we're wiring our own
    # TracerProvider in strands_claude.py.
    disable_otel=True,
)

# Inject the Arize OTLP endpoint + auth headers as runtime env vars.
# The deployed container reads these and routes spans to Arize.
otlp_headers = (
    f"arize-space-id={os.environ['ARIZE_SPACE_ID']},"
    f"authorization={os.environ['ARIZE_API_KEY']},"
    f"arize-interface=python"
)

agentcore_runtime.launch(
    env_vars={
        "ARIZE_PROJECT_NAME":            os.environ["ARIZE_PROJECT_NAME"],
        "OTEL_EXPORTER_OTLP_ENDPOINT":   "https://otlp.arize.com:443",
        "OTEL_EXPORTER_OTLP_HEADERS":    otlp_headers,
        "OTEL_EXPORTER_OTLP_PROTOCOL":   "grpc",
        # AWS recommends this when using a non-AWS observability
        # backend so AgentCore doesn't double-export to CloudWatch.
        "DISABLE_ADOT_OBSERVABILITY":    "true",
    },
)

# Invoke the deployed runtime. agentcore_runtime.invoke() POSTs to
# the AgentCore Runtime endpoint and returns the response.
result = agentcore_runtime.invoke(
    {"prompt": "Why is the ocean salty? Answer in two sentences."}
)
print(result)
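The gRPC exporter expects OTEL_EXPORTER_OTLP_HEADERS as a comma-separated key=value string with no quoting or JSON, and a malformed string only surfaces as a failed export inside the container. A small, hypothetical validator for the header string built above:

```python
def parse_otlp_headers(raw: str) -> dict:
    """Parse a comma-separated key=value header string into a dict,
    raising on any pair an OTLP exporter could not interpret."""
    headers = {}
    for pair in raw.split(","):
        if not pair.strip():
            continue
        key, sep, value = pair.partition("=")
        if not sep or not key.strip():
            raise ValueError(f"malformed header pair: {pair!r}")
        headers[key.strip()] = value.strip()
    return headers
```

Calling `parse_otlp_headers(otlp_headers)` before `launch(...)` catches quoting mistakes (such as wrapping a value in `"` or using `:` instead of `=`) while they are still cheap to fix.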

Expected output

The ocean is salty because rivers continuously dissolve mineral salts from rocks and soil and carry them to the sea, where they accumulate over millions of years. Water leaves the ocean through evaporation but the salts remain, steadily concentrating until reaching today's roughly 3.5% salinity.

Verify in Arize AX

  1. Open your Arize AX space and select project bedrock-agentcore-tracing-example.
  2. You should see a new trace within ~30–60 seconds (AgentCore cold-start adds latency on first invocation) with the same shape as the Strands tracing guide: an invoke_agent <agentName> root span (AGENT) wrapping an execute_event_loop_cycle (CHAIN) and a chat LLM span (model us.anthropic.claude-sonnet-4-6).
  3. If no traces appear, see Troubleshooting.

Troubleshooting

  • Overriding of current TracerProvider is not allowed. AgentCore registers its own TracerProvider on startup, and OTel disallows overriding a registered provider. Pass disable_otel=True to agentcore_runtime.configure(...) and set DISABLE_ADOT_OBSERVABILITY=true in env_vars at launch(...). With both set, your trace.set_tracer_provider(provider) in strands_claude.py succeeds and your Arize-bound provider becomes the global tracer.
  • Failed to export traces to https://otlp.arize.com:443 (StatusCode.PERMISSION_DENIED). The OTLP env vars aren’t reaching the deployed container, or the headers aren’t formatted correctly. AgentCore’s container-side environment expects OTEL_EXPORTER_OTLP_HEADERS as a comma-separated key=value string (no quoting, no JSON). Confirm the env vars in the launch(...) call match the format in the Run section above.
  • AccessDeniedException from Bedrock at agent invocation time. AgentCore’s auto-created execution role gets a default trust policy + minimal permissions; you may need to attach bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream to it manually. The role ARN is in the agentcore_runtime.configure(...) return value.
  • Cold-start spans missing. The first invoke after launch can take 60–120s while the container provisions. The BatchSpanProcessor exports on a background timer, but the runtime can be frozen between invocations, so queued spans may not land until the next request wakes the container. To guarantee delivery, call provider.force_flush() at the end of your @app.entrypoint function.
  • Region mismatch. BedrockModel(region_name=...) and the AgentCore Runtime region must agree, or the model call fails with ValidationException. Set both from AWS_DEFAULT_REGION or pass the same string to both.
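The cold-start behavior above falls out of how batch export works: finished spans sit in a queue until something flushes it. A toy model (not the OTel SDK) of why a force_flush at the end of the entrypoint guarantees delivery:

```python
class BatchBuffer:
    """Toy model of a batching span processor: finished spans queue up
    and only reach the exporter when the batch is flushed."""

    def __init__(self):
        self.pending = []   # spans finished but not yet exported
        self.exported = []  # spans the backend has received

    def on_end(self, span):
        self.pending.append(span)

    def force_flush(self):
        """Export everything queued; returns the number of spans sent."""
        self.exported.extend(self.pending)
        sent = len(self.pending)
        self.pending.clear()
        return sent
```

If the process is frozen right after `on_end`, nothing reaches the backend until the next flush — which is exactly what an explicit flush at the end of each invocation prevents.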

Resources

Bedrock AgentCore Documentation

AWS AgentCore Sample Notebooks

OpenInference Strands Instrumentor

Strands Agents SDK (local tracing guide)