AWS Bedrock AgentCore is a managed runtime for deploying agentic applications: your agent code runs as a containerized service that AWS provisions and operates. AgentCore is framework-agnostic and works with Strands Agents, CrewAI, LangGraph, LlamaIndex, Google ADK, and the OpenAI Agents SDK. Arize AX captures every invocation by configuring the AgentCore-deployed agent to ship OTLP traces to Arize AX through the same `openinference-instrumentation-strands-agents` processor used by the Strands Agents SDK guide.
This is a deployment workflow, not a local-script example. The agent file is shipped to AWS via the AgentCore starter toolkit, which provisions an ECR repository, builds and pushes the agent container, creates an IAM execution role, and launches the runtime. Local invocations target the deployed runtime over the network.
Bedrock AgentCore Runtime with Strands and Arize AX Notebook
Prerequisites
- Python 3.10+
- An Arize AX account (sign up)
- An AWS account with:
- Bedrock model access enabled for the foundation model your agent uses
- Bedrock AgentCore available in your AWS region. AgentCore is currently available in a subset of regions — check the Bedrock AgentCore docs for the current list.
- IAM permissions to create ECR repositories, execution roles, and AgentCore runtimes (the starter toolkit's `auto_create_*` options handle the provisioning, but your principal must allow it).
- Docker running locally: the starter toolkit builds the agent container image on your machine before pushing to ECR.
Launch Arize AX
- Sign in to your Arize AX account.
- From Space Settings, copy your Space ID and API Key. You will set them as `ARIZE_SPACE_ID` and `ARIZE_API_KEY` below.
Install
requirements.txt — the starter toolkit ships these into the AgentCore container at deploy time:
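A minimal `requirements.txt` for this stack might look like the following. Package names are a sketch based on the libraries this guide uses; pin versions to match your environment before deploying:

```text
strands-agents
bedrock-agentcore
bedrock-agentcore-starter-toolkit
openinference-instrumentation-strands-agents
opentelemetry-sdk
opentelemetry-exporter-otlp
```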
Configure credentials
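A sketch of the credential setup, assuming you export the Arize and AWS values as environment variables (the variable names come from this guide; the values are placeholders):

```shell
# Placeholders - substitute the Space ID and API Key from Space Settings.
export ARIZE_SPACE_ID="YOUR_SPACE_ID"
export ARIZE_API_KEY="YOUR_API_KEY"
# Pick a region where Bedrock AgentCore is available.
export AWS_DEFAULT_REGION="us-west-2"
```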
Set up tracing
Create the agent's runtime file; this is what AgentCore packages into the deployable container. The OTel + OpenInference setup looks the same as in the Strands Agents SDK guide, with one runtime-specific quirk: AgentCore registers its own tracer provider by default, so you must disable AgentCore's built-in OTel (`disable_otel=True` at configure time and `DISABLE_ADOT_OBSERVABILITY=true` in the launch env) and explicitly call `trace.set_tracer_provider(...)` to install the Arize-bound provider.
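A sketch of `strands_claude.py` under the assumptions above. The Strands and AgentCore import paths and the `Resource` attribute used for the Arize project name are assumptions to verify against your installed versions, and the OpenInference Strands span processor from the Strands Agents SDK guide is marked but elided:

```python
# strands_claude.py: sketch of the AgentCore runtime file (import paths assumed).
import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

from strands import Agent                      # assumed import path
from strands.models import BedrockModel        # assumed import path
from bedrock_agentcore.runtime import BedrockAgentCoreApp  # assumed import path

# Build the Arize-bound provider. Installing it globally only works because
# AgentCore's own OTel is disabled (disable_otel=True + DISABLE_ADOT_OBSERVABILITY).
provider = TracerProvider(
    resource=Resource.create({"model_id": "bedrock-agentcore-tracing-example"})
)
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://otlp.arize.com:443",
            headers=(
                ("space_id", os.environ["ARIZE_SPACE_ID"]),
                ("api_key", os.environ["ARIZE_API_KEY"]),
            ),
        )
    )
)
trace.set_tracer_provider(provider)
# Attach the OpenInference Strands processor here, as in the Strands Agents SDK guide.

app = BedrockAgentCoreApp()
agent = Agent(
    model=BedrockModel(
        model_id="us.anthropic.claude-sonnet-4-6",
        region_name=os.environ.get("AWS_DEFAULT_REGION", "us-west-2"),
    )
)

@app.entrypoint
def invoke(payload):
    result = agent(payload.get("prompt", "Hello"))
    # Flush spans now so the cold-start trace isn't held until the next invocation.
    provider.force_flush()
    return {"result": str(result)}

if __name__ == "__main__":
    app.run()
```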
Run Bedrock AgentCore
Configure and launch the AgentCore deployment. The toolkit auto-creates the ECR repo and the execution role on first run.

Expected output: the toolkit streams the container build, the ECR push, and runtime creation until the deployment reaches a ready state.
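The configure/launch/invoke flow described in this section might be sketched as follows. The `Runtime` import path and exact parameter names are assumptions to check against your starter-toolkit version; the `configure(...)` flags and `env_vars` come from this guide:

```python
# Deploy sketch using the AgentCore starter toolkit (API names assumed).
import os
from bedrock_agentcore_starter_toolkit import Runtime

agentcore_runtime = Runtime()

config_response = agentcore_runtime.configure(
    entrypoint="strands_claude.py",
    requirements_file="requirements.txt",
    auto_create_ecr=True,               # toolkit provisions the ECR repository
    auto_create_execution_role=True,    # toolkit provisions the IAM execution role
    disable_otel=True,                  # keep AgentCore from registering its own provider
    region=os.environ["AWS_DEFAULT_REGION"],
)

launch_result = agentcore_runtime.launch(
    env_vars={
        "ARIZE_SPACE_ID": os.environ["ARIZE_SPACE_ID"],
        "ARIZE_API_KEY": os.environ["ARIZE_API_KEY"],
        "DISABLE_ADOT_OBSERVABILITY": "true",  # disable AgentCore's built-in OTel
    },
)

# Invoke the deployed runtime over the network; the first call can be slow (cold start).
response = agentcore_runtime.invoke({"prompt": "What is Arize AX?"})
print(response)
```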
Verify in Arize AX
- Open your Arize AX space and select the project `bedrock-agentcore-tracing-example`.
- You should see a new trace within ~30–60 seconds (AgentCore cold start adds latency on the first invocation) with the same shape as the Strands tracing guide: an `invoke_agent <agentName>` root span (AGENT) wrapping an `execute_event_loop_cycle` (CHAIN) and a `chat` LLM span (model `us.anthropic.claude-sonnet-4-6`).
- If no traces appear, see Troubleshooting.
Troubleshooting
- **Overriding of current TracerProvider is not allowed.** AgentCore registers its own TracerProvider on startup, and OTel disallows overriding a registered provider. Pass `disable_otel=True` to `agentcore_runtime.configure(...)` and set `DISABLE_ADOT_OBSERVABILITY=true` in `env_vars` at `launch(...)`. With both set, your `trace.set_tracer_provider(provider)` in `strands_claude.py` succeeds and your Arize-bound provider becomes the global tracer.
- **Failed to export traces to https://otlp.arize.com:443 (StatusCode.PERMISSION_DENIED).** The OTLP env vars aren't reaching the deployed container, or the headers aren't formatted correctly. AgentCore's container-side environment expects `OTEL_EXPORTER_OTLP_HEADERS` as a comma-separated `key=value` string (no quoting, no JSON). Confirm the env vars in the `launch(...)` call match the format in the Run section above.
- **`AccessDeniedException` from Bedrock at agent invocation time.** AgentCore's auto-created execution role gets a default trust policy plus minimal permissions; you may need to attach `bedrock:InvokeModel` and `bedrock:InvokeModelWithResponseStream` to it manually. The role ARN is in the `agentcore_runtime.configure(...)` return value.
- **Cold-start spans missing.** The first `invoke` after `launch` can take 60–120 s while the container provisions. Spans for that invocation will land once the agent's BatchSpanProcessor flushes, which doesn't happen until the next invocation. To force a flush on a long-lived runtime, call `provider.force_flush()` at the end of your `@app.entrypoint` function.
- **Region mismatch.** `BedrockModel(region_name=...)` and the AgentCore Runtime region must agree, or the model call fails with `ValidationException`. Set both from `AWS_DEFAULT_REGION` or pass the same string to both.
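The comma-separated `key=value` header format that the troubleshooting notes above call out can be sketched in Python; `format_otlp_headers` is a hypothetical helper, not part of any SDK:

```python
# Build the comma-separated key=value string that OTEL_EXPORTER_OTLP_HEADERS
# expects inside the AgentCore container (no quoting, no JSON).
def format_otlp_headers(headers: dict) -> str:
    return ",".join(f"{key}={value}" for key, value in headers.items())

headers = format_otlp_headers({
    "space_id": "YOUR_SPACE_ID",   # placeholder values
    "api_key": "YOUR_API_KEY",
})
print(headers)  # space_id=YOUR_SPACE_ID,api_key=YOUR_API_KEY
```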