Amazon Bedrock is AWS’s managed foundation-model service — Claude, Llama, Mistral, Titan, and others are reachable through a single boto3 client. Arize AX captures every Bedrock model call (invoke_model, converse, converse_stream) via the openinference-instrumentation-bedrock package.
Bedrock Tracing Tutorial (Google Colab)
Prerequisites
- Python 3.10+
- An Arize AX account (sign up)
- An AWS account with Bedrock model access enabled for the model you want to call (the example below uses Anthropic Claude Sonnet 4.6 — request access from the Bedrock console under Model access if you haven’t already)
Launch Arize AX
- Sign in to your Arize AX account.
- From Space Settings, copy your Space ID and API Key. You will set them as ARIZE_SPACE_ID and ARIZE_API_KEY below.
Install
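A minimal install of the packages this tutorial references; exact versions may vary, and arize-otel is assumed here as the helper that registers the Arize AX exporter:

```shell
# boto3 for Bedrock, the OpenInference Bedrock instrumentor, and the Arize OTel helper
pip install boto3 openinference-instrumentation-bedrock arize-otel
```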
Configure credentials
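A sketch of the environment the example expects, with placeholder values. The AWS variables come from your usual credential setup; AWS_SESSION_TOKEN is only needed for temporary (SSO / STS) credentials:

```shell
# Arize AX credentials (from Space Settings)
export ARIZE_SPACE_ID="your-space-id"
export ARIZE_API_KEY="your-api-key"

# AWS credentials for an IAM principal with bedrock:InvokeModel permission
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_SESSION_TOKEN="your-session-token"   # temporary credentials only
export AWS_DEFAULT_REGION="us-east-1"
```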
Set up tracing
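A minimal instrumentation.py sketch. It assumes the arize-otel register helper and the project name used later in this tutorial; the instrumentor must be applied before any bedrock-runtime client is created:

```python
# instrumentation.py — import this before creating any boto3 client.
import os

from arize.otel import register
from openinference.instrumentation.bedrock import BedrockInstrumentor

# Register an OpenTelemetry tracer provider that exports spans to Arize AX.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="amazon-bedrock-tracing-example",
)

# Patch boto3 so invoke_model / converse / converse_stream calls emit LLM spans.
BedrockInstrumentor().instrument(tracer_provider=tracer_provider)
```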
Run Amazon Bedrock
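A minimal example.py sketch calling converse through the instrumented client. The prompt text and region are illustrative; the model id is the cross-region inference profile mentioned in Troubleshooting:

```python
# example.py — the instrumentation import must come first so the
# bedrock-runtime client below is created after patching.
import instrumentation  # noqa: F401

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="us.anthropic.claude-sonnet-4-6",
    messages=[
        {"role": "user", "content": [{"text": "In one sentence, what is OpenTelemetry?"}]},
    ],
    inferenceConfig={"maxTokens": 256},
)

# Print the assistant reply and the token usage that will also appear on the span.
print(response["output"]["message"]["content"][0]["text"])
print(response["usage"])
```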
Expected output
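The model's wording varies run to run, but output along these lines is expected; the second line is the usage dict from the converse response, with counts depending on your prompt:

```
OpenTelemetry is an open-source observability framework for generating,
collecting, and exporting traces, metrics, and logs.
{'inputTokens': ..., 'outputTokens': ..., 'totalTokens': ...}
```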
Verify in Arize AX
- Open your Arize AX space and select the project amazon-bedrock-tracing-example.
- You should see a new trace within ~30 seconds containing a bedrock.converse LLM span with the prompt, response, and token usage attached. The span’s llm.model_name is the model id you called (e.g. us.anthropic.claude-sonnet-4-6).
- If no traces appear, see Troubleshooting.
Troubleshooting
- No traces in Arize AX. Confirm ARIZE_SPACE_ID and ARIZE_API_KEY are set in the same shell that runs example.py. Enable OpenTelemetry debug logs with export OTEL_LOG_LEVEL=debug and re-run.
- Bedrock spans missing but other spans present. BedrockInstrumentor().instrument(...) must run before boto3.client("bedrock-runtime", ...) is called. Make sure instrumentation.py is the first import in your entry point — boto3 clients created before instrumentation aren’t patched.
- AccessDeniedException / Could not assume role. Your IAM principal doesn’t have bedrock:InvokeModel permission, or model access isn’t enabled for the model id in the example. Enable access in the Bedrock console under Model access and confirm your IAM policy grants bedrock:InvokeModel on arn:aws:bedrock:*::foundation-model/*.
- ValidationException: Invocation of model ID anthropic.claude-sonnet-4-6 ... isn't supported. Some Claude models on Bedrock are only available through cross-region inference profiles. Prefix the model id with your geography slug — us.anthropic.claude-sonnet-4-6 (the example uses this) or eu.anthropic.claude-sonnet-4-6.
- ExpiredTokenException. Your AWS_SESSION_TOKEN (SSO / STS temporary credentials) has expired. Re-run the SSO login and re-export the new triple of AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN.
- Meta Llama spans missing with invoke_model. The instrumentor doesn’t currently capture Llama responses via the invoke_model API — use converse (which the example above already does) for any non-Anthropic model.