Strands Agents is an open-source SDK from AWS for building AI agents. It provides a simple, code-first approach to creating agents with tools, multi-agent orchestration, and streaming support.

HoneyHive integrates seamlessly with Strands, with no instrumentor needed. Strands uses OpenTelemetry natively, so initializing HoneyHive first automatically captures all agent activity.
Zero configuration required. Just initialize HoneyHive before importing Strands and all agent runs, tool calls, and model calls are automatically traced.
To see where to initialize the tracer for your environment, including AWS Lambda and long-running servers, see Tracer Initialization.
```shell
pip install "honeyhive[aws-strands]"

# Or install separately
pip install honeyhive strands-agents
```
HoneyHive’s Strands integration is tested against the following versions, as of April 2026. Newer patch releases are generally safe; pin to these versions to reproduce a known-good configuration.
| Package | Version |
| --- | --- |
| strands-agents | 1.37.0 (minimum: >= 1.19.0) |
| boto3 | 1.42.97 (minimum: >= 1.26.0, for the AWS Bedrock model backend) |
Requires Python 3.11+.
Don’t use openinference-instrumentation-strands-agents. That instrumentor (from Arize/OpenInference) isn’t supported by HoneyHive’s ingestion pipeline today, and its spans may be dropped or misrouted. Strands’ native OTel path shown above is the recommended and fully supported approach.
Strands works with AWS Bedrock via BedrockModel. You’ll need AWS credentials configured via any standard mechanism — ~/.aws/credentials, an IAM role, or environment variables:
```python
import os

from honeyhive import HoneyHiveTracer
from strands import Agent, tool
from strands.models import BedrockModel

# Initialize HoneyHive first
tracer = HoneyHiveTracer.init(
    api_key=os.getenv("HH_API_KEY"),
    project=os.getenv("HH_PROJECT"),
)

@tool
def calculator(operation: str, a: float, b: float) -> float:
    """Perform basic math operations."""
    if operation == "add":
        return a + b
    elif operation == "multiply":
        return a * b
    return 0

# Use a cross-region inference profile for on-demand throughput
model = BedrockModel(model_id="us.anthropic.claude-haiku-4-5-20251001-v1:0")
agent = Agent(model=model, tools=[calculator])

result = agent("What is 5 + 3?")
print(result)
```
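If you opt for environment variables rather than ~/.aws/credentials or an IAM role, the standard AWS SDK variables are all boto3 needs. The key and region values below are placeholders:

```shell
# Standard AWS SDK credential environment variables (values are placeholders)
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"  # region Bedrock requests are sent to
```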
Use a cross-region inference profile (the us. or global. prefix on the model ID) rather than a bare model ID. Bedrock requires the prefixed form for on-demand requests to Claude models — without it, Strands will fail with a model-access error. Example: us.anthropic.claude-haiku-4-5-20251001-v1:0.
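You can guard against the bare-model-ID mistake before constructing the model. A minimal sketch: this helper is illustrative, not part of Strands, and the eu. and apac. geo prefixes are assumptions beyond the us. and global. forms named above.

```python
# Illustrative guard: fail fast if a Bedrock model ID lacks an
# inference-profile prefix, instead of hitting a model-access error later.
INFERENCE_PROFILE_PREFIXES = ("us.", "eu.", "apac.", "global.")

def require_inference_profile(model_id: str) -> str:
    """Return model_id unchanged, or raise if it has no prefix."""
    if not model_id.startswith(INFERENCE_PROFILE_PREFIXES):
        raise ValueError(
            f"Bare model ID {model_id!r}: use a prefixed form, e.g. 'us.{model_id}'"
        )
    return model_id

# Pass the validated ID to BedrockModel(model_id=...)
print(require_inference_profile("us.anthropic.claude-haiku-4-5-20251001-v1:0"))
```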