
Documentation Index

Fetch the complete documentation index at: https://docs.honeyhive.ai/llms.txt

Use this file to discover all available pages before exploring further.

AWS Bedrock provides access to foundation models from Anthropic, Meta, Mistral, Amazon, and more. HoneyHive integrates with Bedrock via the OpenInference instrumentor, automatically capturing all model invocations and Converse API calls.

Quick Start

Add HoneyHive tracing with just four lines of code. Drop it into your existing Bedrock app and all model invocations, Converse API calls, and streaming are traced automatically.
To see where to initialize the tracer for your environment, including AWS Lambda and long-running servers, see Tracer Initialization.
pip install "honeyhive[openinference-aws-bedrock]"

# Or install separately
pip install honeyhive "openinference-instrumentation-bedrock>=0.1.0" "boto3>=1.26.0"
import os
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(
    api_key=os.getenv("HH_API_KEY"),
    project=os.getenv("HH_PROJECT"),
)
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

# Your existing Bedrock code works unchanged

Tested Versions

HoneyHive’s Bedrock integration is tested against the following versions on PyPI, as of April 2026. Newer patch releases are generally safe; if you hit an issue, pin to these versions to reproduce a known-good configuration.
Package                                  Version
openinference-instrumentation-bedrock    0.1.33 (minimum: >= 0.1.0)
boto3                                    >= 1.26.0
Requires Python 3.11+.

What Gets Traced

The instrumentor automatically captures:
  • Model invocations - bedrock.invoke_model() with inputs and outputs (see the sketch after this list)
  • Converse API - bedrock.converse() with messages, tool use, and token usage
  • Streaming - bedrock.invoke_model_with_response_stream() and converse_stream()
No manual instrumentation required.
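
As a quick illustration of the first bullet, here is a minimal InvokeModel sketch. The request body shown follows Anthropic's Bedrock messages schema (anthropic_version, max_tokens, messages); other model providers expect different body formats, so adjust accordingly.

import json
import os
import boto3
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project=os.getenv("HH_PROJECT"))
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

# invoke_model takes a provider-specific JSON body; this one uses
# Anthropic's Bedrock messages schema
response = bedrock.invoke_model(
    modelId="us.anthropic.claude-haiku-4-5-20251001-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 100,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": "Hello!"}]}
        ],
    }),
)
payload = json.loads(response["body"].read())
print(payload["content"][0]["text"])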

Example: Converse API

Use cross-region inference profiles for newer Anthropic models in us-east-1 and us-west-2. Prefix the model ID with us. (e.g. us.anthropic.claude-haiku-4-5-20251001-v1:0) to enable on-demand throughput via cross-region routing. Older model IDs without the prefix may trigger a ResourceNotFoundException — see Troubleshooting below.
import os
import boto3
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project=os.getenv("HH_PROJECT"))
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

# Cross-region inference profile (recommended for on-demand throughput)
model_id = "us.anthropic.claude-haiku-4-5-20251001-v1:0"

response = bedrock.converse(
    modelId=model_id,
    messages=[
        {"role": "user", "content": [{"text": "What is the capital of France?"}]}
    ],
    inferenceConfig={"maxTokens": 100},
)
print(response["output"]["message"]["content"][0]["text"])
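
The same request can also be streamed; converse_stream() is traced automatically as well. A minimal sketch continuing from the client and model_id above (text chunks arrive in contentBlockDelta events):

# Continues from the snippet above (same client and model_id)
stream_response = bedrock.converse_stream(
    modelId=model_id,
    messages=[
        {"role": "user", "content": [{"text": "Write one sentence about Paris."}]}
    ],
    inferenceConfig={"maxTokens": 100},
)

# Iterate the event stream; text deltas arrive in contentBlockDelta events
for event in stream_response["stream"]:
    if "contentBlockDelta" in event:
        print(event["contentBlockDelta"]["delta"]["text"], end="", flush=True)
print()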

Example: Multi-Turn Conversation

import os
import boto3
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project=os.getenv("HH_PROJECT"))
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")
model_id = "us.anthropic.claude-haiku-4-5-20251001-v1:0"

response = bedrock.converse(
    modelId=model_id,
    messages=[
        {"role": "user", "content": [{"text": "My name is Alice."}]},
        {"role": "assistant", "content": [{"text": "Hello, Alice!"}]},
        {"role": "user", "content": [{"text": "What is my name?"}]},
    ],
    inferenceConfig={"maxTokens": 50},
)
print(response["output"]["message"]["content"][0]["text"])
# → "Your name is Alice."

Environment Configuration

# HoneyHive configuration
export HH_API_KEY="your-honeyhive-api-key"
export HH_PROJECT="your-project"

# AWS configuration
export AWS_ACCESS_KEY_ID="your-aws-access-key"
export AWS_SECRET_ACCESS_KEY="your-aws-secret-key"
export AWS_DEFAULT_REGION="us-west-2"

# Or use AWS profiles
export AWS_PROFILE="your-profile"
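
If you use a named profile, the same configuration can also be passed in code instead of via environment variables. A small sketch (the profile and region names are placeholders):

import boto3

# Equivalent to setting AWS_PROFILE, scoped to this session
session = boto3.Session(profile_name="your-profile")
bedrock = session.client("bedrock-runtime", region_name="us-west-2")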

Troubleshooting

Traces not appearing

  1. Check environment variables - Ensure HH_API_KEY and HH_PROJECT are set
  2. Pass the tracer provider - The instrumentor must receive tracer_provider=tracer.provider:
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")

# ✅ Correct - pass tracer_provider
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

# ❌ Wrong - missing tracer_provider
BedrockInstrumentor().instrument()
  3. Initialize before creating the client - Call instrument() before creating the boto3 Bedrock client
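
For reference, a minimal sketch of the correct ordering, with instrumentation set up before the client:

import boto3
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

# 1. Initialize tracing and instrument Bedrock first
tracer = HoneyHiveTracer.init(project="your-project")
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

# 2. Only then create the Bedrock client
bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")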

ResourceNotFoundException: Legacy model

If you see ResourceNotFoundException with the message "Access denied. This Model is marked by provider as Legacy", the provider has deprecated that model on Bedrock. Switch to a supported model using a cross-region inference profile:
# ❌ Direct model ID - may fail with legacy error
model_id = "anthropic.claude-3-haiku-20240307-v1:0"

# ✅ Cross-region inference profile - recommended
model_id = "us.anthropic.claude-haiku-4-5-20251001-v1:0"

Region configuration

Bedrock is available in specific AWS regions. Make sure your region has Bedrock enabled and the model you’re using is available there:
bedrock = boto3.client(
    "bedrock-runtime",
    region_name="us-west-2",  # Ensure this region has Bedrock + your model
)
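
To confirm which models your region actually offers, you can query the Bedrock control-plane client. A hedged sketch using boto3's list_foundation_models (the available models vary by account and region):

import boto3

# Note: the control-plane client is "bedrock", not "bedrock-runtime"
bedrock_ctl = boto3.client("bedrock", region_name="us-west-2")
for summary in bedrock_ctl.list_foundation_models()["modelSummaries"]:
    print(summary["modelId"])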

Resources

  • Anthropic Integration - Direct Anthropic API integration (without Bedrock)
  • AWS Strands Integration - Agent framework built on Bedrock
  • Enrich Your Traces - Add user IDs and custom metadata to traces
  • Distributed Tracing - Trace calls across service boundaries
Using Traceloop (OpenLLMetry) Instead

If your project already uses Traceloop / OpenLLMetry, you can use its Bedrock instrumentor instead of OpenInference. The setup is identical - only the install and import paths differ.
pip install "honeyhive[traceloop-aws-bedrock]"
from honeyhive import HoneyHiveTracer
from opentelemetry.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)
Tested version: opentelemetry-instrumentation-bedrock 0.59.0 (April 2026).
