Amazon Bedrock provides access to foundation models from Anthropic, Meta, Mistral, Amazon, and more. HoneyHive integrates with Bedrock via the OpenInference instrumentor, automatically capturing all model invocations and Converse API calls.

Quick Start

Add HoneyHive tracing in just a few lines of code: drop the snippet below into your existing Bedrock app, and all model invocations, Converse API calls, and streaming requests are traced automatically.
To see where to initialize the tracer for your environment, including AWS Lambda and long-running servers, see Tracer Initialization.
pip install "honeyhive[openinference-bedrock]>=1.0.0rc0"

# Or install separately
pip install "honeyhive>=1.0.0rc0" openinference-instrumentation-bedrock boto3
import os
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(
    api_key=os.getenv("HH_API_KEY"),
    project=os.getenv("HH_PROJECT"),
)
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

# Your existing Bedrock code works unchanged

What Gets Traced

The instrumentor automatically captures:
  • Model invocations - bedrock.invoke_model() with inputs and outputs
  • Converse API - bedrock.converse() with messages, tool use, and token usage
  • Streaming - bedrock.invoke_model_with_response_stream() and converse_stream()
No manual instrumentation required.
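For reference, invoke_model() takes a model-specific JSON body rather than a shared message schema. The helper below is a sketch that builds the Anthropic Messages body used by Claude models on Bedrock (the function name is illustrative; other providers expect different body formats):

```python
import json

def build_claude_body(prompt: str, max_tokens: int = 100) -> str:
    """Build an invoke_model request body for Anthropic Claude models on Bedrock."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })

# Pass the result to the client, e.g.:
# bedrock.invoke_model(modelId=model_id, body=build_claude_body("Hello!"))
body = build_claude_body("What is the capital of France?")
```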

Example: Converse API

import boto3
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
model_id = "anthropic.claude-3-5-sonnet-20241022-v2:0"

response = bedrock.converse(
    modelId=model_id,
    messages=[
        {"role": "user", "content": [{"text": "What is the capital of France?"}]}
    ],
    inferenceConfig={"maxTokens": 100},
)
print(response["output"]["message"]["content"][0]["text"])

# A follow-up call - also traced
response2 = bedrock.converse(
    modelId=model_id,
    messages=[
        {"role": "user", "content": [{"text": "Tell me a fun fact about Paris."}]}
    ],
    inferenceConfig={"maxTokens": 100},
)
print(response2["output"]["message"]["content"][0]["text"])
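Streaming responses arrive as an event stream rather than a single payload; each traced converse_stream() call yields events whose text lives in contentBlockDelta entries. A minimal sketch for joining those deltas (the fake events below only illustrate the event shape):

```python
def collect_stream_text(events) -> str:
    """Join the text fragments from a Converse API event stream."""
    parts = []
    for event in events:
        delta = event.get("contentBlockDelta", {}).get("delta", {})
        if "text" in delta:
            parts.append(delta["text"])
    return "".join(parts)

# With a live client:
# stream = bedrock.converse_stream(modelId=model_id, messages=messages)["stream"]
# text = collect_stream_text(stream)
fake_events = [
    {"messageStart": {"role": "assistant"}},
    {"contentBlockDelta": {"delta": {"text": "Paris is "}}},
    {"contentBlockDelta": {"delta": {"text": "the capital."}}},
    {"messageStop": {"stopReason": "end_turn"}},
]
print(collect_stream_text(fake_events))  # → Paris is the capital.
```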

Environment Configuration

# HoneyHive configuration
export HH_API_KEY="your-honeyhive-api-key"
export HH_PROJECT="your-project"

# AWS configuration
export AWS_ACCESS_KEY_ID="your-aws-access-key"
export AWS_SECRET_ACCESS_KEY="your-aws-secret-key"
export AWS_DEFAULT_REGION="us-east-1"

# Or use AWS profiles
export AWS_PROFILE="your-profile"
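When no region is passed explicitly, boto3 falls back to environment variables. The sketch below mimics that lookup order (illustrative only, not boto3's actual implementation):

```python
import os

def resolve_region(default: str = "us-east-1") -> str:
    """Mimic boto3's region lookup: AWS_REGION, then AWS_DEFAULT_REGION, then a default."""
    return os.getenv("AWS_REGION") or os.getenv("AWS_DEFAULT_REGION") or default

# Pass the resolved region when creating the client:
# bedrock = boto3.client("bedrock-runtime", region_name=resolve_region())
```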

Troubleshooting

Traces not appearing

  1. Check environment variables - Ensure HH_API_KEY and HH_PROJECT are set
  2. Pass the tracer provider - The instrumentor must receive tracer_provider=tracer.provider:
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.bedrock import BedrockInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")

# ✅ Correct - pass tracer_provider
BedrockInstrumentor().instrument(tracer_provider=tracer.provider)

# ❌ Wrong - missing tracer_provider
BedrockInstrumentor().instrument()
  3. Initialize before making calls - Call instrument() before creating the Bedrock client
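The environment check in step 1 can be automated with a small startup guard (a sketch; the variable names follow this page's configuration):

```python
import os

def check_honeyhive_env() -> None:
    """Fail fast if required HoneyHive variables are missing."""
    missing = [name for name in ("HH_API_KEY", "HH_PROJECT") if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")

# Call this before HoneyHiveTracer.init() to surface configuration problems early.
```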

Region configuration

Bedrock is available in specific AWS regions. Make sure your region has Bedrock enabled:
bedrock = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",  # Ensure this region has Bedrock
)

Resources

  • Anthropic Integration - Direct Anthropic API integration
  • Enrich Your Traces - Add user IDs and custom metadata to traces
  • Distributed Tracing - Trace calls across service boundaries