BYOI Advantage: HoneyHive has zero dependencies on the Anthropic SDK. Upgrade to the latest anthropic version the day it ships—no waiting for SDK compatibility updates.
Compatibility
Python Version Support
| Support Level | Python Versions |
|---|---|
| Fully Supported | 3.11, 3.12, 3.13 |
| Not Supported | 3.10 and below |
Anthropic SDK Requirements
- Minimum: anthropic >= 0.17.0
- Recommended: anthropic >= 0.84.0
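To fail fast when an environment falls below the minimum, you can gate startup on the installed SDK version. A minimal sketch using only the standard library (the `(0, 17, 0)` floor mirrors the requirement above; the helper names are illustrative):

```python
from importlib import metadata

MIN_ANTHROPIC = (0, 17, 0)  # minimum supported anthropic SDK version

def parse_version(v: str) -> tuple:
    """Parse the leading numeric components of a version string."""
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def anthropic_sdk_supported() -> bool:
    """True if the installed anthropic package meets the minimum version."""
    try:
        installed = metadata.version("anthropic")
    except metadata.PackageNotFoundError:
        return False
    return parse_version(installed) >= MIN_ANTHROPIC
```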
Known Limitations
- Streaming: Partial support - requires manual context management
- Vision API: Supported for Claude 3 models, traced automatically
- Tool Use: Fully supported with both instrumentors
- Message Batching: Not yet supported by instrumentors, use manual tracing
Choose Your Instrumentor
HoneyHive supports two instrumentor options for Anthropic:
| Instrumentor | Best For | Install |
|---|---|---|
| OpenInference | Open-source, lightweight, getting started | pip install "honeyhive[openinference-anthropic]>=1.0.0rc0" |
| Traceloop | Production, cost tracking, enhanced metrics | pip install "honeyhive[traceloop-anthropic]>=1.0.0rc0" |
Quick Start with OpenInference
Installation
```bash
# Recommended: Install with Anthropic integration
pip install "honeyhive[openinference-anthropic]>=1.0.0rc0"

# Alternative: Manual installation (quote version specifiers so the shell
# does not treat ">" as a redirect)
pip install "honeyhive>=1.0.0rc0" openinference-instrumentation-anthropic "anthropic>=0.17.0"
```
Basic Setup
```python
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor
import anthropic

# Step 1: Initialize the HoneyHive tracer first
tracer = HoneyHiveTracer.init(
    project="your-project"  # Or set the HH_PROJECT environment variable
)  # Uses HH_API_KEY from the environment

# Step 2: Initialize the instrumentor with the tracer provider
instrumentor = AnthropicInstrumentor()
instrumentor.instrument(tracer_provider=tracer.provider)

# Now all Anthropic calls are automatically traced!
client = anthropic.Anthropic()  # Uses ANTHROPIC_API_KEY automatically
response = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.content[0].text)
# ✨ Automatically traced!
```
Order matters! The tracer must be initialized BEFORE calling instrumentor.instrument().
Quick Start with Traceloop
Installation
```bash
# Recommended: Install with Traceloop Anthropic integration
pip install "honeyhive[traceloop-anthropic]>=1.0.0rc0"

# Alternative: Manual installation (quote version specifiers so the shell
# does not treat ">" as a redirect)
pip install "honeyhive>=1.0.0rc0" opentelemetry-instrumentation-anthropic "anthropic>=0.17.0"
```
Basic Setup
```python
from honeyhive import HoneyHiveTracer
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
import anthropic

# Step 1: Initialize the HoneyHive tracer first
tracer = HoneyHiveTracer.init(
    project="your-project"
)

# Step 2: Initialize the Traceloop instrumentor with the tracer provider
instrumentor = AnthropicInstrumentor()
instrumentor.instrument(tracer_provider=tracer.provider)

# All Anthropic calls are now traced with enhanced metrics!
client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-3-sonnet-20240229",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Hello!"}]
)
```
Instrumentor Comparison
| Feature | OpenInference | Traceloop |
|---|---|---|
| Setup Complexity | Simple | Simple |
| Token Tracking | Basic span attributes | Detailed metrics + costs |
| Model Metrics | Model name, timing | Cost per model, latency |
| Performance | Lightweight, fast | Optimized with batching |
| Cost Analysis | Manual calculation | Automatic per request |
| Best For | Simple integrations, dev | Production, cost optimization |
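The "manual calculation" entry above can be as little as multiplying token counts by per-model prices. A minimal sketch (the price table and the `estimate_cost` helper are illustrative placeholders; check Anthropic's current pricing before using real numbers):

```python
# Placeholder per-million-token prices; substitute current values
# from Anthropic's pricing page before relying on these numbers.
PRICES_PER_MTOK = {
    "claude-3-sonnet-20240229": {"input": 3.00, "output": 15.00},
    "claude-3-opus-20240229": {"input": 15.00, "output": 75.00},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate request cost in USD from token usage."""
    prices = PRICES_PER_MTOK[model]
    return (
        input_tokens / 1_000_000 * prices["input"]
        + output_tokens / 1_000_000 * prices["output"]
    )
```

With OpenInference, take `input_tokens` and `output_tokens` from the traced span attributes or from `response.usage`; Traceloop records cost per request automatically.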
Advanced Usage with @trace Decorator
Combine instrumentors with the @trace decorator for explicit control:
```python
from honeyhive import HoneyHiveTracer, trace, enrich_span
from openinference.instrumentation.anthropic import AnthropicInstrumentor
import anthropic

# Initialize tracer and instrumentor
tracer = HoneyHiveTracer.init(
    api_key="your-honeyhive-key",
    project="your-project",
    source="production"
)
instrumentor = AnthropicInstrumentor()
instrumentor.instrument(tracer_provider=tracer.provider)

@trace
def analyze_document(document: str) -> dict:
    """Analyze a document with multiple Claude calls."""
    client = anthropic.Anthropic()

    # Add business context
    enrich_span({
        "use_case": "document_analysis",
        "doc_length": len(document)
    })

    # Quick summary with Sonnet
    summary = client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=500,
        messages=[{
            "role": "user",
            "content": f"Summarize this document: {document}"
        }]
    )

    # Detailed analysis with Opus
    analysis = client.messages.create(
        model="claude-3-opus-20240229",
        max_tokens=1000,
        messages=[{
            "role": "user",
            "content": f"Provide detailed analysis: {document}"
        }]
    )

    enrich_span({
        "models_used": ["claude-3-sonnet", "claude-3-opus"],
        "status": "success"
    })

    return {
        "summary": summary.content[0].text,
        "analysis": analysis.content[0].text
    }
```
Multiple Instrumentors
Use multiple instrumentors for different providers:
```python
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

# Step 1: Initialize the tracer
tracer = HoneyHiveTracer.init(project="multi-provider-app")

# Step 2: Initialize one instrumentor per provider
anthropic_instrumentor = AnthropicInstrumentor()
openai_instrumentor = OpenAIInstrumentor()
anthropic_instrumentor.instrument(tracer_provider=tracer.provider)
openai_instrumentor.instrument(tracer_provider=tracer.provider)

# Both Anthropic and OpenAI calls are now traced!
```
Environment Configuration
```bash
# HoneyHive configuration
export HH_API_KEY="your-honeyhive-api-key"
export HH_PROJECT="your-project"
export HH_SOURCE="production"

# Anthropic configuration
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```
What Gets Traced
With instrumentors initialized, these Anthropic calls are automatically traced:
- client.messages.create() - Message completions
- Streaming responses
- Tool use / function calling
- Vision API calls (Claude 3)
Captured data includes:
- Model name and parameters
- Input messages
- Output responses
- Token usage (input, output, total)
- Latency metrics
- Errors and exceptions
Example: Tool Use

```python
from honeyhive import HoneyHiveTracer, trace
from openinference.instrumentation.anthropic import AnthropicInstrumentor
import anthropic
import json

tracer = HoneyHiveTracer.init(project="tool-use-demo")
instrumentor = AnthropicInstrumentor()
instrumentor.instrument(tracer_provider=tracer.provider)

tools = [
    {
        "name": "get_weather",
        "description": "Get weather for a location",
        "input_schema": {
            "type": "object",
            "properties": {
                "location": {"type": "string"}
            },
            "required": ["location"]
        }
    }
]

@trace
def weather_assistant(query: str):
    """Assistant with weather tool use."""
    client = anthropic.Anthropic()

    response = client.messages.create(
        model="claude-3-sonnet-20240229",
        max_tokens=1024,
        tools=tools,
        messages=[{"role": "user", "content": query}]
    )

    # Handle tool use
    if response.stop_reason == "tool_use":
        tool_block = next(
            block for block in response.content
            if block.type == "tool_use"
        )

        # Simulate weather lookup
        weather_result = {"temp": "72°F", "conditions": "Sunny"}

        # Continue with the tool result (pass the tool definitions
        # again on the follow-up request)
        response = client.messages.create(
            model="claude-3-sonnet-20240229",
            max_tokens=1024,
            tools=tools,
            messages=[
                {"role": "user", "content": query},
                {"role": "assistant", "content": response.content},
                {
                    "role": "user",
                    "content": [{
                        "type": "tool_result",
                        "tool_use_id": tool_block.id,
                        "content": json.dumps(weather_result)
                    }]
                }
            ]
        )

    return response.content[0].text

# All calls traced automatically!
result = weather_assistant("What's the weather in Paris?")
```
Example: Multi-turn Conversation
```python
from honeyhive import HoneyHiveTracer, trace
from openinference.instrumentation.anthropic import AnthropicInstrumentor
import anthropic

tracer = HoneyHiveTracer.init(project="conversation-demo")
instrumentor = AnthropicInstrumentor()
instrumentor.instrument(tracer_provider=tracer.provider)

class Conversation:
    def __init__(self):
        self.client = anthropic.Anthropic()
        self.messages = []

    @trace
    def chat(self, user_message: str) -> str:
        """Add a message and get a response."""
        self.messages.append({"role": "user", "content": user_message})

        response = self.client.messages.create(
            model="claude-3-5-sonnet-20241022",
            max_tokens=1024,
            messages=self.messages
        )

        assistant_message = response.content[0].text
        self.messages.append({"role": "assistant", "content": assistant_message})
        return assistant_message

# Usage
conv = Conversation()
print(conv.chat("Hello, Claude!"))
print(conv.chat("What can you tell me about AI?"))
print(conv.chat("Give me an example."))
```
Troubleshooting
Missing Traces
Ensure correct initialization order:
```python
# ✅ Correct: tracer first, then instrumentor
tracer = HoneyHiveTracer.init(project="my-project")
instrumentor = AnthropicInstrumentor()
instrumentor.instrument(tracer_provider=tracer.provider)

# ❌ Wrong - instrumentor before tracer
instrumentor = AnthropicInstrumentor()
instrumentor.instrument()  # No tracer_provider!
tracer = HoneyHiveTracer.init(project="my-project")
```
Import Errors
If imports fail, install the extra that matches your instrumentor:

```bash
# For OpenInference
pip install "honeyhive[openinference-anthropic]>=1.0.0rc0"

# For Traceloop
pip install "honeyhive[traceloop-anthropic]>=1.0.0rc0"
```
Migration Between Instrumentors
From OpenInference to Traceloop:
```python
# Before (OpenInference)
from openinference.instrumentation.anthropic import AnthropicInstrumentor

# After (Traceloop) - just change the import
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor

# The rest of the code stays the same!
```