Anthropic provides Claude models for chat, tool use, and vision. HoneyHive integrates with Anthropic via the OpenInference instrumentor, automatically capturing all API calls, tool use, and token usage.

Quick Start

Add HoneyHive tracing with just four lines of code. Drop it into your existing Anthropic app, and all message calls, tool use, and streaming responses are traced automatically.
To see where to initialize the tracer for your environment, including AWS Lambda and long-running servers, see Tracer Initialization.
pip install "honeyhive[openinference-anthropic]>=1.0.0rc0"

# Or install separately
pip install "honeyhive>=1.0.0rc0" openinference-instrumentation-anthropic anthropic
import os
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor

tracer = HoneyHiveTracer.init(
    api_key=os.getenv("HH_API_KEY"),
    project=os.getenv("HH_PROJECT"),
)
AnthropicInstrumentor().instrument(tracer_provider=tracer.provider)

# Your existing Anthropic code works unchanged

What Gets Traced

The instrumentor automatically captures:
  • Message completions - client.messages.create() with inputs, outputs, and token usage
  • Tool use - Each tool call with arguments and results
  • Streaming responses - Streamed messages with aggregated tokens
No manual instrumentation required.
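For example, streamed messages go through the SDK's standard streaming helper unchanged; the instrumentor records the call as a single span with aggregated token usage. A minimal sketch — the `collect_stream` helper, `stream_demo` function, and prompt are illustrative, not part of either SDK:

```python
def collect_stream(text_stream):
    """Join streamed text deltas into the full completion text."""
    return "".join(text_stream)

def stream_demo():
    """Make a streamed call; traced as one span with aggregated tokens."""
    import anthropic  # deferred so collect_stream stays dependency-free

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY
    with client.messages.stream(
        model="claude-haiku-4-5-20251001",
        max_tokens=100,
        messages=[{"role": "user", "content": "Name three rivers in Europe."}],
    ) as stream:
        return collect_stream(stream.text_stream)
```

Call `stream_demo()` only after the tracer and instrumentor from the Quick Start are initialized, so the streamed call is captured.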

Example: Message Creation

import anthropic
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")
AnthropicInstrumentor().instrument(tracer_provider=tracer.provider)

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-haiku-4-5-20251001",
    max_tokens=100,
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)
print(response.content[0].text)

# A follow-up call - also traced
response2 = client.messages.create(
    model="claude-haiku-4-5-20251001",
    max_tokens=100,
    messages=[
        {"role": "user", "content": "Tell me a fun fact about Paris."},
    ],
)
print(response2.content[0].text)
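Tool use is traced the same way: each tool call's name, arguments, and result land on the span with no extra code. A hedged sketch using the standard `tools` parameter of `messages.create()` — the `lookup_capital` tool, its schema, and `tool_demo` are made up for illustration:

```python
def lookup_capital(country):
    """Toy tool implementation; a real app would query a data source."""
    capitals = {"France": "Paris", "Japan": "Tokyo"}
    return capitals.get(country, "unknown")

CAPITAL_TOOL = {
    "name": "lookup_capital",
    "description": "Look up the capital city of a country.",
    "input_schema": {
        "type": "object",
        "properties": {"country": {"type": "string"}},
        "required": ["country"],
    },
}

def tool_demo():
    """Ask a question the model should answer via the tool."""
    import anthropic

    client = anthropic.Anthropic()
    response = client.messages.create(
        model="claude-haiku-4-5-20251001",
        max_tokens=200,
        tools=[CAPITAL_TOOL],
        messages=[{"role": "user", "content": "What is the capital of Japan?"}],
    )
    # Tool-use blocks carry the model's chosen tool name and arguments;
    # the instrumentor captures both on the span automatically.
    for block in response.content:
        if block.type == "tool_use":
            print(block.name, lookup_capital(**block.input))
```

As with the plain message example, initialize the tracer and instrumentor first, then call `tool_demo()`.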

Environment Configuration

# HoneyHive configuration
export HH_API_KEY="your-honeyhive-api-key"
export HH_PROJECT="your-project"

# Anthropic configuration
export ANTHROPIC_API_KEY="your-anthropic-api-key"

Troubleshooting

Traces not appearing

  1. Check environment variables - Ensure HH_API_KEY and HH_PROJECT are set
  2. Pass the tracer provider - The instrumentor must receive tracer_provider=tracer.provider:
from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")

# ✅ Correct - pass tracer_provider
AnthropicInstrumentor().instrument(tracer_provider=tracer.provider)

# ❌ Wrong - missing tracer_provider
AnthropicInstrumentor().instrument()
  3. Initialize before making calls - Call instrument() before creating the Anthropic client
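To rule out the first cause quickly, you can check the variables before initializing the tracer. A small stdlib-only helper — `missing_honeyhive_env` is not part of the HoneyHive SDK:

```python
import os

def missing_honeyhive_env():
    """Return names of required HoneyHive variables that are unset or empty."""
    required = ("HH_API_KEY", "HH_PROJECT")
    return [name for name in required if not os.environ.get(name)]

# Warn at startup, before HoneyHiveTracer.init() is called
for name in missing_honeyhive_env():
    print(f"warning: {name} is not set; traces will not be exported")
```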

Enrich Your Traces

Add user IDs and custom metadata to traces

Custom Spans

Create spans for business logic around API calls

Distributed Tracing

Trace calls across service boundaries

Resources