Add HoneyHive observability to your Anthropic Claude applications
Anthropic provides Claude models for chat, tool use, and vision. HoneyHive integrates with Anthropic via the OpenInference instrumentor, automatically capturing all API calls, tool use, and token usage.
HoneyHive tracing takes just four lines of code. Add it to your existing Anthropic app and all message calls, tool use, and streaming are traced automatically.
To see where to initialize the tracer for your environment, including AWS Lambda and long-running servers, see Tracer Initialization.
```shell
pip install "honeyhive[openinference-anthropic]>=1.0.0rc0"

# Or install separately
pip install "honeyhive>=1.0.0rc0" openinference-instrumentation-anthropic anthropic
```
```python
import os

from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor

tracer = HoneyHiveTracer.init(
    api_key=os.getenv("HH_API_KEY"),
    project=os.getenv("HH_PROJECT"),
)
AnthropicInstrumentor().instrument(tracer_provider=tracer.provider)

# Your existing Anthropic code works unchanged
```
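The initialization snippet reads its configuration from environment variables. A minimal setup (the placeholder values are illustrative):

```shell
# HoneyHive credentials and project (placeholders — use your own values)
export HH_API_KEY="your-honeyhive-api-key"
export HH_PROJECT="your-project-name"

# The Anthropic SDK reads its key from this variable by default
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```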
```python
import anthropic

from honeyhive import HoneyHiveTracer
from openinference.instrumentation.anthropic import AnthropicInstrumentor

tracer = HoneyHiveTracer.init(project="your-project")
AnthropicInstrumentor().instrument(tracer_provider=tracer.provider)

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-haiku-4-5-20251001",
    max_tokens=100,
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)
print(response.content[0].text)

# A follow-up call - also traced
response2 = client.messages.create(
    model="claude-haiku-4-5-20251001",
    max_tokens=100,
    messages=[
        {"role": "user", "content": "Tell me a fun fact about Paris."},
    ],
)
print(response2.content[0].text)
```
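Tool-use calls are traced the same way, with no extra setup. A sketch of a tool definition in the shape Anthropic's Messages API expects (the `get_weather` name and schema here are hypothetical, not part of HoneyHive or Anthropic):

```python
# A hypothetical tool definition for Anthropic's Messages API.
# When passed via the `tools` parameter, the resulting tool_use
# content blocks are captured by the instrumentor automatically.
get_weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}
```

Passing `tools=[get_weather_tool]` to `client.messages.create(...)` is all that is needed; the span for that call will include the model's tool-use blocks alongside the usual token counts.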