Human annotation
Technical documentation for creating custom human evaluator fields in HoneyHive
Human annotation allows human reviewers to manually review and score AI outputs.
Creating a Human Evaluator
- Navigate to the Evaluators tab in the HoneyHive console.
- Click Add Evaluator and select Human Evaluator.
Evaluation Criteria
Define clear evaluation criteria for annotators, for example: "Rate whether the response accurately answers the user's question."
Configuration
Return Type
Options:
- Numeric: For ratings on a scale
- Binary: For yes/no evaluations
- Notes: For free-form text feedback
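Each return type implies a different value shape for the annotation. The sketch below is illustrative only (the ReturnType enum and validate_annotation helper are hypothetical names, not part of the HoneyHive SDK) and simply makes the distinction concrete:

```python
from enum import Enum

class ReturnType(Enum):
    NUMERIC = "numeric"   # ratings on a scale, e.g. 1-5
    BINARY = "binary"     # yes/no evaluations
    NOTES = "notes"       # free-form text feedback

def validate_annotation(return_type: ReturnType, value) -> bool:
    """Check that an annotator's value matches the evaluator's return type."""
    if return_type is ReturnType.NUMERIC:
        return isinstance(value, (int, float))
    if return_type is ReturnType.BINARY:
        return isinstance(value, bool)
    return isinstance(value, str)  # NOTES: free-form text

# A 1-5 rating is a valid numeric annotation; a string is not a valid binary one.
assert validate_annotation(ReturnType.NUMERIC, 4)
assert not validate_annotation(ReturnType.BINARY, "yes")
```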
Rating Scale
For numeric return types, specify the scale (e.g., 1-5).
Passing Range
Define the range of scores considered acceptable.
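For a numeric evaluator, the rating scale bounds and the passing range together determine whether an individual score counts as acceptable. A minimal sketch of that check, using hypothetical field names that mirror the configuration options above:

```python
from dataclasses import dataclass

@dataclass
class NumericEvaluatorConfig:
    # Hypothetical fields mirroring the UI: scale bounds and passing range.
    scale_min: int = 1
    scale_max: int = 5
    passing_min: float = 4.0   # scores in [passing_min, passing_max] are acceptable
    passing_max: float = 5.0

    def is_valid(self, score: float) -> bool:
        """Score falls within the evaluator's rating scale."""
        return self.scale_min <= score <= self.scale_max

    def is_passing(self, score: float) -> bool:
        """Score falls within the acceptable (passing) range."""
        return self.is_valid(score) and self.passing_min <= score <= self.passing_max

config = NumericEvaluatorConfig()
print(config.is_passing(4))   # True: within the 4-5 passing range
print(config.is_passing(3))   # False: valid on the 1-5 scale, but below passing
```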
Annotation in UI
You can invite domain experts to annotate traces in any experiment. Once in the experiment, experts can annotate each trace and quickly navigate across events using the ⬆️ and ⬇️ keyboard shortcuts.
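After experts have annotated an experiment's traces, you will typically want to summarize their scores. The snippet below is an illustrative aggregation over exported annotations; the record layout is an assumption for the example, not HoneyHive's export schema:

```python
from statistics import mean

# Hypothetical export: one record per annotated trace.
annotations = [
    {"trace_id": "t1", "rating": 5, "passing": True},
    {"trace_id": "t2", "rating": 3, "passing": False},
    {"trace_id": "t3", "rating": 4, "passing": True},
]

ratings = [a["rating"] for a in annotations]
pass_rate = sum(a["passing"] for a in annotations) / len(annotations)

print(f"Annotated traces: {len(annotations)}")
print(f"Mean rating: {mean(ratings):.2f}")   # 4.00
print(f"Pass rate: {pass_rate:.0%}")         # 67%
```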