The Playground lets you create and iterate on prompts without writing code. Use it to:
  • Experiment with prompt templates and model configurations
  • Test prompts against sample inputs before deploying
  • Save working versions for use in your application
[Screenshot: HoneyHive Playground interface with provider selection, chat template editor, and response output]

Prerequisites

Before using the Playground, configure your model provider API keys in Settings > AI Provider Secrets.
You can configure multiple providers (OpenAI, Anthropic, etc.) and switch between them in the Playground.

Creating a Prompt

  1. Navigate to Studio > Playground in the sidebar
  2. Select a Provider and Model in the left panel
  3. Write your prompt template in the Chat Template section
  4. Use {{variable}} syntax for dynamic inputs (e.g., {{question}})
  5. Add sample values in the Inputs panel
  6. Click Run to test the prompt
Dynamic variables like {{question}} let you insert user inputs or context from your application at runtime.
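At runtime, each {{variable}} placeholder is replaced with the value supplied for that input. A minimal sketch of that substitution logic (the render helper here is illustrative, not part of the HoneyHive SDK):

```python
import re

def render(template: str, inputs: dict) -> str:
    # Replace each {{name}} placeholder with its value from the inputs dict.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(inputs[m.group(1)]), template)

prompt = render(
    "Answer the user's question concisely.\nQuestion: {{question}}",
    {"question": "What is a prompt template?"},
)
# prompt now contains the sample question in place of {{question}}
```

The Inputs panel plays the role of the inputs dict here: the sample values you enter are substituted into the template each time you click Run.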

Saving and Forking

Prompts are saved as configurations; each configuration is a single record that you can update or fork.
  • Save (new prompt): creates a new configuration with your chosen name
  • Save (existing prompt): overwrites the existing configuration
  • Fork: creates a copy, preserving the original
To preserve a working prompt before experimenting, use Fork first. Saving an existing configuration overwrites it.
To save a prompt:
  1. Click Save in the top toolbar
  2. Enter a configuration name (e.g., v1-production)
  3. The saved configuration appears in Studio > Prompts
To create a variant without losing the original:
  1. Click Fork to create a copy
  2. Make your changes
  3. Save the forked version with a new name
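The difference between saving and forking comes down to update-in-place versus copy. A hypothetical sketch of those semantics (the data structures and names are illustrative, not the HoneyHive API):

```python
import copy

configs = {}  # configuration name -> saved record

def save(name: str, template: str, model: str) -> None:
    # Save overwrites any existing configuration with the same name.
    configs[name] = {"template": template, "model": model}

def fork(name: str, new_name: str) -> None:
    # Fork copies the record; the original is left untouched.
    configs[new_name] = copy.deepcopy(configs[name])

save("v1-production", "Answer: {{question}}", "gpt-4o")
fork("v1-production", "v2-experiment")
configs["v2-experiment"]["model"] = "claude-3-5-sonnet"
# "v1-production" still points at gpt-4o; only the fork changed.
```

This is why Fork is the safe first step before experimenting: edits to the copy never touch the configuration your application may already depend on.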

Managing Saved Prompts

View all saved prompts in Studio > Prompts:
[Screenshot: All Prompts table showing environment badges (prod, staging, dev), version names, models, and prompt templates]
From here you can:
  • Deploy a prompt to an environment (dev, staging, prod)
  • Edit a prompt by opening it in the Playground
  • Compare different versions side-by-side
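Deploying to an environment lets your application request "whatever is currently deployed to prod" instead of hard-coding a version name. A hypothetical sketch of that indirection (the registry and helper are illustrative, not the HoneyHive SDK):

```python
# Hypothetical registry: environment badge -> deployed configuration name.
deployments = {"prod": "v1-production", "staging": "v2-experiment"}

configs = {
    "v1-production": {"template": "Answer: {{question}}"},
    "v2-experiment": {"template": "Answer briefly: {{question}}"},
}

def get_deployed(env: str) -> dict:
    # Resolve the configuration currently deployed to the given environment.
    return configs[deployments[env]]
```

Promoting a new version then means updating the deployment mapping, with no code change in the application itself.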

Opening Prompts from Traces

When debugging production issues, you can open any traced LLM call in the Playground:
  1. Go to Log Store and find the trace
  2. Click on a model event
  3. Click Open in Playground in the top right
This loads the exact prompt, model, and parameters from that production call so you can iterate on improvements.

Sharing

To share a prompt with teammates:
  1. Save the prompt first
  2. Click Share in the top right
  3. Copy the link
Anyone on your team with access can view and fork the shared prompt.

Next Steps