In this guide, you’ll learn how to make the most of the HoneyHive Playground, where you can experiment with new prompts, models, OpenAI functions and external tools.

HoneyHive allows you to define, version and manage your prompt templates and model configurations within each project. To manage your configs, navigate to the Prompts tab in the left sidebar.

A prompt-model configuration refers to a combination of prompt, model and hyperparameter settings unique to a particular version. Throughout our docs, we may use the term “config” or “prompt configuration” to refer to a prompt-model configuration.
To get started, access the Playground from the Studio tab in the left sidebar.


Creating a Prompt Template

  1. Add a version name for your prompt.
  2. Choose your application type (Chat or Completions), model provider, and hyperparameter settings. We currently offer OpenAI and Anthropic models natively in the Playground, but you can still use your own models to run evaluations and log requests programmatically via the SDK.
Only OpenAI’s GPT-3.5 Turbo and GPT-4 models support the chat completions format.
  3. Insert dynamic input variables in your prompt template using double curly brackets {{ and }}. You’ll use these variables when logging requests and running evaluations with HoneyHive.
Dynamic variables are typically useful for inserting inputs from end users or external context from tools such as vector databases.
  4. Click Save to save your changes.
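At request time, the {{variable}} placeholders in a template are filled with concrete values before the prompt is sent to a model. Here is a minimal sketch of that substitution step; the `render_template` helper, the template text, and the variable names are illustrative, not part of HoneyHive’s SDK:

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with values from `variables`."""
    def substitute(match: re.Match) -> str:
        key = match.group(1).strip()
        if key not in variables:
            raise KeyError(f"Missing value for template variable: {key}")
        return str(variables[key])
    return re.sub(r"\{\{(.*?)\}\}", substitute, template)

# Hypothetical template with two dynamic input variables
template = "You are a support agent for {{product}}. Answer: {{user_question}}"
prompt = render_template(template, {
    "product": "Acme CRM",
    "user_question": "How do I reset my password?",
})
print(prompt)
```

The same variable names you define here are the ones you pass as inputs when logging requests or running evaluations.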


Version Management

  1. HoneyHive automatically versions your prompts as you edit your prompt template and test new scenarios.
  2. A new version is created automatically only when you run a test case against your edited prompt.
While HoneyHive automatically creates new versions as you iterate, you need to give your version a name and click Save to store it as a prompt-model configuration.

Sharing and Collaboration

You can quickly share prompt templates with your colleagues via these two methods:

  1. Share via Email: Opt to share the prompt template via email for direct communication and easy reference.
  2. Share Link: Generate a shareable link that allows others to access your saved prompt template. Colleagues can quickly fork your prompt and iterate independently.

Using OpenAI Functions

  1. Navigate to Tools in the left sidebar.
  2. Click Add Tool and select OpenAI functions.
  3. Define your OpenAI function in a JSON format.


Learn more about the OpenAI function schema here.
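As an example, OpenAI function definitions use a JSON Schema-based format with a name, description, and parameters object. The `get_weather` function below is a hypothetical illustration:

```json
{
  "name": "get_weather",
  "description": "Get the current weather for a given city",
  "parameters": {
    "type": "object",
    "properties": {
      "city": {
        "type": "string",
        "description": "City name, e.g. San Francisco"
      },
      "unit": {
        "type": "string",
        "enum": ["celsius", "fahrenheit"]
      }
    },
    "required": ["city"]
  }
}
```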

Integrating Pinecone and SerpAPI

  1. Navigate to Tools in the left sidebar.
  2. Click Add Tool and select External Tool.
  3. Choose between SerpAPI and Pinecone in the dropdowns.
  4. Add your API keys and other parameters specific to your Pinecone index.


Using External Tools in the Playground

  1. Access the Playground from the Prompts tab in the left sidebar.
  2. To use an external tool in your prompt template, copy the tool call for the tool you’d like to use.
    We use /ToolName{{query_name}} as the convention to call a tool.
  3. Paste it into your prompt template and start using it.
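Putting the convention together, a prompt template that calls an external tool might look like the sketch below; the SerpAPI tool name comes from the tools above, while the template wording and the question variable are illustrative:

```
Use the search results below to answer the user's question.

Search results: /SerpAPI{{question}}

Question: {{question}}
Answer:
```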


What’s next

Now that you’ve defined some prompt configurations in the Playground, learn more about how to evaluate and monitor different prompt configurations using HoneyHive.