A prompt-model configuration refers to a combination of prompt, model and hyperparameter settings unique to a particular version. Throughout our docs, we may use the term “config” or “prompt configuration” to refer to a prompt-model configuration.
What is the Playground?
The Playground is a UI that connects to your LLMs wherever they are hosted and lets you quickly iterate on prompts built on top of them. Here is how it calls your LLM provider:
- We ask you to configure your provider secrets (these are encrypted and stored in your browser cache).
- Based on the parameters and prompt specified in the UI, we craft an API request for your provider.
- We pass the secrets and the request to our proxy service, which pings your provider.
- If the request is successful, we stream or print the response in the UI.
- If the request is unsuccessful, we show the full error description provided by the provider.

We automatically trace cost and latency, and calculate evaluators, on all requests made through our proxy.
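For illustration, the request-crafting step above might look like the following Python sketch for an OpenAI-style chat completions provider. The endpoint, field names, and the `build_provider_request` helper are assumptions for illustration, not HoneyHive's actual proxy code:

```python
# Hypothetical sketch of the request the Playground proxy could assemble for
# an OpenAI-style chat completions endpoint. Field names follow OpenAI's
# public API shape; other providers use different schemas.

def build_provider_request(prompt: str, model: str, hyperparams: dict, api_key: str) -> dict:
    """Assemble the pieces of a chat completion HTTP call (not sent here)."""
    return {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",  # provider secret from your browser cache
            "Content-Type": "application/json",
        },
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            **hyperparams,  # e.g. temperature, max_tokens
        },
    }

request = build_provider_request(
    prompt="Summarize HoneyHive in one sentence.",
    model="gpt-4o-mini",
    hyperparams={"temperature": 0.7, "max_tokens": 128},
    api_key="sk-...",  # placeholder secret
)
```

The proxy then sends this request on your behalf and records cost and latency for the trace.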
Configure a model provider
Expected time: a few minutes

Congratulations, you are now ready to create prompts on top of your models in HoneyHive.

Create your first prompt
Expected time: a few minutes

In the following tutorial, we use AI Q&A bot as the project; you can pick any project you want to create your prompt in instead.
HoneyHive uses {{ and }} to denote a dynamic insertion field for a prompt. Dynamic variables are typically useful when inserting inputs from end users or external context from tools such as vector databases.

Version Management
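To make the dynamic-field convention above concrete, here is a minimal Python sketch of double-brace substitution. This is purely illustrative; HoneyHive performs the substitution for you when you run a prompt, and `render_template` is a hypothetical helper, not part of any SDK:

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace each {{name}} field with its value from `variables`."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

prompt = render_template(
    "Context: {{context}}\nQuestion: {{question}}",
    {"context": "Docs retrieved from a vector DB.", "question": "What is a config?"},
)
```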
Our first prompts are often simple prototypes that we end up changing frequently.
- HoneyHive automatically versions your prompts as you edit your prompt template and test new scenarios.
- A new version is only created automatically when you run a test case against your edited prompt.
While HoneyHive automatically creates new versions as you iterate, you will need to give your version a name and click Save in order to save it as a prompt-model configuration.

Iterating on a saved prompt
Our Playground supports easy forking and saving, so you can track variants you like while you keep changing the prompt.

Expected time: a few minutes

Open a prompt from a previous run
If you want to go back to a prompt you have already run, or open one from a trace that was logged externally, simply click “Open In Playground” from that run’s view.

Expected time: a few minutes

Sharing and Collaboration
To share a saved prompt, simply press the Share button on the top right of the Playground.
This will copy a link to the saved prompt that you can share with your teammates.
Using OpenAI Functions
- Navigate to Tools in the left sidebar.
- Click Add Tool and select OpenAI functions.
- Define your OpenAI function in JSON format.
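For reference, an OpenAI function definition follows OpenAI's JSON schema format. The example below builds a hypothetical `get_current_weather` function as a Python dict and serializes it to the JSON you would paste into the Add Tool dialog; the function name and parameters are illustrative:

```python
import json

# Hypothetical function definition in OpenAI's function-calling schema.
get_weather = {
    "name": "get_current_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. Paris"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

# The JSON you would paste into the Add Tool dialog:
tool_json = json.dumps(get_weather, indent=2)
```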

Integrating Pinecone and SerpAPI
- Navigate to Tools in the left sidebar.
- Click Add Tool and select External Tool.
- Choose between SerpAPI and Pinecone in the dropdowns.
- Add your API keys and other parameters specific to your Pinecone index.
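As a rough illustration, the kind of parameters you supply for a Pinecone tool could be collected as below. Every field name and value here is a hypothetical placeholder; the actual dialog defines the exact fields for your index:

```python
# Hypothetical Pinecone tool configuration; field names are illustrative only.
pinecone_tool_config = {
    "provider": "pinecone",
    "api_key": "<YOUR_PINECONE_API_KEY>",  # placeholder, never commit real keys
    "index": "docs-embeddings",            # hypothetical index name
    "top_k": 5,                            # number of matches to retrieve per query
}
```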

Using External Tools in the Playground
- You can access the Playground within the Prompts tab in the left sidebar.
- To use an external tool in your prompt template, copy the tool you’d like to use. We use /ToolName{{query_name}} as the convention to call a tool.
- Paste it into your prompt template and start using it.