Your bartender is an AI: integrating ChatGPT with a mixology app

Gareth Cronin
5 min read · Jul 23, 2023


Not a day goes by without a tech article discussing different ways to use OpenAI’s GPT implementation to bring the magic of “the stochastic parrot” into different domains. I thought I’d start my own adventure by adding LLM integration to my amateur bartender cocktail tool: Bar Tool.

Bar Tool in action

I like to make tiny SaaS apps to solve my own problems, and in the hope that it helps other people with the same problem. A couple of years ago, after spending a couple of weeks ordering everything on the cocktail menu at a resort in the Philippines, I wanted to recapture the cocktail magic at home. Like a number of my obsessions, this led to working through a list — in this case the International Bartenders Association 77 official cocktails. Since I was going to the trouble of buying ingredients and making the drinks, I figured I would post them to Instagram as well. A couple of years later I have a well-stocked bar, and have posted over 200 cocktails. I’ve written about the tech stack I use before, but now let’s add the AI magic.

It’s a bit too sweet…

I’d already had a go at manually creating cocktail recipes using ChatGPT:

ChatGPT behind the bar

But I’ve run out of cognac…

And the classic substitution problem:

Not bad.

Consuming the OpenAI API

The OpenAI API is straightforward to use. Once you’ve added a credit card for billing, it’s possible to generate API keys. My back end is Node.js, so I added the official Node SDK with a quick npm install openai.

The chat interface is via a “completion” construct. The temperature is a way of injecting some entropy so the results are a bit more exciting:

const { Configuration, OpenAIApi } = require("openai");

// ...

const apiKey = process.env.GPT_API_KEY;
const configuration = new Configuration({
  apiKey,
});
const openai = new OpenAIApi(configuration);
const chatCompletion = await openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: promptTemplate }],
  temperature: 0.6,
});
const response = chatCompletion.data.choices[0].message.content;

My back end is hosted on Google Cloud Functions, so I put the API key in a GitHub Actions secret for the action that deploys the function and set it as an environment variable like so:

name: Functions build

on:
  push:
    branches: [ main ]
    paths:
      - 'functions/**'
      - '.github/workflows/functions.yml'

jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - id: 'auth'
        uses: 'google-github-actions/auth@v1'
        with:
          credentials_json: ${{ secrets.GCP_SA_KEY }}
      - name: Deploy data
        uses: google-github-actions/deploy-cloud-functions@main
        with:
          name: data
          runtime: nodejs18
          source_dir: functions
          project_id: bar-tool
          env_vars: GPT_API_KEY=${{ secrets.GPT_API_KEY }}

Prompt engineering

My recipes are stored and passed around in a simple JSON schema. Here’s the classic vintage cocktail, “Angel Face”:

{
  "name": "Angel Face",
  "ingredients": [
    {
      "ingredient": "Gin",
      "amount": 3,
      "unit": "cl"
    },
    {
      "ingredient": "Apricot brandy",
      "amount": 3,
      "unit": "cl"
    },
    {
      "ingredient": "Calvados",
      "amount": 3,
      "unit": "cl"
    }
  ],
  "description": "Pour all ingredients into cocktail shaker filled with ice cubes. Shake and strain into a chilled cocktail glass.",
  "tags": [
    "IBA Unforgettables",
    "Vintage",
    "Shake"
  ],
  "image": "/img/standardrecipes/Angel Face.jpg"
}
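Nothing in the app enforces this shape before rendering, but a quick sanity check on AI-generated recipes is cheap to add. Here’s a minimal sketch of such a validator — `isValidRecipe` and `ALLOWED_UNITS` are my own names for illustration, not part of Bar Tool, though the unit list mirrors the one used in the prompts below:

```javascript
// Hypothetical validator for the recipe schema shown above.
const ALLOWED_UNITS = ['cl', 'ml', 'tsp', 'bsp', 'tbsp', 'drop', 'dash', 'splash', 'item'];

const isValidRecipe = (recipe) =>
  recipe !== null &&
  typeof recipe === 'object' &&
  typeof recipe.name === 'string' &&
  Array.isArray(recipe.ingredients) &&
  recipe.ingredients.every(
    (i) =>
      typeof i.ingredient === 'string' &&
      typeof i.amount === 'number' &&
      ALLOWED_UNITS.includes(i.unit)
  ) &&
  typeof recipe.description === 'string';

// Usage:
const angelFace = {
  name: 'Angel Face',
  ingredients: [{ ingredient: 'Gin', amount: 3, unit: 'cl' }],
  description: 'Shake and strain into a chilled cocktail glass.',
};
console.log(isValidRecipe(angelFace)); // true
```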

For the prompt, I’m just dumping the raw user input from my UI into templates using string interpolation. I don’t even need to define a schema for the suggestion, because GPT is well-trained enough to derive it from the provided example recipe. Here’s my variation template for ChatGPT:

const ALLOWED_TYPES = ['sweeter', 'more sour', 'drier', 'fruitier', 'boozier', 'less boozy'];

// ...

const promptTemplate = `Suggest a ${type} version of the cocktail recipe encoded in this JSON object:
${JSON.stringify(recipe)}
The allowed units are 'cl', 'ml', 'tsp', 'bsp', 'tbsp', 'drop', 'dash', 'splash', 'item'
Please provide the JSON code without any intro or preamble text.
If there is an additional explanation, just add it as a string property named "explanation" in
the same chunk of JSON.
`;
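Since raw user input flows straight into the template, the `ALLOWED_TYPES` list is the natural place to guard it. A minimal check might look like this — `validateType` is my own sketch, not code from Bar Tool:

```javascript
// Hypothetical guard: reject any variation type not on the allow-list
// before it gets interpolated into the prompt.
const ALLOWED_TYPES = ['sweeter', 'more sour', 'drier', 'fruitier', 'boozier', 'less boozy'];

const validateType = (type) => {
  if (!ALLOWED_TYPES.includes(type)) {
    throw new Error(`Unsupported variation type: ${type}`);
  }
  return type;
};

console.log(validateType('sweeter')); // "sweeter"
```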

I had a bit of trouble convincing ChatGPT to reliably return just the JSON, so I added a crude “find the JSON” method. When a response comes back, I still run it through this, because every so often the model ignores me and adds a preamble:

const findLongestJSON = (text) => {
  // find the first opening brace { then keep buffering characters
  // until the last occurrence of a closing brace }
  const start = text.indexOf('{');
  const end = text.lastIndexOf('}');
  console.log('start', start, 'end', end);
  return text.substring(start, end + 1);
};
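Even after stripping the preamble, the extracted text isn’t guaranteed to parse, so a defensive wrapper is worth having. Here’s a sketch — `parseRecipeResponse` is my own name, and it inlines the same brace-finding trick rather than assuming a particular helper:

```javascript
// Hypothetical helper: strip any preamble by locating the outermost
// braces, then parse defensively, since the model can still return
// malformed JSON.
const parseRecipeResponse = (text) => {
  const start = text.indexOf('{');
  const end = text.lastIndexOf('}');
  if (start === -1 || end === -1) return null; // no JSON object at all
  try {
    return JSON.parse(text.substring(start, end + 1));
  } catch (err) {
    console.error('GPT response was not valid JSON:', err.message);
    return null;
  }
};

// Usage: survives ChatGPT's occasional chattiness.
const reply = 'Sure! Here is your recipe: {"name": "Test Fizz", "ingredients": []}';
console.log(parseRecipeResponse(reply).name); // "Test Fizz"
```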

The template for substitution is more complicated. I found that GPT would just throw in every ingredient I said I had, unless I encouraged it to keep the alcohol content down. No one wants rocket fuel!

Note the line encouraging a bit of imagination for the name. Without it, there’s a lot of “<Original Cocktail Name> Remix” 😬.

const substitution = async (recipe, missing, present) => {
  const promptTemplate = `I would like to make this recipe as represented in the JSON format below
${JSON.stringify(recipe)}
I don't have these ingredients: ${missing}
I do have: ${present}
The allowed units are: 'cl', 'ml', 'tsp', 'bsp', 'tbsp', 'drop', 'dash', 'splash', 'item'
Keep the maximum alcohol by volume under three standard drinks
Please provide the JSON code without any intro or preamble text.
If there is an additional explanation, just add it as a string property named "explanation" in the same chunk of JSON.
Feel free to give the recipe a new name that reflects the modifications, especially if you can use a pun or other clever wordplay.
`;
  // ...
};
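The rest of the flow presumably mirrors the variation case: send the prompt, then dig the JSON out of the reply. Here’s a sketch of that wiring with a stubbed client standing in for the real `OpenAIApi` instance, so it can be exercised without an API key — `suggestSubstitution` and `fakeClient` are my own names for illustration:

```javascript
// Hypothetical sketch: send a prompt via the same createChatCompletion
// call shown earlier, then extract and parse the JSON in the reply.
const suggestSubstitution = async (client, promptTemplate) => {
  const chatCompletion = await client.createChatCompletion({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: promptTemplate }],
    temperature: 0.6,
  });
  const text = chatCompletion.data.choices[0].message.content;
  const start = text.indexOf('{');
  const end = text.lastIndexOf('}');
  return JSON.parse(text.substring(start, end + 1));
};

// Stubbed client so the flow can run offline; the response shape
// matches the v3 Node SDK (data.choices[0].message.content).
const fakeClient = {
  createChatCompletion: async () => ({
    data: {
      choices: [{ message: { content: 'Here you go: {"name": "Gin-uine Article"}' } }],
    },
  }),
};

suggestSubstitution(fakeClient, 'any prompt').then((r) => console.log(r.name)); // "Gin-uine Article"
```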

The finished product

I now have an AI bartender with variation and substitution!

Conclusion

If I wanted to provide this kind of functionality in a more traditional way, I would have burnt a lot of hours munging data into different formats, parsing and validating user input, and messing around with search algorithms. For a cheap and cheerful “it just works” feature, the OpenAI GPT API certainly fits the bill.


Gareth Cronin

Technology leader in Auckland, New Zealand: start-up founder, father of two, maker of t-shirts and small software products