Read on to find out how to fall back from GPT-4 Turbo (preview) to GPT-4 when using function calling (relevant as of November 2023, while GPT-4 Turbo is in preview and heavily rate limited), and how to validate and type the function calling responses you get back.
The function calling feature is very powerful (we use it all the time to turn unstructured data into structured data), as it lets us get JSON data for whatever function we want to call. But it's still possible for GPT to hallucinate bad JSON, the wrong type, or a wrong field name... I could go on (just last week the JSON coming back couldn't be parsed because it was malformed after we told GPT-4 to use emojis).
Besides those outliers, we want the response from GPT to be typed for use elsewhere in our application, rather than just having it as `any`.
So with that context, let's get into the code. We're going to use an example where we're taking in questions from users on an e-commerce site, as a way to show multiple functions we might call based on the user's question.
Real quick, install OpenAI:
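Assuming npm (yarn or pnpm work just as well); zod is included here too since the schemas later on use it:

```shell
# openai for the API client, zod for the response schemas
npm install openai zod
```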
and instantiate the OpenAI client:
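A minimal client setup; this assumes the key lives in the `OPENAI_API_KEY` environment variable:

```typescript
import OpenAI from "openai";

// Reads the key from the environment; never hard-code it.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});
```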
Now we want to set up the ability to fall back from one GPT model to another. As I mentioned before, it's currently November 2023 and GPT-4 Turbo is still in preview and rate limited. So to go to prod we need a fallback to plain old GPT-4 😔
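A minimal sketch of the idea, with the model-specific call details elided into a generic helper (the names here are illustrative, not the post's actual code):

```typescript
// Generic fallback helper: try the primary model's call first; if it throws
// (e.g. a 429 rate-limit error from the preview model), retry with the fallback.
async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>,
): Promise<T> {
  try {
    return await primary();
  } catch (primaryError) {
    console.warn("Primary model failed, falling back:", primaryError);
    return fallback();
  }
}

// Usage sketch (chatWithModel is a hypothetical wrapper around the OpenAI call):
// const completion = await withFallback(
//   () => chatWithModel("gpt-4-1106-preview", messages, functions),
//   () => chatWithModel("gpt-4", messages, functions),
// );
```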
So just a few try/catches and we've got our fallback set up. You could use this pattern to fall back from any model to any other.
You might be wondering why I passed each of the parameters individually rather than using Omit<..., "model">. I tried that, spreading the omitted object into the OpenAI call, but it broke the return type, because the SDK uses generics internally based on whether you pass functions in... so try to get that working at your own peril.
Now, time to make our type-safe call to OpenAI.
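The post's original snippet isn't reproduced here, but a dependency-free sketch of its shape follows: hand-rolled parsers stand in for the zod schemas, and the e-commerce function names and shapes are made up for illustration.

```typescript
// Illustrative e-commerce functions; names and shapes are hypothetical.
interface OrderStatusArgs { orderId: string }
interface ProductSearchArgs { query: string; maxResults: number }

// Hand-rolled parsers standing in for the zod schemas: each one validates
// the raw JSON from the model and narrows it to the typed shape, throwing
// on bad data instead of letting it flow through as `any`.
function parseOrderStatusArgs(raw: unknown): OrderStatusArgs {
  const obj = raw as Partial<OrderStatusArgs>;
  if (typeof obj?.orderId !== "string") throw new Error("bad orderId");
  return { orderId: obj.orderId };
}

function parseProductSearchArgs(raw: unknown): ProductSearchArgs {
  const obj = raw as Partial<ProductSearchArgs>;
  if (typeof obj?.query !== "string" || typeof obj?.maxResults !== "number") {
    throw new Error("bad product search args");
  }
  return { query: obj.query, maxResults: obj.maxResults };
}

// The business logic the model's function calls dispatch into.
const getOrderStatus = (args: OrderStatusArgs) => `order ${args.orderId}: shipped`;
const searchProducts = (args: ProductSearchArgs) =>
  `${args.maxResults} results for ${args.query}`;

// Map of function name -> (parse, then call). Bad data throws before any
// business logic runs, so everything downstream sees fully typed values.
const handlers: Record<string, (raw: unknown) => string> = {
  getOrderStatus: (raw) => getOrderStatus(parseOrderStatusArgs(raw)),
  searchProducts: (raw) => searchProducts(parseProductSearchArgs(raw)),
};
```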
So... a lot of code there, but let's break it down.
- At the top of the file we define both the data we want back from OpenAI for each function and a mirror zod schema for each type, to be passed in to our functions.
- We create a map of all the functions and the parsers used to sanitise the data coming back from OpenAI.
- We use the function we defined above to make the call with a fallback from GPT-4 Turbo to regular GPT-4. If you're reading this in the future you most likely won't need this, but who knows, maybe you'll swap the models out from GPT-4 Turbo and GPT-4 to GPT-5 Turbo and GPT-5.
- We use the map we defined to call the function with data that's been sanitised by the appropriate parser. We're using Promise.all here because OpenAI just announced that function calling can return multiple function calls, but adjust this to your use case.
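Since the API can now return several function calls in one response, the dispatch step can run them concurrently. A standalone sketch, with a hypothetical handlers map and a simplified shape for the returned calls:

```typescript
// Shape of a function call as returned by the chat completion (simplified);
// note that `arguments` arrives as a JSON *string*, not an object.
interface ModelFunctionCall { name: string; arguments: string }

// Hypothetical handlers map: each entry sanitises raw args, then runs the function.
const handlers: Record<string, (raw: unknown) => Promise<string>> = {
  getOrderStatus: async (raw) => `status for ${(raw as { orderId: string }).orderId}`,
};

// Run every call the model asked for, concurrently.
async function dispatchAll(calls: ModelFunctionCall[]): Promise<string[]> {
  return Promise.all(
    calls.map((call) => {
      const handler = handlers[call.name];
      if (!handler) throw new Error(`unknown function: ${call.name}`);
      // JSON.parse can throw here: this is exactly where malformed JSON
      // from the model gets caught instead of corrupting downstream state.
      return handler(JSON.parse(call.arguments));
    }),
  );
}
```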
Remember, this is just an example of how to make an OpenAI call type-safe; the whole e-comm customer support thing is incidental. This has been very useful for us, for example when we forgot to mark fields as required in the OpenAI schema and they came back as undefined, or for catching random bad responses from OpenAI.
From there you're good to integrate OpenAI into your TypeScript project and sleep at night knowing the data is at least in the right format as it flows through your program.
Thanks for reading!