Whoah ok, @OpenAI has just answered tons of developer requests, let's dig into this.
You know how many folks struggled to get consistent JSON output? For agents and other use cases?
Well, OpenAI took it 1 step further and gave us function calls! How do they work? 👇
First of all, many developers (yes, including me) have struggled to get consistent JSON output from these models.
This led to hacks like try/except parsing and other workarounds; I saw folks switching to YAML and XML, among other solutions.
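For anyone who hasn't felt this pain: the old hack looked something like the sketch below, a best-effort `try/except` wrapper that attempts to fish a JSON object out of whatever text the model returned (the function name and the example reply are made up for illustration):

```python
import json
import re

def extract_json(raw: str):
    """Best-effort JSON extraction from a raw LLM reply (the old hack)."""
    try:
        # Happy path: the whole reply is valid JSON.
        return json.loads(raw)
    except json.JSONDecodeError:
        # Fallback: grab the first {...} block buried in surrounding prose.
        match = re.search(r"\{.*\}", raw, re.DOTALL)
        if match:
            try:
                return json.loads(match.group(0))
            except json.JSONDecodeError:
                return None
        return None

# Models love to wrap JSON in chatty filler:
print(extract_json('Sure! Here is the data: {"city": "Paris", "temp": 21}'))
```

Fragile, ugly, and it still fails whenever the model emits almost-JSON. That's the problem native function calls are meant to kill.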
So having native JSON responses is just awesome 💯
twitter.com/altryne/status/1632253117827010566
However, @OpenAI took this 1 step further by asking: why do you even need JSON output?
Well, to do something with this data!
So why not just... describe your function and its arguments to the API? The model will then return the right function call for you!
This way, you can run your function with the LLM's data (via an external API, a tool, whatever), get the result, and send it back to the model to summarize, extract details, and so on.
You describe your fn via the new `functions` parameter, and the new "function" role in the messages array is how you send its result back to GPT.
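Here's a rough sketch of the message shapes involved. To keep it self-contained I'm simulating the model's reply instead of making a live API call, and the `get_weather` function and its schema are made-up examples, not anything from the announcement:

```python
import json

# Hypothetical local function we want the model to be able to call.
def get_weather(city: str) -> str:
    return f"22C and sunny in {city}"  # stubbed result for the demo

# JSON-Schema description of the function, passed in the request's
# `functions` list so the model knows what it can call and with what args.
functions = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

# A model reply that chooses to call our function looks roughly like this
# (simulated here; in real code it comes back from the chat completion):
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_weather",
        "arguments": '{"city": "Paris"}',  # arguments arrive as a JSON string
    },
}

# Dispatch the call locally...
call = assistant_message["function_call"]
args = json.loads(call["arguments"])
result = get_weather(**args)

# ...then send the result back with the new "function" role, so the model
# can summarize or reason over it in a follow-up completion.
function_message = {"role": "function", "name": call["name"], "content": result}
print(function_message)
```

The nice part of the design: the model never runs anything itself, it just emits a structured call, and your code stays in the loop for every execution.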
Running to try this out. This seems like a major shift in the developer experience for these models; we're essentially getting the benefits of the plugin ecosystem (cc @OfficialLoganK ?) right in the API calls.