Since the launch of OpenAI’s ChatGPT beta in November 2022, the LLM (Large Language Model) chatbot race has heated up in Q1 of 2023. Google announced its own generative chatbot, Bard, a month ago, and Meta subsequently introduced its own LLM called LLaMA.
Then, last week, OpenAI turned the temperature up by more than a few degrees by releasing its API. Previously, the LLM was accessible only through the chatbot interface itself. Now that third-party developers can integrate ChatGPT into their apps and products, the world will soon be using it inside many of its favourite everyday tools.
Bootstrapped startups will have a lot of liberty to do weird and wonderful things with ChatGPT, because the API is very affordable: costs have fallen by 90% since December, making it ten times cheaper than previous GPT-3.5 models, at just $0.002 per 1,000 tokens.
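At that price, a back-of-the-envelope budget is easy to sketch. A minimal illustration (the words-per-token ratio is a common rule of thumb, not an exact figure, since real token counts depend on the model's tokeniser):

```python
# Rough cost estimate for the ChatGPT API at $0.002 per 1,000 tokens.
# Assumption: ~750 words per 1,000 tokens, a widely quoted rule of thumb.

PRICE_PER_1K_TOKENS = 0.002  # USD, as announced in March 2023

def estimate_cost(total_tokens: int) -> float:
    """Return the approximate USD cost of processing `total_tokens`."""
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

# A 100,000-word corpus is roughly 133,000 tokens:
tokens = int(100_000 / 750 * 1000)
print(f"~${estimate_cost(tokens):.2f}")  # → ~$0.27
```

In other words, summarising a short novel's worth of text costs pennies, which is why even bootstrapped teams can afford to experiment.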
This move has left the tech world sitting at the edge of its chair eager to see how Google will respond, as it kicks itself for letting Microsoft steal the limelight.
First off: what is an API?
An application programming interface (API) connects one program to another, allowing them to share data. You can imagine an API as a bridge between two islands that allows the islanders to run across, talk, and collaborate. Or, as a set of walkie talkies, opening up the opportunity for teamwork between two software companies.
Examples frequently encountered in the wild include Skyscanner’s API, which retrieves thousands of airline and hotel prices and availability from many suppliers, and the API which integrates Google Sign-In with various web apps.
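In code, an API boils down to a contract: one program sends a well-formed request, and the other returns structured data. A minimal sketch, assuming a hypothetical flight-search endpoint (the URL and parameter names here are invented for illustration, not Skyscanner's real API):

```python
import urllib.parse

def build_flight_search_url(origin: str, dest: str, date: str) -> str:
    """Build a request URL for a hypothetical flight-price API."""
    base = "https://api.example.com/v1/flights"
    query = urllib.parse.urlencode({"origin": origin, "dest": dest, "date": date})
    return f"{base}?{query}"

# The 'bridge': a caller only needs the documented URL format,
# not any knowledge of how the supplier computes its prices.
print(build_flight_search_url("LHR", "JFK", "2023-03-10"))
# → https://api.example.com/v1/flights?origin=LHR&dest=JFK&date=2023-03-10
```

The caller never sees the other side of the bridge; the agreed request and response formats are the whole relationship.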
How have companies been responding?
Internally, enterprises spanning all industries were already bending over backwards to:
Draft LLM usage policies for written materials;
Evaluate risks to their sector;
Host ChatGPT workshops;
Identify processes for automation;
Form teams dedicated to Natural Language Processing; and
Forge partnerships with conversational AI specialists.
Now, enterprises are scrambling to adapt to this breakthrough. If they haven’t yet built their first ChatGPT application, many have at least built a prototype. The frontrunners, mentioned by OpenAI itself in its blog post, include Snapchat, Instacart, Shopify, Quizlet, and the language-learning app Speak. The API has since been used to make Siri cleverer and to automate Python coding.
The cross-sector impact of the ChatGPT API
If ChatGPT going viral within weeks wasn’t enough publicity for OpenAI, prepare to see it everywhere now that the API is public.
With the low price comes room for innovation and unique projects. Crucially, the API won’t only be adopted by huge corporations like the frontrunner Snapchat. The business model also favours startups and SMEs, because APIs are typically priced on usage rather than on licence fees, which involve a large lump sum.
People will use the API to find efficiencies here and there, but the areas that will experience the greatest disruption (and casualties!) are copywriting, proofreading, research and report writing.
To forecast this impact, we can look back to the tools created with the GPT-3 API since 2020 in the writing sphere, in which AI writing assistants have cropped up. The copywriting tool Jasper generates blog content for you once you’ve entered a few key details about your topic; the website builder Zyro generates website text; and the ‘creative writing partner’ ChibiAI is designed to help you get over writer’s block and predict what you’ll write next by suggesting new ideas as you go.
So far, straightforward tools people have created include the Summarize extension, which summarises any webpage; an extension that summarises YouTube videos; and a Twitter bot that you can mention with a prompt, and it will respond with a tweet containing the ChatGPT response.
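Under the hood, tools like these all make essentially the same call. A minimal sketch of the request body the chat completions endpoint expects at the time of writing (the summarisation prompt and helper name are our own inventions; actually sending the request requires an API key and OpenAI's client library):

```python
def build_summarise_request(text: str) -> dict:
    """Build a chat-completion request body that asks the model for a summary."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": "You are a concise summariser."},
            {"role": "user", "content": f"Summarise the following:\n\n{text}"},
        ],
    }

request = build_summarise_request("A long webpage or video transcript...")
print(request["model"])  # → gpt-3.5-turbo
```

Swap the system prompt and you have a tweet-writer, a proofreader, or a report drafter; the surrounding product is mostly user experience.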
For specific applications like litigation or risk analysis, if a company finds a way to automate a significant chunk of the work with ChatGPT, it may become seriously difficult for other vendors to compete on price. For once, this price dip is unlikely to come with a fall in the quality of output, as only the mundane, non-strategic parts of the process will be ChatGPT-automatable.
What risks should enterprises be aware of?
It remains important to handle generative AI ethically and responsibly, with safeguards and wisdom.
Fears always surround LLMs’ potential to fall into the wrong hands and be weaponised as instruments of radicalisation and mass-generated fake news, and the API will only make this easier. This is especially pertinent when LLMs get leaked before they are ready – a download link for Meta’s LLaMA was posted to 4chan last week.
ChatGPT was trained to produce answers that sound convincing to a human with limited time to review them. Enterprises must remain vigilant and not fall for that trick when using tools built on top of it.
For well-meaning enterprises in any sector that handles sensitive information, one guiding principle has become clear: AI systems should not be put straight into production with complete autonomy, but should first be deployed as assistive tools with a human-in-the-loop.
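In practice, a human-in-the-loop deployment can be as simple as routing every generated draft through a review queue before anything reaches a customer. A minimal sketch of the pattern (the class and method names are illustrative assumptions, not a real product's API):

```python
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    """Hold AI-generated drafts until a human reviewer approves them."""
    pending: list = field(default_factory=list)
    approved: list = field(default_factory=list)

    def submit(self, draft: str) -> None:
        # Nothing is released to the end user automatically.
        self.pending.append(draft)

    def approve(self, index: int) -> str:
        # A human explicitly signs off each draft before release.
        draft = self.pending.pop(index)
        self.approved.append(draft)
        return draft

queue = ReviewQueue()
queue.submit("Dear client, here is a summary of your contract...")
released = queue.approve(0)  # human sign-off required before anything ships
```

The point is architectural: autonomy is added gradually, once the error rate in the approved queue justifies it.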
Hence, enterprises looking to deploy the API should be prepared to form a dedicated team or turn to a trusted partner like Springbok.
How will this steer the Google vs Microsoft rivalry?
If Google already had a lot on its plate, the tech giant now has an extra large serving of breakfast to work its way through.
The release of a ChatGPT API was long-anticipated, but this was slightly quicker than expected – perhaps under pressure after the announcement of Google Bard. And, in turn, OpenAI’s API will likely put pressure on Google to release its models as soon as possible. Indeed, Google told the New York Times in January that it plans to announce more than 20 AI-powered projects throughout 2023.
While we wait for Bard to storm the stage, the ChatGPT API has also made it possible to plug ChatGPT into Google.
This article is part of a series exploring ChatGPT and what this means for the chatbot industry. Others in the series include a discussion on the legal and customer experience (CX) sectors, how a human-in-the-loop can mitigate risks, and the race between Google and Microsoft to LLM-ify the search engine.
Springbok have also written the ChatGPT Best Practices Policy Handbook in response to popular client demand. Reach out or comment if you'd like a copy.
If you’re interested in anything you’ve heard about in this article, reach out at [email protected]!
Chief of Product & Solutions
Jason is a co-founder and the Chief of Product & Solutions at Springbok. His previous experience includes working as a Data Science/ Software engineer at Jaguar Land Rover. Jason has led the software engineering delivery of 4 of Springbok’s most significant chatbot projects to date.