GPT-3 Is Only the Beginning: Intro to Language Model's Capabilities

13 min read · May 31, 2021

In February 2020, Microsoft shook the IT industry with its powerful Turing-NLG. This 17-billion-parameter language model can handle natural language processing tasks, answer questions, summarize long texts, and much more. The tech community couldn't stop discussing it: How good is it? How powerful is it? Reddit boomed with discussions. No previous AI-based language model had demonstrated such capabilities.

A few months later, OpenAI stole the spotlight and amazed the tech industry yet again by launching a language model with 175 billion machine learning parameters, roughly ten times the capacity of Turing-NLG. The GPT-3 model isn't limited to summarization or simple reasoning: its potential is so great that shortly after its beta release, WIRED claimed it was "provoking chills across Silicon Valley."

The attention of all tech leaders is now directed at GPT-3. The discussions and hype have multiplied. But there’s a drop of suspicion and fear, too.

So what is GPT-3? And why have online tech channels and the mass media exploded with this topic? Let’s start by explaining GPT-3 in simple words.

What is GPT-3? The story and technologies behind a language model

Each of us had that super-smart classmate in school. The kid who already knew nearly everything you had just heard for the very first time during the lesson; the kid who could correct the teacher or even scoff at the teacher's mistakes. This local Sheldon Cooper had his or her own opinion. At the age of, say, ten, he had read so many books and had such innate intelligence that he could analyze facts and come to logical conclusions much faster and more easily than most kids at school.

Now, imagine you didn't see this kid for the entire summer holidays. During that time, he learned the entirety of Wikipedia by heart, read hundreds of thousands of books, and processed terabytes of text published on the web. Can you imagine his potential? He could perform complex mathematical operations, discuss any topic, answer any question, demonstrate excellent knowledge in any scientific field, and, most impressively, find the right answer in less than a second.

The catch is that we're not actually talking about a kid. It's the GPT-3 AI-powered language model that processed the entirety of Wikipedia, and that accounts for only about 3% of its training data. Its predecessor, GPT-2, was already quite powerful: it was introduced in February 2019, and its full 1.5-billion-parameter version was released in November 2019. About half a year later, its younger sibling entered the world. In the interim, engineers at OpenAI had been scaling the model up and refining it.

Now, GPT-3 can serve as an impeccable AI assistant, generate texts on any topic, write code in many programming languages, convert plain English to a layout or SQL query, and do so much more.

What made all that possible?

Founders and funding

GPT-3 is a product of OpenAI, an AI research laboratory founded in 2015 as a nonprofit and restructured in 2019 around the capped-profit entity OpenAI LP. Its founders include Elon Musk, who left the board in 2018 but continued to back the lab, and Sam Altman, a former president of Y Combinator. Besides GPT models, the company works on reinforcement learning products. OpenAI has raised over $1 billion, including a $1 billion investment from Microsoft in 2019.

[Image: OpenAI funding]

Machine learning technologies

Artificial intelligence and machine learning are behind all language models. Since 2010, a massive revolution in machine learning technologies has made it possible to delegate a variety of tasks to machine learning models. These models are “trained” by “feeding” them immense volumes of data. They then process, analyze, and “study” this data and complete tasks based on their acquired knowledge.

More powerful language models are based on deep learning techniques. Deep learning systems don't require constant human supervision to learn. They mimic aspects of human thinking and can recognize speech and make decisions based on previously learned knowledge. Deep learning allows a model not only to collect and process data but to understand it and generate relevant answers.

In 2018, engineers from OpenAI introduced their approach to generative language models. These models analyze input data and generate the next unit in a sequence. For example, a generative model can analyze a text file and then generate the next paragraph, or complete a sentence related to the topic of the text. OpenAI engineers pre-trained a generative model on a large variety of texts; they called this process generative pre-training, the GPT in the models' names. This is how GPT-2 and GPT-3 were born.

Training process

The typical approach pairs a pre-trained model with a task-specific dataset. For GPT-3, the OpenAI team describes four settings in which the model can be applied to a task (a minimal prompt sketch follows this list):

  • Fine-tuning

The pre-trained model is further trained on a large, task-specific labeled dataset. This is the traditional approach; GPT-3 is designed to perform well without it.

  • Few-shot

The model is given several demonstrations of how to complete a task, with no updates to its weights.

[Image: GPT-3 few-shot setting]

  • One-shot

The model is given a text description of a task and a single demonstration of its completion.

[Image: GPT-3 one-shot setting]

  • Zero-shot

The model is given only a task description in plain English, with no demonstrations.

[Image: GPT-3 zero-shot setting]
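To make the difference between these settings concrete, here is a minimal sketch of the three prompt styles for a translation task, written as Python strings; the wording mirrors the illustrative example from the GPT-3 paper and is only for demonstration.

```python
# Zero-shot: a task description only, no demonstrations.
zero_shot = (
    "Translate English to French:\n"
    "cheese =>"
)

# One-shot: the same task description plus a single demonstration.
one_shot = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "cheese =>"
)

# Few-shot: several demonstrations before the actual query.
few_shot = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "peppermint => menthe poivrée\n"
    "plush giraffe => girafe en peluche\n"
    "cheese =>"
)
```

In every case the model's weights stay frozen; only the prompt changes.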

GPT-3 is the largest language model to date. Its pre-training dataset comes from the following sources:

Data source                  Weight in training mix
Common Crawl (filtered)      60%
Web texts (WebText2)         22%
Books                        16%
Wikipedia                    3%

GPT-3 is capable of translating to and from a variety of languages, knows billions of words, and is even capable of coding! Because of all the data GPT-3 has at hand, it requires no further training to fulfil language tasks.

Would it be a challenge to differentiate text or code written by GPT-3 from text or code written by a human? You’ll soon find out.

GPT-3 vs GPT-2: What changed?

In neural networks, parameters are the weights that define the connections between artificial neurons. The pre-trained model relies on them for every prediction it makes, and in general, the more parameters a network has, the more patterns it can capture and the higher the quality of the text it generates.

The main difference between GPT-2 and GPT-3 is scale: GPT-2 has 1.5 billion parameters, while GPT-3 is a 175-billion-parameter model. All other differences are largely consequences of GPT-3 being much bigger than its predecessor. The architecture remains essentially the same. GPT-3 adopts and scales all of GPT-2's capabilities, allowing it to produce higher-quality output. Although it still has issues and limitations, GPT-3 demonstrates much better performance.

OpenAI API

OpenAI GPT-3 can be a helpful assistant for a variety of business tasks. It may be the smartest AI assistant you’ve ever had. Below, we’ll explain how exactly you can use it in your products.

But first, to try the potential of GPT-3 for yourself, you need to request access to the OpenAI API. There’s no open-source version of the product. But with access to the API, you can integrate powerful AI features into an existing product or a new app.
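As an illustration, here is a minimal sketch of what a request to the API looked like with the openai Python package around the time of writing; the engine name, prompt, and sampling parameters are example values, and you need your own API key.

```python
import os
import openai  # pip install openai

# Authenticate with your own key, assumed here to live in an environment variable.
openai.api_key = os.environ["OPENAI_API_KEY"]

# Ask the davinci engine to complete a prompt.
response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine exposed through the API
    prompt="Explain GPT-3 in one simple sentence:",
    max_tokens=60,      # cap the length of the generated completion
    temperature=0.7,    # higher values make the output more varied
)

print(response.choices[0].text.strip())
```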

“…the API has pushed us to sharpen our focus on general-purpose AI technology—advancing the technology, making it usable, and considering its impacts in the real world. We hope that the API will greatly lower the barrier to producing beneficial AI-powered products, resulting in tools and services that are hard to imagine today.”

OpenAI

OpenAI gives three reasons why their API is not open-source:

  • Monetization — Selling the language model helps the company pay for research.
  • Expensive deployment — At this stage, only big enterprises can afford the model and really benefit from it.
  • Misuse — Power means responsibility, and in the case of an open-source project, it may be hard to track how people use this AI technology.

In September 2020, Microsoft acquired an exclusive license to the model behind GPT-3. You can still access GPT-3 through the OpenAI API and use it in your projects, but only Microsoft has access to the underlying source code.

Let’s see what this means for real-world projects.

GPT-3 applications, or how GPT-3 serves real products

Machine learning, artificial intelligence, and neural networks make no sense until you can clearly see their impact. Without real applications, GPT-3 is just a strange abbreviation that tech people are keen on. In this section, you’ll find out what you can actually do with the biggest language model so far.

OpenAI says there are tens of thousands of potential uses of GPT-3. We’ve picked the most impressive ideas. Check them out.

[Image: GPT-3 for your product]

GPT-3 as a creative storyteller

The Guardian posted an entire article written by GPT-3. In it, the AI states that it’s not here to harm humanity.

“...I have no desire to wipe out humans. In fact, I do not have the slightest interest in harming you in any way. Eradicating humanity seems like a rather useless endeavor to me. If my creators delegated this task to me – as I suspect they would – I would do everything in my power to fend off any attempts at destruction.”

It doesn’t sound like a machine-written text, does it?

There are many more examples of how GPT-3 can assist with storytelling. There's even a website where you can use GPT-3 to talk to a celebrity or a historical figure. The AI analyzes the person's biography, interviews, and other information available on the web and generates responses or stories that mimic their tone.

Practical applications

  • Entertainment apps

Following the example of the first popular GPT-3-based apps, you could build an entertainment app that lets people talk to their favorite writers, local celebrities, or fictional characters.

  • Educational apps

Now that most students are studying online, it may be useful to implement GPT-3 in educational apps. The AI can not only summarize entire lectures but can explain topics, discoveries, or experiments using the words of scientists who worked on them.

  • Personal assistants

GPT-3 may become Siri on steroids — an AI assistant that can turn your text or voice commands into actions, queries, decisions, blog posts, code, and so much more.

GPT-3 as the omniscient AI

Talking to artificial intelligence became real with GPT-3. Trained with a huge amount of data, GPT-3 is able to provide relevant answers to any questions, whether they’re about the NBA Finals or the existence of God. It’s marvellous how artificial intelligence can analyze the creation of Earth and cogitate on the role of a creator. Check out this article on Medium that reproduces an entire conversation with the GPT-3 AI.

Practical applications

  • Chatbots

You can use GPT-3 to upgrade the chatbot on your business website or social media page. The smart assistant can answer most of your customers' questions on its own and pass difficult queries or tasks to a human operator (see the prompt sketch after this list).

  • Customer assistance and support

Enterprises hire thousands of tech support specialists annually. With high staff turnover in this field, a customer support AI can save a company countless hours of recruitment and millions of dollars in salaries.

  • Search engines

Powered by GPT-3, search engines could provide much more value. Instead of returning a results page with dozens of links, a search engine could answer a question directly. OpenAI shares one possible use case: a GPT-3 plugin integrated into a browser could not only perform keyword search but also answer users' questions correctly.
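As a rough illustration of the chatbot and question-answering ideas above, here is a minimal sketch of a support-style prompt sent to the API; the example questions, answers, and parameters are assumptions made up for demonstration, not an official recipe.

```python
import os
import openai  # pip install openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# A simple support prompt: a short instruction, a few example Q&A pairs,
# and the customer's real question at the end. The pairs below are invented.
prompt = (
    "Answer the customer's question briefly. If unsure, say "
    "'Let me connect you to an operator.'\n\n"
    "Q: What are your support hours?\n"
    "A: Our support team is available 24/7.\n\n"
    "Q: How do I reset my password?\n"
    "A: Click 'Forgot password' on the login page and follow the emailed link.\n\n"
    "Q: Can I get a refund for last month's invoice?\n"
    "A:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=40,
    temperature=0.2,   # low temperature keeps answers consistent
    stop=["\n\n"],     # stop before the model invents another Q&A pair
)

print(response.choices[0].text.strip())
```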

GPT-3 as a multilingual tool

It goes without saying that GPT-3 can translate words from many languages into English and vice versa. What's more notable is how the AI translates plain English to… SQL. Or JSX code. Or CSS. Given a short plain-English description, GPT-3 can generate an entire website interface with minimal human supervision. Twitter is booming with impressive examples of how tech specialists have put GPT-3 to work.

Practical applications

  • Translation and education

GPT-3 is exceptionally good at translations. For students studying foreign languages, it may be of great help. And if you just want to run a multilingual website, GPT-3 can translate content for you.

  • Coding and developing SQL queries

We expect that in the future, there will be multiple software engineering tools built on top of GPT-3. Simple queries, CSS and Python code, and so much more can be written by a machine.

  • Layout generation and UI design

Just input text and get a ready-made design. Isn’t that a great way to save time and money on design tasks?

Developers are also taking steps to use GPT-3 in server-side development. For example, it may be used to develop an NLP app in Python. And that’s just the beginning.
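To make the English-to-SQL idea concrete, here is a minimal sketch of such a helper built on top of the API; the table schema, prompt format, and engine name are illustrative assumptions rather than an official recipe.

```python
import os
import openai  # pip install openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def english_to_sql(request: str) -> str:
    """Turn a plain-English request into a SQL query for a hypothetical 'orders' table."""
    prompt = (
        "Translate the request into a SQL query for the table "
        "orders(id, customer_name, total, created_at).\n\n"
        "Request: show the ten most recent orders\n"
        "SQL: SELECT * FROM orders ORDER BY created_at DESC LIMIT 10;\n\n"
        f"Request: {request}\n"
        "SQL:"
    )
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=80,
        temperature=0,   # deterministic output suits code generation
        stop=[";"],      # stop once a complete statement has been produced
    )
    return response.choices[0].text.strip() + ";"

print(english_to_sql("total revenue per customer, highest first"))
```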

GPT-3 as a digital musician

Enter a few chords to get started and GPT-3 can continue the composition. Having processed the chords, tablature, and lyrics of countless songs published on the web, the AI model can write music of its own.

Practical applications

  • Software products for songwriters and musicians

The opportunities of GPT-3 are virtually unlimited. Depending on real market needs, GPT-3 may become the basis of any handy and beneficial tool.

Drawbacks and criticism of GPT-3

The GPT-3 language model seems to be more than just a perfect assistant: it can generate an essay in a few seconds, answer almost any question, compose music, create designs, and build software components. Simply input a piece of text and get anything you need in response: an answer, a SQL query, or the 8th Harry Potter book.

However, when looking for ways to implement GPT-3 in your business, it's easy to overlook some of its limitations, weaknesses, and even dangers.

[Image: Dangers of GPT-3]

  • Potential dangers

While the majority of the tech community is inspired by and willing to try this great product from OpenAI, there are numerous IT entrepreneurs and analysts who are suspicious about the impact of GPT-3 on industries and people’s lives. New York Times opinion columnist Farhad Manjoo calls GPT-3 “more than a little terrifying.” When humans create something so powerful and intelligent, it’s hard to predict whether its impact on our lives will be entirely positive.

  • Errors and imperfections

On learnfromanyone.com, you can use GPT-3 to start a conversation with Steve Jobs (or anyone else). As users share on Twitter, the AI claims that Jobs' current location is Apple's headquarters in California. Although it may be true that his soul prefers staying at the office to resting in peace, we can't consider GPT-3 a reliable source of information.

GPT-3 also has some trouble with logic. For example, it fails to answer which number comes before one million. Another curious fact is that the AI is rather reluctant to admit it doesn't know the answer to a question. It would rather guess than respond that the answer wasn't found.

Theoretically, GPT-3 may replace a human writer or storyteller. However, logical slips and semantic errors are rather noticeable in some GPT-3-generated answers.

  • Offensive content

GPT-3 relies primarily on information published on the internet. As a result, its ability to generate unbiased, trustworthy, and respectful content suffers. Texts it generates or answers it gives related to sexuality, religion, or politics may be rude or offensive. So again, human checks and proofreading are required.

Final thoughts

Despite potential risks and limitations, GPT-3 may considerably change many industries — and your own business. Trained the right way and with a drop of supervision, GPT-3 can serve you with:

  • Natural language processing and translation

Google has been working on speech translation products for years. Now, GPT-3 can improve the process and make live translations faster and more accurate.

  • Storytelling and text generation

Writing an essay is a huge challenge. Creating a blog post is time-consuming. Products built on GPT-3 may eliminate these inconveniences and allow users to generate texts on any topic in a few clicks.

  • Web and mobile app design and development

Tools that translate plain English to CSS code, SQL queries, or even Python may help engineering teams.

There are many ideas for how to use this 175-billion-parameter language model to improve people’s lives, and many ambitious entrepreneurs are about to implement GPT-3 and its capabilities in their businesses.

Are you trying to decide how to use GPT-3 to benefit your business?
Drop us a line and we’ll share some more ideas and insights.
