Language teaching methods based mainly on grammar rules have long been criticised. Language pedagogues now recommend reading, listening, and writing exercises rather than grammar exercises for a better comprehension of a foreign language. It turns out that artificial intelligence models agree with them.

Traditionally, Natural Language Processing (NLP) models would analyse sentences as sequences of grammatical categories: adjective, noun, verb, adverb. This is quite a laborious task, and even though it works well enough, today different methods, especially artificial neural networks, are used for text analysis. A good example of this is GPT-3.
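
To make that traditional approach concrete, here is a minimal sketch of part-of-speech tagging with the NLTK library (assuming NLTK is installed; the downloadable resource names can vary between NLTK versions):

```python
# Traditional NLP: label every word with its grammatical category.
import nltk

nltk.download("punkt")                       # tokeniser data
nltk.download("averaged_perceptron_tagger")  # POS tagger data

tokens = nltk.word_tokenize("The quick brown fox jumps over the lazy dog")
print(nltk.pos_tag(tokens))
# [('The', 'DT'), ('quick', 'JJ'), ('brown', 'JJ'), ('fox', 'NN'), ...]
```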

How does GPT-3 work?

The GPT-3 model is based on a technology called transformers. Transformer models analyse huge amounts of text to understand how sentences are built, which words follow which, and which words make the most sense in context. In other words, these models are supposed to learn semantics, not only a set of rules.
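
GPT-3's weights are not publicly downloadable, but its predecessor GPT-2 is, through the Hugging Face transformers library. As a rough illustration of what "which words make the most sense" means in practice, this sketch asks GPT-2 to rank the most likely next words for a prompt (the prompt is arbitrary, and GPT-2 here merely stands in for GPT-3):

```python
# Ask a transformer language model which words are most likely to come next.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer("The cat sat on the", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits       # one score per vocabulary token

next_token_scores = logits[0, -1]         # scores for the next position only
top = torch.topk(next_token_scores, 5)
for score, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(token_id)])!r}: {float(score):.2f}")
```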

Unlike other transformer models, GPT-3 was trained not only on whole articles but also on sentence and paragraph patterns. Just as a beginner language learner understands a single sentence or paragraph much more easily than a whole article, the GPT-3 model developed a good understanding of natural language thanks to this training method.

Before self-supervised algorithms, the supervised techniques in use would separate sentences into nouns, adjectives, and verbs. When a foreign language learner is told to study only grammar rules, it is difficult to start understanding that language, because a language is much more than a set of rules. A self-supervised model, by contrast, needs no labelled categories at all: the raw text supplies its own training signal, as the sketch below shows.
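
Here is a minimal illustration of that idea, with whole words standing in for the subword tokens real models actually use. Every position in the text becomes a training example: the context is the text so far, and the label is simply the next word.

```python
# Self-supervised learning: the text itself provides the labels.
text = "a language is a lot more than a set of rules"
tokens = text.split()  # real models use subword tokens, not whole words

for i in range(1, len(tokens)):
    context = " ".join(tokens[:i])   # what the model sees
    target = tokens[i]               # what it must learn to predict
    print(f"context: {context!r} -> predict: {target!r}")
```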

What can these models change?

Thanks to this specific training method, GPT-3 not only creates human-like texts but can also compose music, write poetry, translate text, and write code. People often have difficulty telling whether its outputs were created by a human or a machine.
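
For reference, GPT-3 is reachable through OpenAI's API. The sketch below uses the original completions endpoint as it looked when GPT-3 launched; the model name ("davinci") and the openai package's interface have since changed, so treat this as an illustration of the idea rather than current usage:

```python
# Generate text with GPT-3 via OpenAI's original completions endpoint.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder: set your own key

response = openai.Completion.create(
    engine="davinci",                          # a GPT-3 model name at launch
    prompt="Write a short poem about the sea:",
    max_tokens=64,
)
print(response.choices[0].text)
```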

This single advance can facilitate web development, machine translation, content creation, image recognition, and everything else that can be done with artificial intelligence… 

Today, the GPT-3 model contains 175 billion parameters, far more than its predecessor GPT-2 and other neural networks of its kind. Even though this number is huge, compared to a human brain with around a hundred trillion synapses, machines still have quite a way to go to reach a human brain’s capacity. 
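
A quick back-of-the-envelope calculation from the two figures above (treating synapses and parameters as loosely comparable, which is a big simplification):

```python
# Compare the figures quoted above.
gpt3_parameters = 175e9    # 175 billion
brain_synapses = 100e12    # roughly a hundred trillion

print(brain_synapses / gpt3_parameters)  # ~571: hundreds of times more synapses
```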

If you are looking for a complete overview of GPT-3, I recommend reading Alberto Romero’s blog post on Medium!