A robot will replace me…and probably you, too.
AI is evolving faster than humans. (Image: QuickMeme)
Artificial intelligence has come a long way in recent years, evolving from simple chatbots to sophisticated Generative Pre-trained Transformer (GPT) models. The first generation was GPT-1, launched in 2018 by OpenAI, an artificial intelligence research laboratory. It showed that a single model pre-trained on large amounts of text could handle a range of language tasks, letting computers process and generate text with something approaching human fluency. While this was a significant breakthrough at the time, its performance was limited.
Next came the GPT-2 model in 2019. It used larger neural networks and far more training data to produce more accurate and relevant text, and it demonstrated remarkable improvements in reading comprehension and natural language generation, even on tasks it was never explicitly trained for. This marked a notable improvement over GPT-1, but it still left room for refinement.
Today, we are seeing the third generation of these models in the form of GPT-3. These advanced systems not only train on massive amounts of data, but also incorporate techniques such as few-shot learning and, in later instruction-tuned variants, reinforcement learning from human feedback to produce even more realistic results.
The popularity of GPT-3 stems largely from how convincingly it mimics human writing and reading comprehension: it has been trained on more text than any human will read in a lifetime.
It also has broad applications that make it extremely powerful for digitally transforming an enterprise. Its uses include improved customer service, answering employee queries, and task automation. For instance, it can rewrite complicated technical instructions in simpler language without human intervention.
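To make the "simplify technical instructions" use case concrete, here is a minimal sketch of how one might build a request for OpenAI's GPT-3-era Completions endpoint. The model name, prompt wording, and parameter values are illustrative choices, not the only way to do it; an API key and HTTP client would be needed to actually send the request.

```python
import json

# Endpoint for OpenAI's GPT-3-era Completions API (shown for context only;
# this sketch builds the request body but does not send it).
COMPLETIONS_URL = "https://api.openai.com/v1/completions"

def build_simplify_request(technical_text: str) -> dict:
    """Return a Completions payload asking the model to simplify instructions."""
    prompt = (
        "Rewrite the following technical instructions in plain language "
        "that a non-expert can follow:\n\n" + technical_text
    )
    return {
        "model": "text-davinci-003",  # an example GPT-3 model name
        "prompt": prompt,
        "max_tokens": 256,            # cap the length of the rewrite
        "temperature": 0.3,           # low temperature favors faithful rewrites
    }

payload = build_simplify_request(
    "Depress the reset actuator for 5 s until the LED indicator cycles."
)
print(json.dumps(payload, indent=2))
```

Sending this payload (with an `Authorization` header carrying your API key) would return the simplified text in the response's `choices` field.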
GPT-4 is coming soon
OpenAI’s CEO, Sam Altman, said a few months ago that GPT-4 is coming. Current estimates forecast the release date sometime this year, likely around July-August.
Despite being one of the most anticipated releases in AI, there's little public information about GPT-4: what it will be like, its features, or its abilities. The model will certainly be large compared to previous generations of neural networks, but size won't be its distinguishing feature.
In an interview last year, Altman revealed that, contrary to popular belief, GPT-4 will not be much bigger than GPT-3 but will use more compute. One thing he did say for sure: GPT-4 won't have 100 trillion parameters.
Apart from GPT-4, Altman also gave a sneak peek at GPT-5. He said that GPT-5 might be able to pass the Turing test, in which a machine tries to hold a conversation indistinguishable from a human's. This aligns with OpenAI's ultimate goal of achieving artificial general intelligence.
As we continue to push the limits of what AI can do, there is no doubt that these models will keep evolving and improving. The future is bright for AI technologies. For us humans, though, it looks rather grim. Can we keep up?
And by the way, I co-wrote this with Jasper, an AI copywriting tool built on GPT-3 that automatically produces copy. Can you guess which parts were written by a machine?