GPT-3 is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory. The full version of GPT-3 has 175 billion machine learning parameters. GPT-3, introduced in May 2020 and in beta testing since July 2020, is part of a trend in natural language processing systems toward pre-trained language representations. Before GPT-3, the largest language model was Microsoft's Turing NLG, released in February 2020 with a capacity of 17 billion parameters, less than 10 percent of GPT-3's.
On May 28, 2020, thirty-one OpenAI researchers and engineers published a paper introducing GPT-3, noting that the quality of its generated text is so high that it can be difficult to distinguish from text written by a human, which carries both benefits and risks. In the paper they warned of the potential hazards of GPT-3 and called for research to mitigate those harms. Australian philosopher David Chalmers described GPT-3 as “one of the most interesting and important AI systems ever produced.”
On September 22, 2020, Microsoft announced that it had licensed “exclusive” use of GPT-3; others can still use the public API for production to receive the model’s output, but only Microsoft has access to GPT-3’s source code.
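For context, a minimal sketch of how a third party could query GPT-3 through the public API at the time, assuming the legacy `openai` Python client (pre-1.0 interface); the engine name, prompt, and sampling parameters here are illustrative assumptions, not details from the announcement:

```python
# Hypothetical example of calling GPT-3 via OpenAI's public API,
# using the legacy openai Python client (pre-1.0 interface).
# Engine name and parameters are illustrative assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # issued through the OpenAI dashboard

response = openai.Completion.create(
    engine="davinci",            # "davinci" was the largest GPT-3 engine
    prompt="Explain what a language model is in one sentence.",
    max_tokens=60,               # cap the length of the generated completion
    temperature=0.7,             # moderate randomness in sampling
)

print(response.choices[0].text.strip())
```

Note that such a call only returns generated text; as the licensing arrangement above indicates, the model’s underlying weights and source code remained inaccessible to API users.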