GPT-5
| Developer(s) | OpenAI |
|---|---|
| Initial release | Unknown |
| Predecessor | GPT-4 |
| License | Proprietary |
On July 18, 2023, OpenAI filed a trademark application for "GPT-5" in classes 9 and 42 (USPTO serial number 98089548).[1][2][3]
Generative Pre-trained Transformer 5 (GPT-5) has not yet been released. It is expected to be a multimodal large language model created by OpenAI, and the fifth in its numbered "GPT-n" series of GPT foundation models.
Background
OpenAI introduced the first GPT model (GPT-1) in 2018, in a paper titled "Improving Language Understanding by Generative Pre-Training."[4] It was based on the transformer architecture and trained on a large corpus of books.[5] The next year, OpenAI introduced GPT-2, a larger model that could generate coherent text.[6] In 2020, it introduced GPT-3, a model with roughly 100 times as many parameters as GPT-2 that could perform various tasks from only a few examples.[7] GPT-3 was further refined into GPT-3.5, which was used to create the chatbot product ChatGPT.
References
- ↑ "USPTO GPT-5 Trademark Registration". https://tmsearch.uspto.gov/bin/showfield?f=doc&state=4805:xgx403.2.1.
- ↑ https://futurism.com/the-byte/openai-trademark-gpt-5
- ↑ https://lifearchitect.ai/gpt-5/
- ↑ Radford, Alec; Narasimhan, Karthik; Salimans, Tim; Sutskever, Ilya (June 11, 2018). "Improving Language Understanding by Generative Pre-Training". https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf.
- ↑ Khandelwal, Umesh (April 1, 2023). "How Large Language GPT models evolved and work". https://www.linkedin.com/pulse/how-large-language-gpt-models-evolved-work-umesh-khandelwal.
- ↑ "What is GPT-4 and Why Does it Matter?". April 3, 2023. https://www.datacamp.com/blog/what-we-know-gpt4.
- ↑ Brown, Tom B. (July 20, 2020). "Language Models are Few-Shot Learners". arXiv:2005.14165v4 [cs.CL].
Original source: https://en.wikipedia.org/wiki/GPT-5.