• Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only transformer...
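The "decoder-only transformer" mentioned above refers to a stack of self-attention blocks in which each position may attend only to itself and earlier positions (a causal mask), so the model can be trained to predict the next token. A minimal single-head sketch in PyTorch, with illustrative dimensions rather than GPT-3's actual configuration (the full GPT-3 model uses 96 heads and a model width of 12,288):

```python
import torch
import torch.nn.functional as F

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention with a causal mask, the core operation of a
    decoder-only transformer: position i can attend only to positions <= i."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v                # each (seq_len, d)
    scores = q @ k.T / (k.shape[-1] ** 0.5)            # (seq_len, seq_len) logits
    mask = torch.triu(torch.ones(scores.shape), diagonal=1).bool()
    scores = scores.masked_fill(mask, float("-inf"))   # hide future positions
    return F.softmax(scores, dim=-1) @ v               # weighted sum of values

# Toy usage: 5 tokens, width 8.
seq_len, d = 5, 8
x = torch.randn(seq_len, d)
w_q, w_k, w_v = torch.randn(d, d), torch.randn(d, d), torch.randn(d, d)
print(causal_self_attention(x, w_q, w_k, w_v).shape)   # torch.Size([5, 8])
```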
  • translation. GPT-4o scored 88.7 on the Massive Multitask Language Understanding (MMLU) benchmark compared to 86.5 by GPT-4. Unlike GPT-3.5 and GPT-4, which...
  • that the iteration of ChatGPT using GPT-4 was an improvement on the previous iteration based on GPT-3.5, with the caveat that GPT-4 retains some of the problems...
  • ChatGPT
    ChatGPT is a generative artificial intelligence chatbot developed by OpenAI. Launched in 2022 based on the GPT-3.5 large language model (LLM), it was later...
  • GPT-2
    superseded by the GPT-3 and GPT-4 models, which are no longer open source. GPT-2 has, like its predecessor GPT-1 and its successors GPT-3 and GPT-4, a generative...
  • Generative pre-trained transformer
    A generative pre-trained transformer (GPT) is a type of large language model (LLM) and a prominent framework for generative artificial intelligence. It...
  • ChatGPT in education
    Since the public release of ChatGPT by OpenAI in November 2022, the integration of chatbots in education has sparked considerable debate and exploration...
  • AutoGPT
    OpenAI's GPT-4 or GPT-3.5 APIs, and is among the first examples of an application using GPT-4 to perform autonomous tasks. On March 30, 2023, AutoGPT was released...
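The AutoGPT entry notes that the agent is driven through OpenAI's GPT-4 or GPT-3.5 APIs. For orientation only, a single chat-completion request with the official openai Python SDK (v1 interface) looks roughly like this; the model name, prompt, and reliance on the OPENAI_API_KEY environment variable are placeholder assumptions, not AutoGPT's actual prompts or configuration:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# One request-response turn; an agent like AutoGPT chains many such calls,
# feeding tool output and memory back into the message list.
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are an autonomous task-planning assistant."},
        {"role": "user", "content": "Break the goal 'collect recent LLM benchmarks' into concrete steps."},
    ],
)
print(response.choices[0].message.content)
```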
  • OpenAI (category 501(c)(3) organizations)
    compared to 86.5% by GPT-4. On July 18, 2024, OpenAI released GPT-4o mini, a smaller version of GPT-4o replacing GPT-3.5 Turbo on the ChatGPT interface. Its...
  • whitespace in RoBERTa and GPT. "##" denotes continuation of a preceding word in BERT. For example, the BPE tokenizer used by GPT-3 (Legacy) would split tokenizer:...
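The tokenizer entry describes two common subword conventions: in the byte-pair-encoding vocabularies used by GPT models and RoBERTa, a leading "Ġ" encodes the whitespace before a token, while BERT's WordPiece marks continuations of a word with "##". A short illustration with the Hugging Face transformers library, using the gpt2 and bert-base-uncased checkpoints as stand-ins (exact splits depend on each learned vocabulary):

```python
from transformers import BertTokenizer, GPT2Tokenizer

gpt_bpe = GPT2Tokenizer.from_pretrained("gpt2")               # byte-level BPE, as in GPT-2/GPT-3 (Legacy)
bert_wp = BertTokenizer.from_pretrained("bert-base-uncased")  # WordPiece

text = "tokenizer splits text into subwords"

# GPT/RoBERTa style: a leading "Ġ" on a token stands for the preceding space,
# e.g. " into" becomes the single token 'Ġinto'.
print(gpt_bpe.tokenize(text))

# BERT style: "##" marks a token that continues the previous word,
# e.g. "tokenizer" may come out as ['token', '##izer'].
print(bert_wp.tokenize(text))
```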