/2 GPT-1 was trained on the BooksCorpus dataset (~5 GB), with a main focus on language understanding.
On Valentine’s Day 2019, GPT-2 was released with the tagline “too dangerous to release”. It was trained on web pages linked from Reddit posts with at least 3 upvotes (40 GB). The training cost was about $43k.
/3 Later, GPT-2 was used to generate music in MuseNet and Jukebox.
/4 In June 2020, GPT-3 was released, trained on a much larger and more diverse dataset.
More applications were developed on top of GPT-3, including:
/5 🔹DALL-E: creating images from text
🔹CLIP: connecting text and images
🔹Whisper: multilingual speech-to-text
🔹ChatGPT: chatbot, article writer, code writer
/6 I strongly recommend you play with these applications. The results are astonishing!
👉 Over to you: Have you chatted with ChatGPT? What did you ask it?
/7 I hope you've found this thread helpful.
Follow me @alexxubyte for more.
Like/Retweet the first tweet below if you can: