
大規模言語モデル (LLM) と即興 / Large Language Models (LLM) and Improvisation

Slides used at the 34th session of the online salon Beyond Blockchain, Tuesday, June 6, 2023.

Kenji Saito

June 06, 2023


Transcript

  1. ( ) SFC ( ) CSO (Chief Science Officer) 1993 ( ) 2006 ( ) SFC 22 P2P (Peer-to-Peer) 2011 ( ) 2018 2019 VR 2021.7 VR 2021.9 & VR 2022.3 2023 AI VR 2023.2-3 “POWER TO THE PEOPLE” VR&RPG 2023.5 “Don’t Be So Serious” → ( ) — Large Language Models (LLM) and Improvisation — Beyond Blockchain — 2023-06-06 – p.3/11
  2. ↓ DAO DAO . . . DAO ( ) vs. ↑
  3. ChatGPT GPT ChatGPT GPT OpenAI (GPT-3.5, GPT-4) GPT Generative Pre-trained Transformer ( ) GPT ( ) GPT-3.5, GPT-4
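As an illustration not taken from the slides: the GPT-3.5 / GPT-4 models behind ChatGPT are exposed through OpenAI's Chat Completions endpoint. The sketch below only assembles the request payload; the model name "gpt-3.5-turbo", the system prompt, and the message format are assumptions, and an API key plus the `openai` package would be needed to actually send the request.

```python
# Minimal sketch of the payload shape the Chat Completions endpoint expects.
# Model name and system prompt are illustrative assumptions, not from the deck.
def build_chat_request(user_message, model="gpt-3.5-turbo"):
    """Assemble a Chat Completions request body as a plain dict."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_request("What does GPT stand for?")
print(payload["model"])
print(payload["messages"][1]["content"])
```

Sending the payload would then be a single call through the `openai` client; the dict form above makes the role/content structure of a chat exchange explicit.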
  4. GPT ( ) references for the GPT series:
     - GPT: Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever. 2018. “Improving Language Understanding by Generative Pre-Training”. Available at: https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf
     - GPT-2: Alec Radford, Jeff Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever. 2019. “Language Models are Unsupervised Multitask Learners”. Available at: https://paperswithcode.com/paper/language-models-are-unsupervised-multitask
     - GPT-3: Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, Sandhini Agarwal, Ariel Herbert-Voss, Gretchen Krueger, Tom Henighan, Rewon Child, Aditya Ramesh, Daniel M. Ziegler, Jeffrey Wu, Clemens Winter, Christopher Hesse, Mark Chen, Eric Sigler, Mateusz Litwin, Scott Gray, Benjamin Chess, Jack Clark, Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, and Dario Amodei. 2020. “Language Models are Few-Shot Learners”. Available at: https://doi.org/10.48550/arXiv.2005.14165
     - GPT-4: OpenAI. 2023. “GPT-4 Technical Report”. Available at: https://doi.org/10.48550/arXiv.2303.08774
  5. Generative Pre-Training, Language Models (GPT, GPT-2, GPT-3): Improving Language Understanding (GPT: roughly 117 million parameters) ( ) ( ) Unsupervised Multitask Learners (GPT-2: roughly 1.5 billion parameters) Few-Shot Learners (GPT-3: roughly 175 billion parameters) → GPT
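The scaling across the GPT series can be put side by side numerically; a rough illustration using the parameter counts reported in the papers cited above (GPT-4's size is undisclosed, so it is omitted):

```python
# Parameter counts of the GPT series as reported in the cited papers:
# GPT: 117 million, GPT-2: 1.5 billion, GPT-3: 175 billion.
PARAMS = {"GPT": 117e6, "GPT-2": 1.5e9, "GPT-3": 175e9}

for name, n in PARAMS.items():
    print(f"{name}: {n / 1e9:.3f} billion parameters")

# Growth factor between successive models
print(round(PARAMS["GPT-2"] / PARAMS["GPT"], 1))   # ~13x from GPT to GPT-2
print(round(PARAMS["GPT-3"] / PARAMS["GPT-2"], 1)) # ~117x from GPT-2 to GPT-3
```

The two ratios make the point of the slide concrete: each generation grew by one to two orders of magnitude.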
  6. ( ) ( ← ChatGPT ) ( ) GPT-4 BibTeX ACM HTML (abstract) ( )