Artificial Intelligence vs. Machine Learning vs. Deep Learning
Artificial Intelligence: any technique that enables computers to mimic human intelligence and behaviour.
Machine Learning: a subset of AI that uses statistical techniques to solve tasks by learning from experience.
Deep Learning: a subset of ML that exposes multilayered neural networks to vast amounts of data.
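A minimal sketch of the "learning from experience" idea behind ML. The library (scikit-learn) and the toy data are assumptions for illustration; the slide names no specific tool. Instead of hand-coding a pricing rule, the model infers one from examples:

```python
# Minimal sketch of "learning from experience" (assumes scikit-learn is installed).
# Instead of hand-coding a rule, the model infers one from example data.
from sklearn.linear_model import LinearRegression

# Experience: house sizes (m^2) and observed prices (arbitrary units, made up for this demo).
sizes = [[30], [50], [80], [120]]
prices = [90, 150, 240, 360]

model = LinearRegression()
model.fit(sizes, prices)          # learn a statistical relationship from the data

print(model.predict([[100]]))     # predict the price of an unseen 100 m^2 house
```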
ChatGPT: Generative Pretrained Transformer
Generative: generative models create new outputs from scratch. Given a prompt, the model can imagine the rest. GPT only generates text; other models can also generate images, sounds, etc. (DALL-E, Stable Diffusion, Riffusion, ...) A small runnable sketch of prompt continuation follows below.
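A hedged sketch of "given a prompt, imagine the rest". The Hugging Face `transformers` library and the small, freely downloadable GPT-2 model are assumptions standing in for ChatGPT itself:

```python
# Toy sketch: a generative text model continues a prompt.
# Assumes the Hugging Face `transformers` library; GPT-2 stands in for larger GPT models.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Once upon a time, in a lab full of GPUs,"
result = generator(prompt, max_new_tokens=30, num_return_sequences=1)
print(result[0]["generated_text"])  # the prompt plus the model's imagined continuation
```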
ChatGPT: Generative Pretrained Transformer
Pretrained: the model is pretrained on LOTS* of data. Think roughly all of the publicly available text on the internet. (*Models trained on this much data are called Large Language Models (LLMs). GPT is one example of an LLM, but many others exist too, e.g. Flan-T5, BLOOM, ...)
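One of the other LLMs named on this slide, Flan-T5, can be queried the same way; this sketch again assumes the Hugging Face `transformers` library and uses the small variant so it runs on a laptop:

```python
# Sketch: querying another LLM mentioned above, Flan-T5 (small variant for local use).
# Assumes the Hugging Face `transformers` library.
from transformers import pipeline

flan = pipeline("text2text-generation", model="google/flan-t5-small")
print(flan("Translate to German: Good morning!")[0]["generated_text"])
```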
ChatGPT: Generative Pretrained Transformer
Transformer: a specific neural network architecture. Transformers were first published in 2017 at NeurIPS by researchers at Google. This was a watershed moment for LLMs: it allowed researchers to train models efficiently on huge datasets. https://arxiv.org/pdf/1706.03762.pdf
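The core operation of the Transformer in the cited paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch (the matrix sizes are arbitrary, chosen only for illustration):

```python
# Scaled dot-product attention from "Attention Is All You Need" (Vaswani et al., 2017).
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted mix of the values

# Toy example: 3 tokens, embedding size 4 (sizes are arbitrary).
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4)); K = rng.normal(size=(3, 4)); V = rng.normal(size=(3, 4))
print(attention(Q, K, V).shape)  # (3, 4): one updated vector per token
```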
Prompts in ChatGPT
During training, ChatGPT formed billions of connections between the words it saw, and it can statistically predict the next word for any 'prompt' (like a query or question), generating its answer one word (token) at a time.
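A sketch of what "statistically predict the next word" looks like in practice. The Hugging Face `transformers` library, PyTorch, and GPT-2 (standing in for ChatGPT, whose weights are not public) are all assumptions:

```python
# Sketch: how a GPT-style model predicts the next word for a prompt.
# Assumes Hugging Face `transformers` and PyTorch; GPT-2 stands in for ChatGPT.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The capital of France is", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits              # scores for every vocabulary token

next_token_probs = logits[0, -1].softmax(dim=-1)  # probability distribution over the next token
top = next_token_probs.topk(5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx):>10s}  {p:.3f}")  # e.g. " Paris" near the top
```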
Auto-GPT
An open-source experimental tool based on GPT-4 that comes up with its own 'self-prompts' and works toward a goal autonomously. It also includes internet connectivity! A simplified sketch of the self-prompting loop follows below.
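A heavily simplified sketch of the 'self-prompt' loop idea; this is NOT Auto-GPT's actual code. It assumes the `openai` Python package (pre-1.0 API, contemporary with GPT-4's release), a valid `OPENAI_API_KEY` in the environment, and a made-up example goal:

```python
# Toy self-prompting loop in the spirit of Auto-GPT (not its real implementation).
# Assumes the `openai` package (pre-1.0 API) and OPENAI_API_KEY set in the environment.
import openai

goal = "Plan a three-step approach to learn the Transformer architecture."
thought = f"My goal: {goal}\nWhat should I do first?"

for step in range(3):  # Auto-GPT loops until the goal is met; we cap it at 3 steps
    reply = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": thought}],
    )["choices"][0]["message"]["content"]
    print(f"Step {step + 1}: {reply}\n")
    # Feed the model's own output back in as the next "self-prompt".
    thought = f"My goal: {goal}\nSo far: {reply}\nWhat should I do next?"
```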
GitHub Copilot: Your AI Pair Programmer https://github.com/features/copilot
Powered by OpenAI Codex, a descendant of GPT-3 that generates human-like code, and trained on publicly available code from GitHub. It is available in Visual Studio Code, JetBrains IDEs, and Neovim.
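An illustrative example of the pair-programming workflow: you type a signature and a docstring, and Copilot proposes the body. The suggestion shown is hypothetical, written here for illustration, not a recorded Copilot output:

```python
# You type the signature and docstring; Copilot suggests a body like the one below
# (this particular suggestion is illustrative, not an actual Copilot completion).
def fizzbuzz(n: int) -> str:
    """Return 'Fizz' for multiples of 3, 'Buzz' for 5, 'FizzBuzz' for both, else str(n)."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)
```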