Slide 1

Slide 1 text

Ines Montani Explosion

Slide 2

Slide 2 text

Developer tools company specializing in AI, machine learning and NLP. explosion.ai EXPLOSION

Slide 3

Slide 3 text

EXPLOSION: developer tools company specializing in AI, machine learning and NLP. explosion.ai. Ines Montani, Founder and CEO.

Slide 4

Slide 4 text

EXPLOSION: developer tools company specializing in AI, machine learning and NLP. explosion.ai. Ines Montani, Founder and CEO. Matthew Honnibal, Founder and CTO.

Slide 5

Slide 5 text

Open-source library for industrial-strength natural language processing spacy.io SPACY 200m+ downloads

Slide 6

Slide 6 text

Open-source library for industrial-strength natural language processing spacy.io SPACY ChatGPT can write spaCy code! 200m+ downloads

Slide 7

Slide 7 text

PRODIGY: modern scriptable annotation tool for machine learning developers. prodigy.ai. 9k+ users, 800+ companies.

Slide 8

Slide 8 text

PRODIGY: modern scriptable annotation tool for machine learning developers. prodigy.ai. 9k+ users, 800+ companies. Alex Smith (Developer), Kim Miller (Analyst).

Slide 9

Slide 9 text

Collaborative data development platform prodigy.ai/teams PRODIGY TEAMS BETA

Slide 10

Slide 10 text

PRODIGY TEAMS (BETA): collaborative data development platform. prodigy.ai/teams. Alex Smith (Developer), Kim Miller (Analyst), GPT-4 API.

Slide 11

Slide 11 text

WHY OPEN SOURCE?

Slide 12

Slide 12 text

WHY OPEN SOURCE? transparent

Slide 13

Slide 13 text

WHY OPEN SOURCE? transparent no lock-in

Slide 14

Slide 14 text

WHY OPEN SOURCE? transparent no lock-in extensible

Slide 15

Slide 15 text

WHY OPEN SOURCE? transparent no lock-in extensible runs in-house

Slide 16

Slide 16 text

WHY OPEN SOURCE? transparent, no lock-in, extensible, runs in-house, easy to get started

Slide 17

Slide 17 text

WHY OPEN SOURCE? transparent, no lock-in, extensible, community-vetted, runs in-house, easy to get started

Slide 18

Slide 18 text

WHY OPEN SOURCE? transparent, no lock-in, programmable, extensible, community-vetted, runs in-house, easy to get started

Slide 19

Slide 19 text

WHY OPEN SOURCE? transparent, no lock-in, up to date, programmable, extensible, community-vetted, runs in-house, easy to get started

Slide 20

Slide 20 text

WHY OPEN SOURCE? transparent, no lock-in, up to date, programmable, extensible, community-vetted, runs in-house, easy to get started, also free!

Slide 21

Slide 21 text

OPEN-SOURCE MODELS

Slide 22

Slide 22 text

task-specific models OPEN-SOURCE MODELS

Slide 23

Slide 23 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune.

Slide 24

Slide 24 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models: ELECTRA, T5.

Slide 25

Slide 25 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models: ELECTRA, T5.

Slide 26

Slide 26 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models (ELECTRA, T5): relatively small and fast, affordable to run, generalize & adapt well, need data to fine-tune.

Slide 27

Slide 27 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models (ELECTRA, T5): relatively small and fast, affordable to run, generalize & adapt well, need data to fine-tune. Large generative models: Falcon, Mixtral.

Slide 28

Slide 28 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models (ELECTRA, T5): relatively small and fast, affordable to run, generalize & adapt well, need data to fine-tune. Large generative models (Falcon, Mixtral): very large, often slower, expensive to run, generalize & adapt well, need little to no data.

Slide 29

Slide 29 text

ENCODING & DECODING TASKS: encoder models vs. large generative models.

Slide 30

Slide 30 text

ENCODING & DECODING TASKS. Encoder models: a network trained for specific tasks, using the model to encode the input. [Diagram: text → model → vectors → task model → task output; the task network is trained with labels]

Slide 31

Slide 31 text

ENCODING & DECODING TASKS. Encoder models: a network trained for specific tasks, using the model to encode the input. [Diagram: text → model → vectors → task model → task output; the task network is trained with labels] Large generative models: the model generates text that can be parsed into task-specific output. [Diagram: text + prompt template → model → raw output → parser → task output]
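
As a minimal Python sketch of these two patterns: the encoding path turns text into vectors and feeds them to a small task network, while the decoding path fills a prompt template, lets the model generate raw text, and parses it back into a label. It assumes the en_core_web_md pipeline is installed; call_llm() is a hypothetical stand-in for whatever generative model you prompt.

```python
import spacy
from sklearn.linear_model import LogisticRegression

# Encoding: the model turns text into vectors, a small task network predicts labels.
nlp = spacy.load("en_core_web_md")
texts = ["great product", "terrible support"]
labels = [1, 0]  # 1 = positive, 0 = negative
vectors = [nlp(text).vector for text in texts]          # text -> model -> vectors
task_model = LogisticRegression().fit(vectors, labels)  # vectors + labels -> task network
print(task_model.predict([nlp("really good").vector]))  # -> task output

# Decoding: the model generates raw text that a parser turns into task output.
PROMPT = "Classify the sentiment of this text as POSITIVE or NEGATIVE:\n{text}\nSentiment:"

def parse_sentiment(raw_output: str) -> int:
    # Parser: map the model's raw text back to a structured task label.
    return 1 if "POSITIVE" in raw_output.upper() else 0

def classify_with_llm(text: str, call_llm) -> int:
    raw = call_llm(PROMPT.format(text=text))  # template + text -> model -> raw output
    return parse_sentiment(raw)               # raw output -> parser -> task output
```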

Slide 32

Slide 32 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models (ELECTRA, T5): relatively small and fast, affordable to run, generalize & adapt well, need data to fine-tune. Large generative models (Falcon, Mixtral): very large, often slower, expensive to run, generalize & adapt well, need little to no data.

Slide 33

Slide 33 text

OPEN-SOURCE MODELS. Task-specific models: small, often fast, cheap to run, don't always generalize well, need data to fine-tune. Encoder models (ELECTRA, T5): relatively small and fast, affordable to run, generalize & adapt well, need data to fine-tune. Large generative models (Falcon, Mixtral): very large, often slower, expensive to run, generalize & adapt well, need little to no data.

Slide 34

Slide 34 text

output costs ECONOMIES OF SCALE

Slide 35

Slide 35 text

output costs OpenAI Google ECONOMIES OF SCALE

Slide 36

Slide 36 text

output costs OpenAI Google ECONOMIES OF SCALE access to talent, compute etc.

Slide 37

Slide 37 text

output costs OpenAI Google ECONOMIES OF SCALE access to talent, compute etc. API request batching

Slide 38

Slide 38 text

ECONOMIES OF SCALE: output costs (OpenAI, Google), access to talent, compute etc., API request batching. [Diagram: request batching under high traffic vs. low traffic]

Slide 39

Slide 39 text

ECONOMIES OF SCALE: output costs (OpenAI, Google vs. you), access to talent, compute etc., API request batching. [Diagram: request batching under high traffic vs. low traffic]
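
A rough back-of-the-envelope sketch of why request batching favours high-traffic providers; the batch size and cost figures below are made-up assumptions, only the shape of the argument matters.

```python
from math import ceil

BATCH_SIZE = 8        # requests the hardware can process in one forward pass (made up)
COST_PER_BATCH = 1.0  # fixed cost of running one batch, full or not (made up)

def cost_per_request(requests_per_interval: int) -> float:
    # Average cost per request when incoming requests are grouped into batches.
    if requests_per_interval == 0:
        return 0.0
    batches = ceil(requests_per_interval / BATCH_SIZE)
    return batches * COST_PER_BATCH / requests_per_interval

print(cost_per_request(800))  # high traffic: batches stay full, ~0.125 per request
print(cost_per_request(3))    # low traffic: one mostly empty batch, ~0.33 per request
```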

Slide 40

Slide 40 text

AI PRODUCTS ARE MORE THAN JUST A MODEL: human-facing systems (e.g. ChatGPT) vs. machine-facing models (e.g. GPT-4).

Slide 41

Slide 41 text

AI PRODUCTS ARE MORE THAN JUST A MODEL: human-facing systems (e.g. ChatGPT) vs. machine-facing models (e.g. GPT-4). The most important differentiation is product, not just technology.

Slide 42

Slide 42 text

AI PRODUCTS ARE MORE THAN JUST A MODEL: human-facing systems (e.g. ChatGPT) vs. machine-facing models (e.g. GPT-4). The most important differentiation is product, not just technology: UI/UX, marketing, customization.

Slide 43

Slide 43 text

AI PRODUCTS ARE MORE THAN JUST A MODEL. Human-facing systems (e.g. ChatGPT): the most important differentiation is product, not just technology (UI/UX, marketing, customization). Machine-facing models (e.g. GPT-4): swappable components based on research, impacts are quantifiable.

Slide 44

Slide 44 text

AI PRODUCTS ARE MORE THAN JUST A MODEL. Human-facing systems (e.g. ChatGPT): the most important differentiation is product, not just technology (UI/UX, marketing, customization). Machine-facing models (e.g. GPT-4): swappable components based on research, impacts are quantifiable: cost, speed, accuracy, latency.

Slide 45

Slide 45 text

AI PRODUCTS ARE MORE THAN JUST A MODEL. Human-facing systems (e.g. ChatGPT): the most important differentiation is product, not just technology (UI/UX, marketing, customization). Machine-facing models (e.g. GPT-4): swappable components based on research, impacts are quantifiable: cost, speed, accuracy, latency. But what about the data?

Slide 46

Slide 46 text

AI PRODUCTS ARE MORE THAN JUST A MODEL. Human-facing systems (e.g. ChatGPT): the most important differentiation is product, not just technology (UI/UX, marketing, customization). Machine-facing models (e.g. GPT-4): swappable components based on research, impacts are quantifiable: cost, speed, accuracy, latency. But what about the data? User data is an advantage for product, not the foundation for machine-facing tasks.

Slide 47

Slide 47 text

AI PRODUCTS ARE MORE THAN JUST A MODEL. Human-facing systems (e.g. ChatGPT): the most important differentiation is product, not just technology (UI/UX, marketing, customization). Machine-facing models (e.g. GPT-4): swappable components based on research, impacts are quantifiable: cost, speed, accuracy, latency. But what about the data? User data is an advantage for product, not the foundation for machine-facing tasks. You don't need specific data to gain general knowledge.

Slide 48

Slide 48 text

AI PRODUCTS ARE MORE THAN JUST A MODEL. Human-facing systems (e.g. ChatGPT): the most important differentiation is product, not just technology (UI/UX, marketing, customization). Machine-facing models (e.g. GPT-4): swappable components based on research, impacts are quantifiable: cost, speed, accuracy, latency. But what about the data? User data is an advantage for product, not the foundation for machine-facing tasks. You don't need specific data to gain general knowledge.

Slide 49

Slide 49 text

USE CASES IN INDUSTRY. Predictive tasks: entity recognition, relation extraction, coreference resolution, grammar & morphology, semantic parsing, discourse structure, text classification. Generative tasks: single/multi-doc summarization, reasoning, problem solving, paraphrasing, style transfer, question answering.

Slide 50

Slide 50 text

USE CASES IN INDUSTRY. Predictive tasks (structured data): entity recognition, relation extraction, coreference resolution, grammar & morphology, semantic parsing, discourse structure, text classification. Generative tasks: single/multi-doc summarization, reasoning, problem solving, paraphrasing, style transfer, question answering. Many industry problems have remained the same, they just changed in scale.
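
As a small illustration of a predictive task turning text into structured data, assuming the en_core_web_sm pipeline is installed:

```python
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is opening a new office in Berlin in 2025.")

# Named entity recognition: unstructured text in, structured records out.
print([(ent.text, ent.label_) for ent in doc.ents])
# e.g. [('Apple', 'ORG'), ('Berlin', 'GPE'), ('2025', 'DATE')]
```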

Slide 51

Slide 51 text

EVOLUTION OF PROBLEM DEFINITIONS

Slide 52

Slide 52 text

EVOLUTION OF PROBLEM DEFINITIONS: rules or instructions.

Slide 53

Slide 53 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions.

Slide 54

Slide 54 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning: examples.

Slide 55

Slide 55 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning (supervised learning): examples.

Slide 56

Slide 56 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning (supervised learning): examples. In-context learning: rules or instructions.

Slide 57

Slide 57 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning (supervised learning): examples. In-context learning (prompt engineering): rules or instructions.

Slide 58

Slide 58 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning (supervised learning): examples. In-context learning (prompt engineering): rules or instructions. Instructions: human-shaped, easy for non-experts, risk of data drift.

Slide 59

Slide 59 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning (supervised learning): examples. In-context learning (prompt engineering): rules or instructions. Instructions: human-shaped, easy for non-experts, risk of data drift. Examples: nuanced and intuitive behaviors, specific to use case, labor-intensive.

Slide 60

Slide 60 text

EVOLUTION OF PROBLEM DEFINITIONS. Programming & rules: rules or instructions. Machine learning (supervised learning): examples. In-context learning (prompt engineering): rules or instructions. Instructions: human-shaped, easy for non-experts, risk of data drift. Examples: nuanced and intuitive behaviors, specific to use case, labor-intensive.
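
A minimal sketch of the two ways of defining the problem, instructions vs. examples; call_llm() is a hypothetical stand-in for any generative model API, and the example texts are made up.

```python
# Instructions only: human-shaped, easy to write, nothing is pinned down by data.
INSTRUCTION_PROMPT = """Label the sentiment of the text as POSITIVE or NEGATIVE.
Text: {text}
Sentiment:"""

# Instructions plus examples: nuanced, specific to the use case, more work to curate.
FEW_SHOT_PROMPT = """Label the sentiment of the text as POSITIVE or NEGATIVE.
Text: The onboarding was painless and support replied within minutes.
Sentiment: POSITIVE
Text: Two weeks in and the export feature still corrupts my files.
Sentiment: NEGATIVE
Text: {text}
Sentiment:"""

def classify(text: str, call_llm, with_examples: bool = False) -> str:
    prompt = FEW_SHOT_PROMPT if with_examples else INSTRUCTION_PROMPT
    return call_llm(prompt.format(text=text)).strip()
```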

Slide 61

Slide 61 text

WORKFLOW EXAMPLE: large general-purpose model, domain-specific data.

Slide 62

Slide 62 text

WORKFLOW EXAMPLE: large general-purpose model + domain-specific data → prompting.

Slide 63

Slide 63 text

WORKFLOW EXAMPLE: large general-purpose model + domain-specific data → prompting → baseline, with continuous evaluation.

Slide 64

Slide 64 text

WORKFLOW EXAMPLE: large general-purpose model + domain-specific data → prompting → baseline, with continuous evaluation. Iterative model-assisted data annotation.

Slide 65

Slide 65 text

WORKFLOW EXAMPLE: large general-purpose model + domain-specific data → prompting → baseline, with continuous evaluation. Iterative model-assisted data annotation.

Slide 66

Slide 66 text

WORKFLOW EXAMPLE: large general-purpose model + domain-specific data → prompting → baseline, with continuous evaluation. Iterative model-assisted data annotation → transfer learning → distilled task-specific model.

Slide 67

Slide 67 text

WORKFLOW EXAMPLE: large general-purpose model + domain-specific data → prompting → baseline. Iterative model-assisted data annotation → transfer learning → distilled task-specific model. Continuous evaluation of both the baseline and the distilled model.
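
A sketch of the distillation step, assuming you already have LLM-suggested, human-corrected entity spans; the texts, labels and file names below are placeholders.

```python
import spacy
from spacy.tokens import DocBin

# (text, [(start_char, end_char, label), ...]) pairs, e.g. exported from your annotation tool.
annotations = [
    ("Acme Corp hired Jane Doe in Berlin.",
     [(0, 9, "ORG"), (16, 24, "PERSON"), (28, 34, "GPE")]),
]

nlp = spacy.blank("en")
db = DocBin()
for text, spans in annotations:
    doc = nlp.make_doc(text)
    ents = [doc.char_span(start, end, label=label) for start, end, label in spans]
    doc.ents = [ent for ent in ents if ent is not None]  # drop spans that don't align to tokens
    db.add(doc)
db.to_disk("train.spacy")

# Then train a small task-specific pipeline, e.g.:
#   python -m spacy train config.cfg --paths.train train.spacy --paths.dev dev.spacy
```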

Slide 68

Slide 68 text

processing pipeline prototype PROTOTYPE TO PRODUCTION

Slide 69

Slide 69 text

PROTOTYPE TO PRODUCTION. Processing pipeline prototype: prompt the model & transform output to structured data. github.com/explosion/spacy-llm
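
A minimal spacy-llm sketch of that prototype step; the registry strings and versions shown here ("spacy.NER.v3", "spacy.GPT-4.v2") vary between spacy-llm releases, so treat them as placeholders and check the repo above.

```python
import spacy

nlp = spacy.blank("en")
nlp.add_pipe(
    "llm",  # registered by spacy-llm (pip install spacy-llm; needs an OPENAI_API_KEY)
    config={
        "task": {"@llm_tasks": "spacy.NER.v3", "labels": ["PERSON", "ORG", "GPE"]},
        "model": {"@llm_models": "spacy.GPT-4.v2"},
    },
)
doc = nlp("Acme Corp hired Jane Doe in Berlin.")
print([(ent.text, ent.label_) for ent in doc.ents])  # structured output, not raw text
```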

Slide 70

Slide 70 text

PROTOTYPE TO PRODUCTION. Processing pipeline prototype: prompt the model & transform output to structured data. github.com/explosion/spacy-llm. Processing pipeline in production: swap, replace and mix components.

Slide 71

Slide 71 text

PROTOTYPE TO PRODUCTION. Processing pipeline prototype: prompt the model & transform output to structured data. github.com/explosion/spacy-llm. Processing pipeline in production: swap, replace and mix components.

Slide 72

Slide 72 text

PROTOTYPE TO PRODUCTION. Processing pipeline prototype: prompt the model & transform output to structured data. github.com/explosion/spacy-llm. Processing pipeline in production: swap, replace and mix components. The pipeline's output is a structured, machine-facing Doc object.
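
To show why the Doc object keeps components swappable, here is a small sketch: the consumer code only reads doc.ents, so it doesn't care whether entities came from an LLM prompt or a distilled, trained component. The loaded pipeline below is just a stand-in.

```python
import spacy

def extract_records(nlp, texts):
    # Machine-facing output: structured records, independent of how doc.ents were predicted.
    for doc in nlp.pipe(texts):
        yield {"text": doc.text, "entities": [(ent.text, ent.label_) for ent in doc.ents]}

# Swap the pipeline, keep the consumer code: an LLM-backed prototype (spacy-llm) and a
# distilled, trained component both fill doc.ents on the same Doc object.
nlp = spacy.load("en_core_web_sm")  # stand-in; replace with your prototype or production pipeline
print(list(extract_records(nlp, ["Acme Corp hired Jane Doe in Berlin."])))
```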

Slide 73

Slide 73 text

RESULTS & CASE STUDIES
CoNLL 2003 Named Entity Recognition:
               F-Score   Speed (words/s)
GPT-3.5 [1]    78.6      < 100
GPT-4 [1]      83.5      < 100
spaCy          91.6      4,000
Flair          93.1      1,000
SOTA 2023 [2]  94.6      1,000
SOTA 2003 [3]  88.8      > 20,000
[1] Ashok and Lipton (2023), [2] Wang et al. (2021), [3] Florian et al. (2003)

Slide 74

Slide 74 text

RESULTS & CASE STUDIES
CoNLL 2003 Named Entity Recognition:
               F-Score   Speed (words/s)
GPT-3.5 [1]    78.6      < 100
GPT-4 [1]      83.5      < 100
spaCy          91.6      4,000
Flair          93.1      1,000
SOTA 2023 [2]  94.6      1,000
SOTA 2003 [3]  88.8      > 20,000
[1] Ashok and Lipton (2023), [2] Wang et al. (2021), [3] Florian et al. (2003)
Callouts: SOTA on few-shot prompting (the GPT results); RoBERTa-base (the spaCy pipeline).

Slide 75

Slide 75 text

RESULTS & CASE STUDIES
CoNLL 2003 Named Entity Recognition:
               F-Score   Speed (words/s)
GPT-3.5 [1]    78.6      < 100
GPT-4 [1]      83.5      < 100
spaCy          91.6      4,000
Flair          93.1      1,000
SOTA 2023 [2]  94.6      1,000
SOTA 2003 [3]  88.8      > 20,000
[1] Ashok and Lipton (2023), [2] Wang et al. (2021), [3] Florian et al. (2003)
Callouts: SOTA on few-shot prompting (the GPT results); RoBERTa-base (the spaCy pipeline).
[Chart: accuracy (10-100) on FabNER vs. # of examples (0-500), with a Claude 2 reference; callout at 20 examples]

Slide 76

Slide 76 text

RESULTS & CASE STUDIES
CoNLL 2003 Named Entity Recognition:
               F-Score   Speed (words/s)
GPT-3.5 [1]    78.6      < 100
GPT-4 [1]      83.5      < 100
spaCy          91.6      4,000
Flair          93.1      1,000
SOTA 2023 [2]  94.6      1,000
SOTA 2003 [3]  88.8      > 20,000
[1] Ashok and Lipton (2023), [2] Wang et al. (2021), [3] Florian et al. (2003)
Callouts: SOTA on few-shot prompting (the GPT results); RoBERTa-base (the spaCy pipeline).
[Chart: accuracy (10-100) on FabNER vs. # of examples (0-500), with a Claude 2 reference; callout at 20 examples]

Slide 77

Slide 77 text

RESULTS & CASE STUDIES
CoNLL 2003 Named Entity Recognition:
               F-Score   Speed (words/s)
GPT-3.5 [1]    78.6      < 100
GPT-4 [1]      83.5      < 100
spaCy          91.6      4,000
Flair          93.1      1,000
SOTA 2023 [2]  94.6      1,000
SOTA 2003 [3]  88.8      > 20,000
[1] Ashok and Lipton (2023), [2] Wang et al. (2021), [3] Florian et al. (2003)
Callouts: SOTA on few-shot prompting (the GPT results); RoBERTa-base (the spaCy pipeline).
[Chart: accuracy (10-100) on FabNER vs. # of examples (0-500), with a Claude 2 reference; callout at 20 examples]
Takeaways: this says more about crowd worker methodology than LLMs; we don't need crowd workers.

Slide 78

Slide 78 text

DISTILLED TASK-SPECIFIC COMPONENTS

Slide 79

Slide 79 text

modular DISTILLED TASK-SPECIFIC COMPONENTS

Slide 80

Slide 80 text

modular no lock-in DISTILLED TASK-SPECIFIC COMPONENTS

Slide 81

Slide 81 text

modular testable no lock-in DISTILLED TASK-SPECIFIC COMPONENTS

Slide 82

Slide 82 text

modular testable no lock-in extensible DISTILLED TASK-SPECIFIC COMPONENTS

Slide 83

Slide 83 text

modular testable flexible no lock-in extensible DISTILLED TASK-SPECIFIC COMPONENTS

Slide 84

Slide 84 text

DISTILLED TASK-SPECIFIC COMPONENTS: modular, testable, flexible, no lock-in, extensible, cheap to run.

Slide 85

Slide 85 text

DISTILLED TASK-SPECIFIC COMPONENTS: modular, testable, flexible, no lock-in, extensible, run in-house, cheap to run.

Slide 86

Slide 86 text

DISTILLED TASK-SPECIFIC COMPONENTS: modular, testable, flexible, no lock-in, programmable, extensible, run in-house, cheap to run.

Slide 87

Slide 87 text

DISTILLED TASK-SPECIFIC COMPONENTS: modular, testable, flexible, predictable, no lock-in, programmable, extensible, run in-house, cheap to run.

Slide 88

Slide 88 text

DISTILLED TASK-SPECIFIC COMPONENTS: modular, testable, flexible, predictable, transparent, no lock-in, programmable, extensible, run in-house, cheap to run.

Slide 89

Slide 89 text

DISTILLED TASK-SPECIFIC COMPONENTS: modular, testable, flexible, predictable, transparent, no lock-in, programmable, extensible, run in-house, cheap to run.

Slide 90

Slide 90 text

MONOPOLY STRATEGIES: control, resource, regulation, compounding, economies of scale, network effects.

Slide 91

Slide 91 text

MONOPOLY STRATEGIES: control, resource, regulation, compounding, economies of scale, network effects.

Slide 92

Slide 92 text

MONOPOLY STRATEGIES: control, resource, regulation, compounding, economies of scale, network effects.

Slide 93

Slide 93 text

MONOPOLY STRATEGIES: control, resource, regulation, compounding, economies of scale, network effects. Human-facing products vs. machine-facing models.

Slide 94

Slide 94 text

MONOPOLY STRATEGIES: control, resource, regulation, compounding, economies of scale, network effects. Human-facing products vs. machine-facing models.

Slide 95

Slide 95 text

THE AI REVOLUTION WON’T BE MONOPOLIZED

Slide 96

Slide 96 text

THE AI REVOLUTION WON’T BE MONOPOLIZED The software industry does not run on secret sauce. Knowledge gets shared and published. Secrets won’t give anyone a monopoly.

Slide 97

Slide 97 text

THE AI REVOLUTION WON’T BE MONOPOLIZED The software industry does not run on secret sauce. Knowledge gets shared and published. Secrets won’t give anyone a monopoly. Usage data is great for improving a product, but it doesn’t generalize. Data won’t give anyone a monopoly.

Slide 98

Slide 98 text

THE AI REVOLUTION WON’T BE MONOPOLIZED The software industry does not run on secret sauce. Knowledge gets shared and published. Secrets won’t give anyone a monopoly. LLMs can be one part of a product or process, and swapped for different approaches. Interoperability is the opposite of monopoly. Usage data is great for improving a product, but it doesn’t generalize. Data won’t give anyone a monopoly.

Slide 99

Slide 99 text

THE AI REVOLUTION WON’T BE MONOPOLIZED The software industry does not run on secret sauce. Knowledge gets shared and published. Secrets won’t give anyone a monopoly. LLMs can be one part of a product or process, and swapped for different approaches. Interoperability is the opposite of monopoly. Usage data is great for improving a product, but it doesn’t generalize. Data won’t give anyone a monopoly. Regulation could give someone a monopoly, if we let it. It should focus on products and actions, not components.

Slide 100

Slide 100 text

Explosion: explosion.ai
spaCy: spacy.io
Prodigy: prodigy.ai
Twitter: @_inesmontani
Mastodon: @[email protected]
Bluesky: @inesmontani.bsky.social
LinkedIn