[PyCon Korea 2019] Building a Smart Everyday-Conversation AI with 10 Billion KakaoTalk Messages
Slides for the talk "Building a Smart Everyday-Conversation AI with 10 Billion KakaoTalk Messages" (100억건의 카카오톡 데이터로 똑똑한 일상대화 인공지능 만들기), presented at PyCon Korea 2019.
Junseong
August 18, 2019
Transcript
PyCon Korea 2019 — Building a Smart Everyday-Conversation AI with 10 Billion KakaoTalk Messages
ScatterLab Pingpong | Junseong Kim, Pingpong AI Research

Junseong Kim — Machine Learning Engineer, ScatterLab / Pingpong AI Research
- Open-Domain Conversation AI (everyday-conversation AI)
- Natural Language Understanding
- Open-Domain Question Answering
- ML Research to Production (from research models all the way to serving)
Research & domain experience — crazy about NLP! About 2.5 years of varied experience across NLP/ML:
- Pingpong AI Research: an AI that converses as naturally as a person
- Atlas Labs: service-consultation chatbots, neural machine translation (NMT)
- Naver Clova AI Research intern: restaurant-reservation chatbots, NLP research
@codertimo / 3K+ GitHub stars
[email protected]
What is an everyday-conversation AI? What kind of conversation could we have today?
What is a functional AI (Functional Conversation AI)? "Tell me today's weather!", "Move today's 3 o'clock meeting to 4" — an AI that answers users' questions and takes care of what they need: personal AI assistants, customer-center chatbots, and other AI that does work on a person's behalf.
What is an everyday-conversation AI? An Open-Domain Conversation AI: one that can chat flexibly about all sorts of topics and build a relationship with you — teasing, comforting, and bantering like a close friend (the slide's example persona: 도라에몽).
The technical difficulty of a functional AI: you can roughly predict what questions or utterances users will make, so you can write scenarios or build models for that bounded range.
An everyday-conversation AI is far harder: an unbounded range of utterances comes in as input. You cannot anticipate the user's sentences — depending on the user's mood, state, time of day, and so on, there are endlessly many possible topics and contexts: 무한한 질의 범위, an infinite query space.
So what is the only way to crack this problem? Deep learning.
The only way: train a deep-learning model so that it can understand and respond like a person. And just as a newborn needs years of conversational experience before reaching a high level of conversation, the model needs that same kind of experience — in the form of data.
And we have the data! All of it was collected with user consent, and only de-identified data that cannot identify any individual is used, strictly for research and development purposes: about 10 billion KakaoTalk messages in Korean and about 200 million LINE messages in Japanese.
Let's understand conversation first! Key point: a model that understands utterances and their context well — a Natural Language Understanding (NLU) model.
Language understanding: how does a person grow the ability to understand language? A young child cannot solve the Korean-language section of the college entrance exam — not because they lack test-taking technique, but because their fundamental understanding of the language is not there yet.
So we take the same route: first train a model that understands conversation, and only then build the models that solve the applied problems on top of that pre-trained conversational-understanding model.
NLU research so far: Word2Vec, ELMo, and friends.

Word2Vec (word-level representations): learns word vectors with the Skip-Gram objective from the positions of surrounding words; loved by many developers through the Gensim library.

from gensim.models import Word2Vec

model = Word2Vec(...)  # train on a tokenized corpus
model.most_similar(positive=["왕", "여성"], negative=["남성"])
# >> ("여왕", 0.0332387343)   king + woman - man ≈ queen

ELMo (sentence-level, contextualized representations): a bi-directional LSTM trained to predict words from their context.

import torch
import torch.nn as nn

VOCAB_SIZE = 30000  # vocabulary size; the actual value is not given on the slide

class ELMo(nn.Module):
    def __init__(self):
        super().__init__()
        self.word_embed = nn.Embedding(VOCAB_SIZE, 1024)
        # 512 hidden units per direction -> 1024-dim contextual vectors
        self.encoder = nn.LSTM(1024, 512, bidirectional=True, batch_first=True)
        self.pretrain_proj = nn.Linear(1024, VOCAB_SIZE)

    def forward(self, input_seq: torch.Tensor) -> torch.Tensor:
        embed = self.word_embed(input_seq)
        encoded, _ = self.encoder(embed)  # drop the (h_n, c_n) state tuple
        return encoded

    def pretrain(self, input_seq: torch.Tensor) -> torch.Tensor:
        # project each contextual vector back onto the vocabulary for LM pre-training
        encoded = self.forward(input_seq)
        return self.pretrain_proj(encoded)

Mikolov, Tomas, et al. "Distributed representations of words and phrases and their compositionality." NeurIPS 2013.
Peters, Matthew E., et al. "Deep contextualized word representations." NAACL 2018.
But NLU performance was still far from enough. The problems with existing NLU: it only handles short, single utterances; it does not understand each sentence deeply; the longer the sentence to understand, the more sharply comprehension drops; and it does not understand the dialog context (Dialog-Context) at all.
October 2018: BERT — Bidirectional Encoder Representations from Transformers.
Let's solve all of those problems at once. The strongest NLU: BERT, an NLU model that goes beyond human-level understanding — it reached a higher F1 score than humans on SQuAD and swept the state of the art (SOTA) on 11 NLP tasks.
Why BERT is a great fit for training a conversational NLU model:
1. Stacking several self-attention layers makes a deep understanding of more complex correlations possible.
2. It looks at the relationships among words jointly and repeatedly, rather than in one fixed pass.
3. It stays robust on long inputs, so dialog-context information can also be understood effectively.
4. Masked LM is a simple objective, yet solving it requires genuinely deep language understanding.
5. The Next Sentence Prediction objective adds an understanding of context on top.
With a model this good we had to try it on our problem. Since it is BERT for conversation, let's call it Dialog-BERT.
Dialog-BERT training objective 1 — fill in the blanks: Masked Language Modeling.
Take a real chat snippet (e.g. friends deciding whether to go to a talk together), randomly mask 15% of the tokens, and train the model to guess which word was removed at each masked position. To get it right, the model has to reason about the dialog context and the relationship between the masked word and the words around it.
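A minimal sketch of that 15% masking step (the [MASK] id and tokenization are placeholders, not the actual Dialog-BERT pipeline; real BERT pre-training also mixes in random and unchanged replacements):

import random

MASK_ID = 4  # hypothetical id of the [MASK] token

def mask_tokens(token_ids, mask_prob=0.15):
    # Returns (masked input, labels) for Masked Language Modeling.
    # labels is -1 where no prediction is needed, and the original token id
    # at the ~15% of positions that were masked out.
    inputs, labels = [], []
    for tok in token_ids:
        if random.random() < mask_prob:
            inputs.append(MASK_ID)   # hide the token
            labels.append(tok)       # the model must recover it from context
        else:
            inputs.append(tok)
            labels.append(-1)
    return inputs, labels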
Dialog-BERT training objective 2 — does this utterance continue the conversation? Next Sentence Prediction.
Given a few turns of input context (e.g. friends raving about a newly released music video), the model scores one candidate next utterance at a time: a reply that has nothing to do with the context gets target 0, while one that follows naturally (say, deciding to buy the album) gets target 1.
Through this task the model learns which utterance can come next in the current context — that is, it learns to handle the natural flow of an ongoing conversation.
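A rough sketch of how such training pairs could be assembled from chat logs (the function and the 50/50 negative sampling are illustrative, not the exact Pingpong pipeline):

import random

def make_nsp_examples(dialogs):
    # dialogs: list of conversations, each a list of utterance strings.
    # Produces (context, candidate, label) triples: label 1 for the true next
    # utterance, label 0 for a random utterance taken from another conversation.
    examples = []
    for d_idx, dialog in enumerate(dialogs):
        for i in range(1, len(dialog)):
            context, true_next = dialog[:i], dialog[i]
            negative_pool = dialogs[:d_idx] + dialogs[d_idx + 1:]
            if negative_pool and random.random() < 0.5:
                distractor = random.choice(random.choice(negative_pool))
                examples.append((context, distractor, 0))  # does not fit the context
            else:
                examples.append((context, true_next, 1))   # natural continuation
    return examples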
Changing the BERT architecture to fit a dialog system: the Dialog-BERT model structure.
The original BERT structure is not tuned for dialog: in a multi-turn exchange (User A turn 1, User B turn 2, User A turn 3) the model has no signal for where one turn ends and the next begins, so telling turns apart is hard. Dialog-BERT therefore adds a Turn Embedding that injects per-turn information for every token.
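A minimal sketch of what such an embedding layer could look like, summing token, position, and turn embeddings BERT-style (the class name and dimensions are illustrative):

import torch
import torch.nn as nn

class DialogBertEmbedding(nn.Module):
    # Token + position + turn embeddings, summed element-wise as in BERT.
    def __init__(self, vocab_size=30000, hidden=768, max_len=512, max_turns=10):
        super().__init__()
        self.token = nn.Embedding(vocab_size, hidden)
        self.position = nn.Embedding(max_len, hidden)
        self.turn = nn.Embedding(max_turns, hidden)  # which turn each token belongs to

    def forward(self, token_ids, turn_ids):
        # token_ids, turn_ids: (batch, seq_len); turn_ids marks turn 0, 1, 2, ...
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        positions = positions.unsqueeze(0).expand_as(token_ids)
        return self.token(token_ids) + self.position(positions) + self.turn(turn_ids)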
Model training: let's pre-train BERT on the dialog data! Using Google's google-research/bert repository (TensorFlow), we picked users with rich conversations, used about 1 billion dialog examples, and trained for roughly 18 days on a Google TPU-v3.
Pre-training result (BERT trained on the dialog data): Next Sentence Prediction 88.4% accuracy / Masked Language Modeling 53.6% accuracy.
Downstream task fine-tuning: compared with ELMo and with a baseline that uses no pre-trained NLU model, fine-tuning Dialog-BERT gave markedly higher scores on every downstream task we tried (chart, accuracy 80–100%: duplicate-utterance detection, response matching, intent classification; Base-Line vs. BERT), giving us an NLU model with the level of understanding we needed.
Now let's answer! With that understanding ability as our weapon, it's time to teach the model to actually say something.
How can we answer an infinite number of possible inputs? We need answers that cover an essentially unlimited variety of utterances. BERT takes care of understanding the infinite input space; on the output side we prepare a finite set of answers. The question then becomes whether a finite set of answers, each matched against many different incoming sentences, can cover enough of the conversation — test coverage becomes conversation coverage.
So what do people actually say? To build answers that cover an infinite variety of utterances, we dug through the real chat data for the responses that maximize conversation coverage. Ranking chat messages by frequency, the top of the list was dominated by short "reaction"-style messages: out of 80 million messages, the top 10,000 sentences alone accounted for 21.87% (about 17 million occurrences) — things like 사랑해요, 귀여워, 괜찮아, 나도나도, 신기해, 알겠어요 (I love you / so cute / it's okay / me too / that's amazing / got it).
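A toy sketch of this frequency analysis (a plain Counter over messages; the real pipeline over 80 million messages would be distributed and would normalize spelling and emoticons):

from collections import Counter

def top_reactions(messages, k=10000):
    # Rank messages by raw frequency and report how much of the corpus
    # the k most common sentences cover.
    counts = Counter(m.strip() for m in messages)
    top_k = counts.most_common(k)
    coverage = sum(c for _, c in top_k) / max(len(messages), 1)
    return top_k, coverage

# top_k, coverage = top_reactions(all_messages)  # slide: top 10,000 ≈ 21.87% of 80M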
Reactions: we already handle a lot of conversation with reactions alone. Even if reacting well is all you do, you can have a conversation that makes your girlfriend or boyfriend feel loved — in conversation, a reaction is a genuinely useful move in all kinds of situations. So, how do we use reactions? If the AI could react to any sentence whatsoever, we could shrink the answer space while maximizing query coverage: compared with existing systems the query coverage grows enormously, and the bot gains the ability to bounce something back no matter how far the user's utterance strays from its own topics — if only we build a reaction model!
Back to the question of answering infinitely many inputs with finitely many answers: the missing piece is exactly that — a Reaction Model (리액션 모델) sitting on top of BERT's language understanding.
Let's build the reaction model! Turn the top answers into Reaction Classes and frame the task as classification: feed roughly the last 5 turns of dialog context as input and train the model to predict which reaction to use next. Training data: over 10 million reaction examples, covering an enormous variety of situations and speaking styles.
The multi-turn reaction model: reuse Dialog-BERT (the NLU model) with a minimal change to the architecture — just a classification head on top. For the example five-turn context on the slide (a chat about finishing up work before tomorrow's presentation), the model outputs 알겠어요 (0.72), 네네! (0.2), 싫어요 (0.12), ...
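A minimal sketch of that "pre-trained dialog encoder + classification head" structure (the encoder interface and the number of reaction classes are placeholders, not the actual Pingpong model):

import torch
import torch.nn as nn

class ReactionClassifier(nn.Module):
    # Multi-turn reaction model: a pre-trained dialog encoder plus a linear head
    # over the reaction classes.
    def __init__(self, encoder: nn.Module, hidden=768, num_reactions=10000):
        super().__init__()
        self.encoder = encoder  # pre-trained Dialog-BERT-style encoder
        self.classifier = nn.Linear(hidden, num_reactions)

    def forward(self, token_ids, turn_ids):
        hidden_states = self.encoder(token_ids, turn_ids)  # (batch, seq_len, hidden)
        pooled = hidden_states[:, 0]                       # first ([CLS]-style) position
        return self.classifier(pooled)                     # logits over reaction classes

# probs = torch.softmax(model(token_ids, turn_ids), dim=-1)
# e.g. the slide's example scores: 알겠어요 0.72, 네네! 0.20, 싫어요 0.12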
Fine-tuning with PyTorch. Recap of pre-training: BERT pre-trained on the dialog data with the google-research/bert repo (TensorFlow), users with rich conversations, about 1 billion dialog examples, roughly 18 days on a Google TPU-v3. Fine-tuning: with the HuggingFace Pytorch-Transformer library we load the pre-trained NLU model in PyTorch and fine-tune the reaction model; on a V100 GPU, fine-tuning takes about a day.
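For illustration, loading a pre-trained BERT encoder with the pytorch-transformers library the talk mentions looks roughly like this (the ./dialog-bert/ checkpoint path, the example context, and the [SEP]-joined turn format are placeholders, not a published model):

import torch
from pytorch_transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("./dialog-bert/")  # converted TF checkpoint
encoder = BertModel.from_pretrained("./dialog-bert/")

context = "야 오늘 뭐해? [SEP] 그냥 집이야 [SEP] 나와서 놀자"
input_ids = torch.tensor([tokenizer.encode(context)])
outputs = encoder(input_ids)
sequence_output = outputs[0]  # (1, seq_len, 768), fed to the reaction head during fine-tuning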
Time to show off a little: the reaction model's results.
By choosing an appropriate answer, the bot empathizes with what the user says and keeps the conversation going, and it can respond to a huge variety of user utterances (demo: https://demo.pingpong.us/multi-turn-reaction/). Situations that an ordinary bot could only cover with hand-written if-statements it now handles on its own. It shows an understanding of everyday social concepts, inherited straight from people's conversational experience. And it understands long sentences well: it grasps exactly what the other person asked and can give a specific question or answer back.
Fixing the final user utterance to "그럴까?" ("should I?") and changing only the earlier context shows answers that track the context: change the context and the answer changes ("위로해줘서 고마워, 핑퐁!" — "thanks for the comfort, Pingpong!"). Demo: https://demo.pingpong.us/multi-turn-reaction/
A model that understands context deeply and answers like a person — try it yourself: https://demo.pingpong.us/multi-turn-reaction/
An AI you can have all kinds of conversations with, reaction model included: m.me/ai.pingpong
The stories I couldn't fit in today: you can look at the models we've built in much more detail in the Naver TechTalk "Building an everyday-conversation AI that talks naturally with people" — SpeakerDeck: https://bit.ly/2yvWLYR / Naver TechTalk: https://youtu.be/T4wjg9_E3K4
What's next for us:
1. Response generation (Generation): a model that can generate conversational replies as flexibly as the GPT-2-style models that write fiction.
2. Personalized replies (User-Background Aware): a model that adapts its replies to the user's current feelings, background, and so on.
3. Style transfer for replies (Text Style Transfer): every user and character has a different way of speaking, so we want a model that can generate replies in a specific persona or distinctive speaking style.
4. Longer context (Longer Context): reflecting much longer context than we do now, and quickly noticing when the topic changes so the old context can be reset.
5. Smarter, more human (Humanize): ultimately, all the research aimed at making the AI feel more like a person.
Want to solve problems like these with us? Head over right now: https://scatterlab.co.kr/recruiting — Welcome to ScatterLab, Pingpong AI Research. You won't find a place that works this well, plays this well, and has this much fun doing it; it's even better than you imagine. We're also running a booth at the conference, so come by anytime for a recruiting chat!
Thank you! If you have any questions about the talk or about Pingpong, I'm happy to take them now.