
[PyCon Korea 2019] Building a Smart Everyday-Conversation AI
from 10 Billion KakaoTalk Messages

Junseong
August 18, 2019

Slides from "Building a Smart Everyday-Conversation AI with 10 Billion KakaoTalk Messages", presented at PyCon Korea 2019.

Transcript

  1. PyCon Korea 2019 - Building a Smart Everyday-Conversation AI
     from 10 Billion KakaoTalk Messages

     Junseong Kim, ScatterLab / Pingpong AI Research
  2. Junseong Kim, ScatterLab / Pingpong AI Research, machine learning engineer

     Research domain: Open-Domain Conversation AI (everyday-conversation AI), Natural Language Understanding, Open-Domain Question Answering (broad-scope QA), ML Research to Production (from ML model research to serving). Experience: crazy at NLP! 2.5 years across NLP/ML: Pingpong AI Research (an AI that converses as naturally as a person), Atlas Labs (customer-service chatbot, neural machine translation / NMT), Naver Clova AI Research intern (restaurant-reservation chatbot, NLP research). @codertimo / 3K+ stars, codertimo@gmail.com
  3. What is an everyday-conversation AI, and what will today's talk cover?

  4. What is a functional conversation AI? "Tell me today's weather!", "When was Python first created?", "Move today's meeting from 3 to 4." An AI that answers people's questions and handles their requests: personal AI assistants, customer-center chatbots, and other AIs that do a person's work in their place.
  5. What is an everyday-conversation AI? An AI you can talk to freely about all kinds of topics and build a relationship with: Open-Domain Conversation AI. (Example: a persona chat in which a "Doraemon (23)" persona teases and consoles its friend about PyCon and a failed test.)
  6. The technical difficulty of functional conversation AI: you can predict, to a reasonable degree, what questions and utterances users will produce, so within that predictable range you can build scenarios or models.
  7. Everyday-conversation AI is really hard... The range of queries and utterances is unbounded: you cannot predict the questions that will arrive as input. Depending on the user's emotion, state, time of day, and more, the number of possible topics and contexts explodes: an Infinite Query Space.
  8. So the only way to solve this is...? Deep learning, of course. The only way is to train a deep-learning model to understand and answer truly like a person. And just as going from newborn to high-level conversation takes years of conversational experience, the model needs that experience too.
  9. We have the data! All of it was provided directly by users with their consent, and only data from which no personal information can be identified is used, strictly for research and development: 10 billion Korean KakaoTalk messages and 200 million Japanese LINE messages.
  10. Let's understand conversation! Key point: a model that understands meaning and context well, a Natural Language Understanding (NLU) model.
  11. Understanding language: how do people develop their understanding of a language? A young child can't solve the Korean-language section of the college entrance exam partly for lack of test-taking technique, but fundamentally because their basic understanding of the language is still lacking.
  12. How do people develop their understanding of a language? For our model: train a conversation-understanding model.
  13. How do people develop their understanding of a language? A model that can answer (the applied task), built on top of a pre-trained conversation-understanding model.

  14. (Same as the previous slide; build-up.)
  15. NLU research so far: Word2Vec, ELMo, etc.

     from gensim.models import Word2Vec
     model = Word2Vec(...)
     model.most_similar(positive=["king", "woman"], negative=["man"])
     # >> ("queen", 0.0332387343)

     Word2Vec (word-level representations): trained Skip-Gram style from the positions of surrounding words; beloved by many developers thanks to the Gensim library. (The slide's example uses the Korean words for king, woman, man, and queen.) Mikolov, Tomas, et al. "Distributed representations of words and phrases and their compositionality." Advances in Neural Information Processing Systems. 2013.
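The `most_similar` call above is just vector arithmetic over learned embeddings. Here is a minimal sketch of what it computes, using tiny hand-made 3-d vectors (the values and toy vocabulary are invented purely for illustration; a real Word2Vec model learns hundreds of dimensions from co-occurrence data):

```python
import math

# Toy 3-d embeddings; the values are made up purely to illustrate the
# arithmetic. A real Word2Vec model learns these from co-occurrence data.
vecs = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
    "apple": [0.5, 0.5, 0.5],   # distractor word
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def most_similar(positive, negative):
    # Query vector = sum(positive) - sum(negative), then rank the rest by cosine.
    query = [0.0, 0.0, 0.0]
    for w in positive:
        query = [q + v for q, v in zip(query, vecs[w])]
    for w in negative:
        query = [q - v for q, v in zip(query, vecs[w])]
    candidates = [w for w in vecs if w not in positive and w not in negative]
    return max(candidates, key=lambda w: cosine(vecs[w], query))

print(most_similar(positive=["king", "woman"], negative=["man"]))  # prints: queen
```

Gensim's `most_similar` additionally unit-normalizes the vectors and returns the top-n words with scores, but the core king - man + woman ≈ queen arithmetic is the same.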
  16. NLU research so far: Word2Vec, ELMo, etc.

     from gensim.models import Word2Vec
     model = Word2Vec(...)
     model.most_similar(positive=["king", "woman"], negative=["man"])
     # >> ("queen", 0.0332387343)

     class ELMo(nn.Module):
         def __init__(self):
             super().__init__()
             self.word_embed = nn.Embedding(VOCAB_SIZE, 1024)
             # bidirectional LSTM: 512 units per direction -> 1024-dim outputs
             self.encoder = nn.LSTM(1024, 512, bidirectional=True)
             self.pretrain_proj = nn.Linear(1024, VOCAB_SIZE)

         def forward(self, input_seq: torch.Tensor) -> torch.Tensor:
             embed = self.word_embed(input_seq)
             output, _ = self.encoder(embed)  # drop the (h_n, c_n) state tuple
             return output

         def pretrain(self, input_seq: torch.Tensor) -> torch.Tensor:
             encoded = self.forward(input_seq)
             return self.pretrain_proj(encoded)

     Word2Vec (word-level representations): trained Skip-Gram style from the positions of surrounding words; beloved by many developers thanks to the Gensim library. ELMo (sentence-level representations): trained with a bi-LSTM architecture to predict the next word. Mikolov, Tomas, et al. "Distributed representations of words and phrases and their compositionality." Advances in Neural Information Processing Systems. 2013. Peters, Matthew E., et al. "Deep contextualized word representations." NAACL 2018.
  17. But NLU performance was still far from enough. Weaknesses of the existing NLU approaches: they do not understand each individual sentence deeply.

  18. ...and the longer the sentence to be understood, the more sharply comprehension drops.

  19. ...and they do not understand the dialog context (Dialog-Context) at all.
  20. BERT: Transformer, October 2018

  21. BERT: Bidirectional Encoder Representations from Transformers, October 2018

  22. BERT, October 2018

  23. (Same as the previous slide; build-up.)
  24. Let's solve everything at once. The champion of NLU: BERT, an NLU model that surpassed human comprehension. It beat the human F1 score on the SQuAD question-answering benchmark, and it took SOTA (state of the art) on all 11 NLP tasks it was evaluated on.
  25. Starting BERT training: let's build the best NLU model. What you gain by using BERT as a conversational NLU model: 1. Stacking several self-attention layers captures more complex correlations, i.e. deeper understanding. 2. Because word relationships are examined jointly and repeatedly, the model is not overly sensitive to particular verb or noun forms. 3. It handles long sentences flexibly, so it can also digest dialog-context information effectively. 4. Masked LM is a simple training scheme that nevertheless yields deep language understanding. 5. Next Sentence Prediction adds an understanding of context.
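Point 1 above rests on what a single self-attention layer does: every position emits a weighted mixture of all positions, so stacked layers can relate distant parts of a dialog. A bare-bones sketch in plain Python, with the learned W_Q/W_K/W_V projections omitted for brevity (so Q = K = V = X, a simplification for illustration, not how a full Transformer layer is parameterized):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(X):
    """Scaled dot-product self-attention over a sequence of vectors X."""
    d = len(X[0])
    out = []
    for q in X:  # one output vector per position
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        weights = softmax(scores)  # how strongly this position attends to each other one
        out.append([sum(w * v[j] for w, v in zip(weights, X)) for j in range(d)])
    return out

# Each output mixes information from every position in the sequence,
# which is why stacked layers can model long-range dialog context.
seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attended = self_attention(seq)
```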
  26. With a model this good, we couldn't not run the experiment! Since it's a BERT for conversational text, let's call it Dialog-BERT.
  27. Dialog-BERT training method 1, fill-in-the-blank: Masked Language Modeling. Example chat: "Are you going to PyCon this week?" / "Wanna go together?" / "Hey, I want to go too" / "Okay, let's go together" / "There are lots of fun talks this time" / "I'm going to Junseong's talk hehe" / "Haha me too, really looking forward to it". Randomly delete 15% of all tokens and predict which token was deleted at each position; the model must infer it from the whole dialog context and the relationships between surrounding words.
  28. (Same as the previous slide; build-up.)

  29. (Same as the previous slide; build-up.)
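The masking step described above (delete 15% of the tokens, predict them back) can be sketched in a few lines. This is a simplification: real BERT replaces 80% of the chosen tokens with [MASK], swaps 10% for a random token, and leaves 10% unchanged, whereas here every chosen token is simply masked:

```python
import random

MASK, MASK_RATE = "[MASK]", 0.15

def mask_tokens(tokens, seed=1):
    """Replace ~15% of the tokens with [MASK] and remember the originals,
    which become the prediction targets (simplified masked-LM preprocessing)."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < MASK_RATE:
            targets[i] = tok  # the model must recover this token
            masked.append(MASK)
        else:
            masked.append(tok)
    return masked, targets

tokens = "are you going to pycon this week ?".split()
masked, targets = mask_tokens(tokens)
```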
  30. Dialog-BERT training method 2, is-this-a-continuation classification: Next Sentence Prediction. Example input context: "Have you seen the music video for TWICE's new song?" / "Of course I have" / "Wow, it was amazing" / "Right?? lol" / "This song is the best, seriously". The model learns whether a given next sentence can follow the current context, and through this it naturally learns what a natural continuation of a conversation looks like.
  31. Same setup, negative case: for the TWICE context above, an unrelated candidate reply (roughly "Did you tell Mom what you ate?") does not fit the context: Target 0.
  32. Same setup, positive case: for the TWICE context above, the candidate reply "So I'm buying the album this time!!" fits the context: Target 1.
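Building the Next Sentence Prediction training pairs from a dialog is mechanical: each real continuation becomes a positive example (Target 1) and a randomly drawn utterance becomes a negative one (Target 0). A small sketch (the helper name and the English stand-in dialog are invented for illustration):

```python
import random

def nsp_pairs(dialog, seed=0):
    """Build NSP examples: (context turn, actual next turn, 1) and
    (context turn, random other turn, 0)."""
    rng = random.Random(seed)
    pairs = []
    for i in range(len(dialog) - 1):
        pairs.append((dialog[i], dialog[i + 1], 1))  # real continuation
        distractor = rng.choice(dialog)
        while distractor == dialog[i + 1]:           # must not be the true next turn
            distractor = rng.choice(dialog)
        pairs.append((dialog[i], distractor, 0))     # broken continuation
    return pairs

dialog = [
    "Have you seen the new TWICE music video?",
    "Of course I have",
    "It was amazing, right?",
    "Totally, this song is their best yet",
]
pairs = nsp_pairs(dialog)
```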
  33. Changing BERT's structure to fit dialog systems: the Dialog-BERT model architecture. The original BERT structure is not optimized for dialog. Example: "Hey, I'm in big trouble ㅠㅠ" (User A, turn 1) / "What is it, all of a sudden??" (User B, turn 2) / "I left something at home ㅠㅠ" (User A, turn 3). The original model has no information for telling turns apart, which made them hard to separate; Dialog-BERT adds a Turn Embedding that injects per-turn information.
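A sketch of the bookkeeping side of the Turn Embedding idea: give every token the index of the turn it belongs to, so the model can look up a per-turn embedding alongside BERT's token and position embeddings. The helper and the English stand-in dialog are illustrative, not the team's actual preprocessing:

```python
def turn_ids(turns):
    """Flatten a dialog into tokens plus a parallel list of turn indices.
    In the real model the indices would index a learned embedding table."""
    tokens, ids = [], []
    for turn_idx, turn in enumerate(turns):
        for tok in turn.split():
            tokens.append(tok)
            ids.append(turn_idx)  # every token remembers which turn it came from
    return tokens, ids

turns = ["hey big trouble", "what happened ?", "left my bag at home"]
tokens, ids = turn_ids(turns)
```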
  34. Training the model: let's train BERT on conversation data! Using Google's google-research/bert repo: BERT NLU model training with TensorFlow; 1 billion conversation messages, selected from users with rich conversations; about 18 days of training on Google TPU-v3.
  35. Pre-training result: let's train BERT on conversation data! Next Sentence Prediction: 88.4% ACC / Masked Language Modeling: 53.6% ACC
  36. Downstream-task fine-tuning: let's use the conversation-trained BERT! Compared with training from scratch (with ELMo, or with no NLU model at all), it showed a markedly higher performance gain; we got a Dialog-BERT with excellent comprehension. (Chart, 80-100% accuracy on same-sentence discrimination, reply matching, and intent classification: Base-Line vs BERT.)
  37. Let's answer! We now have outstanding comprehension as our weapon; time to teach the model to speak.
  38. How can we respond to infinitely many cases? We must produce replies for an unbounded variety of utterances! BERT takes the unbounded space of inputs and handles language understanding.

  39. ...plus: the space of possible replies is also unbounded.

  40. ...plus: answer it with a finite set of replies.

  41. For infinitely many questions, the question becomes how many of them a finite set of replies can cover with an adequate answer: Test Coverage -> Conversation Coverage.
  42. What utterances do people actually use most? We must produce replies for an unbounded variety of utterances! Let's dig through real conversation data and find the replies that maximize conversation coverage.
  43. Real conversation analysis: what do people mostly say? Ranking messages by frequency, the top entries were largely close to "reactions": the top 10,000 sentences accounted for 21.87% (17 million occurrences) of 80 million messages. Examples (translated from the Korean): "no it's fine", "good night", "I ate a lot", "really?", "well done", "sorry", "cute", "sleepy", "did you eat?", "love you", "jealous", "liar", "enjoy your meal", "awesome", "it's okay", "where are you?", "me too me too", "still?", "amazing", "when?", "with who?", "okay".
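The 21.87% figure above is a coverage measurement: what fraction of all messages is one of the top-k most frequent sentences. The same computation on a tiny invented corpus:

```python
from collections import Counter

def coverage(messages, top_k):
    """Fraction of all messages covered by the top_k most frequent ones,
    the measurement the slide reports (top 10,000 sentences covering
    21.87% of 80M messages)."""
    counts = Counter(messages)
    top = counts.most_common(top_k)
    return sum(n for _, n in top) / len(messages)

# Tiny stand-in corpus (hypothetical): short reactions dominate real chat logs.
messages = ["lol"] * 5 + ["good night"] * 3 + ["no way"] * 2 + [
    "did you finish the slides for friday?",
    "my train is delayed again",
]
cov = coverage(messages, top_k=3)  # 10 of the 12 messages are one of the top 3
```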
  44. Reactions. We handle much of conversation with reactions. Example: just reacting well is enough to hold conversations that endear you to your {boyfriend|girlfriend}. In conversation, reactions are an extremely useful device across all kinds of situations. So how do we use them?
  45. What if the AI could react to any question at all..? Reactions let us narrow the space of replies while still maximizing query coverage, expanding Query Coverage enormously compared with previous systems! If we build a reaction model, a bot that used to be helpless before users' endless questions can now hold its own to a fair degree!
  46. (Recap) How can we respond to infinitely many cases? BERT handles language understanding of the unbounded input; a finite set of replies answers it.

  47. Same picture, with the finite-reply component named: the Reaction Model.
  48. Let's build the reaction model! Turn the top replies into Reaction Classes and solve it as a classification problem. Feed about 5 turns of context as input and train the model to predict which reaction to use.
  49. (Same setup, with training-data examples, e.g. "Did you finish what you were making yesterday?" / "Not yet" / "Wow, still?" / "Yeah.." / "Stop making slides and go to bed" -> "Okay"; plus similar everyday exchanges about being hungry, sleepy, or making late-night snack plans.) About 10 million+ reaction training examples, covering an enormous variety of situations and speech styles.
  50. Multi-turn reaction model: a minimal structural change on top of Dialog-BERT (NLU). Example output for the "go to bed" context above: "Okay" (0.72), "Yes yes!" (0.2), "Don't want to" (0.12), ...
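The classifier's output in the slide (a ranked list of reactions with probabilities) is a softmax over per-class scores. A sketch with invented logits and English stand-in reaction classes (the real model computes the scores with a classification head on Dialog-BERT's encoding of the 5-turn context):

```python
import math

REACTIONS = ["got it", "sure!", "no way"]  # stand-ins for the reaction classes

def softmax(logits):
    m = max(logits)
    es = [math.exp(x - m) for x in logits]
    s = sum(es)
    return [e / s for e in es]

def rank_reactions(logits):
    """Turn raw per-class scores into a ranked (reaction, probability) list,
    like the slide's example output. The logits here are made up."""
    probs = softmax(logits)
    return sorted(zip(REACTIONS, probs), key=lambda pair: -pair[1])

ranked = rank_reactions([2.1, 0.9, 0.3])
top_reaction, top_prob = ranked[0]
```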
  51. Fine-tuning with PyTorch. Pre-training (recap): Google's google-research/bert repo, BERT NLU model training with TensorFlow, 1 billion conversation messages from users with rich conversations, about 18 days on Google TPU-v3. Fine-tuning: HuggingFace's Pytorch-Transformer library; development on top of the trained NLU model, training with PyTorch; the reaction model's fine-tuning took about a day on a V100 GPU.
  52. Shall we show off a little?! Reaction-model results

  53. By choosing appropriate replies, the model empathizes with the user and keeps the conversation going; it can respond to a wide variety of user utterances. https://demo.pingpong.us/multi-turn-reaction/ Handling diverse situations: things an old-style bot would have needed hand-written if-statements for, it now answers on its own.
  54. (Same demo.) Understanding of social concepts: the model inherited people's conversational experience wholesale.
  55. (Same demo.) It understands long sentences well and gives specific replies: it grasps the other person's question precisely and can ask or answer concretely.
  56. Finally, fix the last utterance to the same phrase ("그럴까?", roughly "Should we?") and vary the context: context-aware replies. The model replies while reflecting the context. https://demo.pingpong.us/multi-turn-reaction/

  57. ...plus: "Oh! When the context changes, the reply changes!"

  58. ...plus: "Thanks for comforting me, Pingpong!"
  59. A model that understands context deeply and replies like a person. Try it yourself! https://demo.pingpong.us/multi-turn-reaction/
  60. An AI that can hold all kinds of conversations, reaction model included: m.me/ai.pingpong

  61. The stories we couldn't fit in. SpeakerDeck: https://bit.ly/2yvWLYR / Naver TechTalk: https://youtu.be/T4wjg9_E3K4. The Naver TechTalk, "Building an everyday-conversation AI that talks with people naturally", covers the models built so far in detail.
  62. Challenges we will tackle next: 1. Reply generation (Generation): like the novel-writing AIs built with GPT-2, we are building models that can generate conversation truly freely. 2. Personalized replies (User-Background Aware): models that adjust replies to the user's current emotion, background, and so on. 3. Style transfer (Text Style Transfer): replies that reflect a particular style or persona; every user and character speaks differently, so generate replies that reflect those distinctive speech styles. 4. Longer context (Longer Context): reflect far longer context than today, and catch topic changes quickly so the old context can be reset. 5. Smarter and warmer (Humanize): ultimately, all research toward a more human-like AI.
  63. Want to solve this problem with us? Go right now!! https://scatterlab.co.kr/recruiting Welcome to ScatterLab, Pingpong AI Research. You won't find another place that works as well, plays as hard, and has as much fun as we do; it's a better team and a more fun place than you can imagine!! We also have a booth: come by for a recruiting chat and some goodies!!
  64. Thank you! Questions about the talk or the Pingpong team are welcome anytime.