
Multi-Turn Response Selection for Chatbots with Deep Attention Matching Network

Scatter Lab Inc.

June 05, 2019

Transcript

  1. 스캐터랩(ScatterLab) Everyday Conversation AI Technical Seminar: Multi-Turn Response Selection
     for Chatbots with Deep Attention Matching Network. 백영민, Dialogue System Machine Learning Engineer
  2. Introduction #1 Human Conversation
     • When people converse, they take multiple dependencies (semantic, functional) into account.
     A: It was so perfect, I totally recommend it!
     B: Whoa.. ㅠㅠ I want to go too. Is it a travel spot? A restaurant? …
  3. Introduction #1 Human Conversation
     • When people converse, they take multiple dependencies (semantic, functional) into account.
     B: Oh, how was it there??
     A: It was so perfect, I totally recommend it!
     B: Whoa.. ㅠㅠ I want to go too. Is it a travel spot? A restaurant? …
  4. Introduction #1 Human Conversation
     • When people converse, they take multiple dependencies (semantic, functional) into account.
     A: I had a burger at a café for lunch today!!
     B: Oh, how was it there??
     A: It was so perfect, I totally recommend it!
     B: Whoa.. ㅠㅠ I want to go to that restaurant too!
  5. Introduction #1 Human Conversation
     • When people converse, they take multiple dependencies (semantic, functional) into account.
     A: I had a burger at a café for lunch today!!
     B: Oh, how was it there??
     A: It was so perfect, I totally recommend it!
     B: Whoa.. ㅠㅠ I want to go to that restaurant too!
     A: If you go there, you absolutely have to get the double burger set!
  6. Introduction #1 Conversational AI - Chatbot
     • A service in which a computer (bot), rather than a human, communicates with users through a conversational interface.
     • Recent research direction: chatbots that can hold sustained, natural conversations with people on open-domain topics.
     • Data-driven approach: chatbots built from accumulated data.
       • Retrieval-based
       • Generation-based
  7. Introduction #1 Conversational AI - Data-Driven Approach
     • Train a model on a large amount of conversation data.
     • Retrieval-based approach:
       • Pick the best response from a pool of predefined candidate responses (a sketch follows this list).
       • The approach Pingpong currently uses.
       • Single-turn / multi-turn.
     • Generation-based approach:
       • Learn patterns from conversation data and generate new responses based on them.
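As a rough illustration of the retrieval-based approach above, the sketch below scores every candidate reply against the context and returns the best one. `select_reply` and `toy_score` are illustrative names, and the word-overlap scorer only stands in for a trained matching model.

```python
# Retrieval-based selection sketch: score each predefined candidate reply
# against the context and return the highest-scoring one.
from typing import Callable, List

def select_reply(context: List[str],
                 candidates: List[str],
                 score: Callable[[List[str], str], float]) -> str:
    """Return the candidate reply with the highest matching score."""
    return max(candidates, key=lambda reply: score(context, reply))

def toy_score(context: List[str], reply: str) -> float:
    """Toy scorer (word overlap); a trained model would be used in practice."""
    context_words = set(" ".join(context).split())
    return float(len(context_words & set(reply.split())))

print(select_reply(["i had a burger for lunch", "oh how was it"],
                   ["sounds delicious", "that burger place is the best", "see you tomorrow"],
                   toy_score))
```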
  8. Introduction #1 Conversational AI - Human Conversation
     • When people converse, they take multiple dependencies (semantic, functional) into account.
     • A response depends on several units (words, phrases, sentences) of the preceding conversation.
     • Goal: build a model (chatbot) that can take each of these dependencies into account.
     • Single-turn: pick the reply from the immediately preceding utterance only; context and dependencies are hard to reflect.
     • Multi-turn: pick the reply from the previous N utterances; context and dependencies are reflected.
  9. Recent Work #2 RNN (Recurrent Neural Network)
     • Encode the utterances of the previous turns with an RNN (Recurrent Neural Network); a sketch follows this list.
     • Not well suited to capturing diverse dependencies; a limitation of the structure itself.
     • Only shallow dependencies (surface textual relevance such as identical or similar words) can be captured.
     • Weak on deep dependencies (coreference, long-term dependency, etc.).
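A minimal sketch of the RNN encoding criticized above, with a plain tanh RNN cell standing in for the GRU/LSTM variants used by earlier retrieval models; the step-by-step hidden-state update also hints at why long-range dependencies fade.

```python
# RNN utterance encoding sketch: feed the words of one turn through a
# recurrent cell and use the final hidden state as the utterance encoding.
import numpy as np

def rnn_encode(words: np.ndarray, W_x: np.ndarray, W_h: np.ndarray) -> np.ndarray:
    """words: (n_words, d) word vectors -> final hidden state of shape (hidden,)."""
    h = np.zeros(W_h.shape[0])
    for x in words:                      # information flows strictly step by step,
        h = np.tanh(W_x @ x + W_h @ h)   # so distant words influence h only weakly
    return h

rng = np.random.default_rng(0)
d, hidden = 200, 128
turn = rng.normal(size=(50, d))                 # one utterance as word vectors
W_x = rng.normal(size=(hidden, d)) * 0.01
W_h = rng.normal(size=(hidden, hidden)) * 0.01
print(rnn_encode(turn, W_x, W_h).shape)         # (128,)
```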
  10. Recent Work #2 Attention
     • Produces its output by "attending" to specific parts (words, phrases, sentences, etc.) of the input (Query - Key); see the sketch after this list.
     • Initially used to work around the long-term dependency problem of RNNs.
     • "Attention Is All You Need": attention alone (self-attention) is enough to obtain strong results.
     • Used by most recent SOTA models such as BERT and GPT.
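A minimal sketch of the Query-Key(-Value) attention described above, in the scaled dot-product form of "Attention Is All You Need"; it is an illustration, not the DAM implementation.

```python
# Scaled dot-product attention: a query attends over keys, and the resulting
# weights mix the values into a new representation.
import numpy as np

def attention(Q: np.ndarray, K: np.ndarray, V: np.ndarray) -> np.ndarray:
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d) -> (n_q, d)."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])            # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ V                                 # weighted sum of values

# Query token A attends over keys/values E, F, G, H to produce a new A'.
rng = np.random.default_rng(1)
A = rng.normal(size=(1, 8))
EFGH = rng.normal(size=(4, 8))
print(attention(A, EFGH, EFGH).shape)  # (1, 8)
```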
  11. Recent Work #2 Attention
     • A (Query) attends to E, F, G, H (Keys) to produce a new A'.
  12. Recent Work #2 Attention
     • B (Query) attends to E, F, G, H (Keys) to produce a new B'.
  13. Recent Work #2 Attention
     • C (Query) attends to E, F, G, H (Keys) to produce a new C'.
  14. Recent Work #2 Transformer - Self-Attention
     • Self-attention: Query, Key, and Value are all the same, i.e. attention of the sequence over itself.
  15. Recent Work #2 Transformer - Self-Attention
     • Self-attention: Query, Key, and Value are all the same, i.e. attention of the sequence over itself.
     • A (Query & Key) attends to A, B, C, D to produce a new A'.
  16. Recent Work #2 Transformer - Self-Attention
     • Self-attention: Query, Key, and Value are all the same, i.e. attention of the sequence over itself.
     • B (Query & Key) attends to A, B, C, D to produce a new B'.
  17. Recent Work #2 Transformer - Self-Attention
     • Self-attention: Query, Key, and Value are all the same, i.e. attention of the sequence over itself.
     • C (Query & Key) attends to A, B, C, D to produce a new C'. (A self-attention sketch follows this item.)
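Tying Query, Key, and Value to the same sequence gives the self-attention illustrated in slides 14-17. The sketch below omits the projection matrices, multiple heads, and residual connections of a real Transformer block.

```python
# Self-attention sketch: every token (A, B, C, D) attends over all tokens,
# including itself, to produce a context-mixed token (A', B', C', D').
import numpy as np

def self_attention(X: np.ndarray) -> np.ndarray:
    """X: (n_tokens, d) -> (n_tokens, d)."""
    scores = X @ X.T / np.sqrt(X.shape[-1])        # every token scored against every token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X

tokens = np.random.default_rng(2).normal(size=(4, 8))  # A, B, C, D
print(self_attention(tokens).shape)                    # (4, 8)
```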
  18. Model Architecture #3 Problem
     • Problem to solve: multi-turn retrieval.
     • Data: a multi-turn conversation dataset of triples (c, r, y); a data-layout sketch follows this list.
       • c: the n-1 context utterances
       • r: a candidate response
       • y: label (0 or 1)
     • Ubuntu Corpus V1, Douban Conversation Corpus
     • Train a model g that performs the mapping g(c, r) -> y.
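A sketch of the data layout described above; the `Example` class and its field names are illustrative, not the corpus format.

```python
# Each training example is a triple (c, r, y): c = the n-1 context utterances,
# r = a candidate response, y = 1 if r is the true next turn, else 0.
from dataclasses import dataclass
from typing import List

@dataclass
class Example:
    context: List[str]   # c: n-1 context utterances
    response: str        # r: candidate response
    label: int           # y: 1 = correct response, 0 = negative sample

train_set = [
    Example(["I had a burger for lunch!!", "oh, how was it??"],
            "sounds great, I want to go too", 1),
    Example(["I had a burger for lunch!!", "oh, how was it??"],
            "my laptop will not boot", 0),
]
# The model g is trained so that g(c, r) approximates y.
```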
  19. Model Architecture #3 Input
     • Input utterances:
       • Context utterance: u_i = [w_{u_i,k}]_{k=0}^{n_{u_i}-1}, where n_{u_i} is the maximum number of words in a context utterance
       • Reply utterance: r = [w_{r,t}]_{t=0}^{n_r-1}, where n_r is the maximum number of words in the reply
     • Embedding (a sketch follows this list):
       • Each word is mapped to a d(=200)-dimensional embedding
       • Pre-trained word2vec is used
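A sketch of the input step: each utterance becomes a fixed-length sequence of d=200-dimensional pre-trained word2vec vectors. `MAX_WORDS`, the padding scheme, and the plain-dict lookup are assumptions made for illustration.

```python
# Input embedding sketch: map an utterance to a (MAX_WORDS, D) matrix of
# pre-trained word vectors, zero-padding (or truncating) to a fixed length.
import numpy as np

D = 200          # embedding dimension
MAX_WORDS = 50   # n_{u_i} / n_r: maximum words per utterance (illustrative value)

def embed_utterance(utterance: str, word2vec: dict) -> np.ndarray:
    vectors = [word2vec.get(w, np.zeros(D)) for w in utterance.split()[:MAX_WORDS]]
    vectors += [np.zeros(D)] * (MAX_WORDS - len(vectors))   # zero-pad to fixed length
    return np.stack(vectors)

toy_w2v = {"burger": np.ones(D), "lunch": np.full(D, 0.5)}
print(embed_utterance("burger for lunch", toy_w2v).shape)   # (50, 200)
```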
  20. Model Architecture #3 Representation
     • Stacked self-attention (a sketch follows this list):
       • L transformer blocks are applied.
       • Every block's output is stored (used later).
       • Outputs for the i-th context utterance: [U_i^0, ..., U_i^L]
       • Outputs for the reply utterance: [R^0, ..., R^L]
     • Purpose: capture dependencies at several granularities (word, phrase, sentence).
       • Word-level dependencies in the relatively lower layers.
       • Phrase- and sentence-level dependencies in the relatively higher layers.
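A sketch of the representation stage, reusing the simplified self-attention from the earlier sketch (repeated here so the block runs on its own): passing an utterance through L stacked blocks while keeping every intermediate output yields [U_i^0, ..., U_i^L], and likewise [R^0, ..., R^L] for the reply. Feed-forward sublayers and residual connections are omitted.

```python
# Stacked self-attention representations: keep the output of every layer so
# that lower layers can carry word-level and higher layers phrase/sentence-level
# dependencies.
import numpy as np

def self_attention(X: np.ndarray) -> np.ndarray:
    scores = X @ X.T / np.sqrt(X.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return (w / w.sum(axis=-1, keepdims=True)) @ X

def stacked_representations(U0: np.ndarray, num_layers: int) -> list:
    """U0: (n_words, d) word embeddings -> [U^0, U^1, ..., U^L]."""
    reps = [U0]                                  # U^0: the raw word embeddings
    for _ in range(num_layers):
        reps.append(self_attention(reps[-1]))    # U^1 .. U^L
    return reps

U_i = np.random.default_rng(3).normal(size=(50, 200))
reps = stacked_representations(U_i, num_layers=5)   # the talk settles on L = 5
print(len(reps), reps[-1].shape)                    # 6 (50, 200)
```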
  21. Result #4 Result
     - Accuracy drops slightly when the number of turns is small, but is stable overall.
     - The longer the sentence (the more information it contains), the more it benefits from the stacked layers.
     - Accuracy improved as more self-attention layers were stacked (up to ~5), so 5 layers were chosen.
     - Accuracy is low for short utterances, because they carry little information.
  22. Discussion #4 vs. BERT
     DAM:
     • Each utterance and the reply pass separately through the same stacked self-attention, so a representation per sentence is obtained.
     • The outputs of every layer are used.
     • Attention between (U1, r), (U2, r), (U3, r), ... is computed, then merged in one further step (3D conv); see the matching sketch after this item. Can this capture dependencies spread across several utterances and the reply? (It feels as if the problem the paper raises at the start is not really solved..)
     • Low computational cost; with a model such as RMM, representations of scripted replies can be precomputed.
     BERT:
     • All utterances and the reply are concatenated into a single BERT input, so no per-sentence representation is obtained.
     • Only the last layer's output is used.
     • Dependencies among all utterances and the reply are modeled directly by attention, but with limited data, wouldn't the model have too many degrees of freedom to train well?
     • High computational cost; reply representations cannot be precomputed (there is no per-sentence representation), which could be a problem in production..?
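For reference, a sketch of the matching step referred to in the DAM column above: per-layer similarity matrices between each context utterance and the reply are stacked into a 4-D "matching image", which DAM then aggregates with 3-D convolution and pooling into the final score g(c, r). Only the self-attention representations are shown here (DAM also stacks cross-attention ones), and the convolution itself is omitted.

```python
# Build the 4-D matching image: one (n_u x n_r) similarity matrix per
# (context utterance, layer) pair.
import numpy as np

def matching_image(context_reps: list, reply_reps: list) -> np.ndarray:
    """
    context_reps: per-utterance list of per-layer matrices, each (n_u, d)
    reply_reps:   per-layer list of matrices, each (n_r, d)
    returns:      tensor of shape (n_utterances, L+1, n_u, n_r)
    """
    return np.stack([
        np.stack([U_l @ R_l.T for U_l, R_l in zip(layers_u, reply_reps)])
        for layers_u in context_reps
    ])

L, d, n_u, n_r, n_utt = 5, 200, 50, 50, 3
rng = np.random.default_rng(4)
ctx = [[rng.normal(size=(n_u, d)) for _ in range(L + 1)] for _ in range(n_utt)]
rep = [rng.normal(size=(n_r, d)) for _ in range(L + 1)]
print(matching_image(ctx, rep).shape)   # (3, 6, 50, 50), fed to 3-D conv + pooling
```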
  23. Discussion #4 Takeaways
     • A chance to think again, in depth, about the Transformer and what self-attention really does.
     • A feel for the new directions being tried for multi-turn response selection.
     • A chance to step back and reassess the method we are currently experimenting with (BERT).