
EMNLP 2015 Reading Group: Effective Approaches to Attention-based Neural Machine Translation

tkng
October 24, 2015


Transcript

1. Effective Approaches to Attention-based Neural Machine Translation
   Authors: Minh-Thang Luong, Hieu Pham, Christopher D. Manning
   Presenter: 徳永拓之 (tkng)
   All figures are quoted from this paper.
   EMNLP 2015 reading group
2. The road to seq2seq (1)
   • Recurrent Continuous Translation Models (EMNLP 2013)
   • Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation (EMNLP 2014)
3. The road to seq2seq (2)
   • Sequence to Sequence Learning with Neural Networks (NIPS 2014)
   • Experiments showed that a comparatively simple stacked LSTM achieves strong performance
   • Three tricks are built in: beam search, feeding the source sentence in reverse order, and ensembling (a minimal sketch of the input reversal follows below)
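
   As a minimal sketch of the input-reversal trick from Sutskever et al. (2014): the source tokens are reversed before encoding, which shortens the distance between the first source words and the first target words. The helper name below is illustrative, not the authors' code.

       def prepare_pair(source_tokens, target_tokens):
           # Reverse the source sentence before encoding, as in
           # Sutskever et al. (2014); the target side is unchanged.
           return list(reversed(source_tokens)), target_tokens

       src, tgt = prepare_pair(["a", "b", "c"], ["x", "y", "z"])
       # src == ["c", "b", "a"], tgt == ["x", "y", "z"]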
4. Contributions of this paper
   • Proposes a new attention mechanism (local attention)
   • Takes a weighted sum of the encoder hidden states for the D words before and after position pt in the source sentence
   • The weights are computed in the same way as for global attention
   • Two variants are evaluated: local-m, where pt simply advances one position at a time, and local-p, where pt is predicted from the current decoder hidden state (see the sketch below)
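
   To make the local-p variant concrete, below is a minimal NumPy sketch following the paper's equations: the window center is predicted as pt = S * sigmoid(v_p^T tanh(W_p h_t)), alignment scores within the window [pt - D, pt + D] are computed as in global attention (the "general" score h_t^T W_a h_s), and the scores are reweighted by a Gaussian centered at pt with sigma = D/2. The function and variable names are my own, not the authors' code.

       import numpy as np

       def softmax(x):
           e = np.exp(x - x.max())
           return e / e.sum()

       def local_p_attention(h_t, h_src, W_a, W_p, v_p, D=10):
           # h_t: decoder hidden state, shape (d,)
           # h_src: encoder hidden states, shape (S, d)
           S = h_src.shape[0]
           # Predict the window center: pt = S * sigmoid(v_p^T tanh(W_p h_t))
           pt = S / (1.0 + np.exp(-(v_p @ np.tanh(W_p @ h_t))))
           lo, hi = max(0, int(pt) - D), min(S, int(pt) + D + 1)
           window = h_src[lo:hi]
           # Same alignment score as global attention ("general" scoring)
           align = softmax(window @ (W_a @ h_t))
           # Gaussian centered at pt (sigma = D/2) favors positions near pt
           sigma = D / 2.0
           positions = np.arange(lo, hi)
           align = align * np.exp(-((positions - pt) ** 2) / (2 * sigma ** 2))
           # Context vector: weighted sum of the windowed encoder states
           return align @ window

   The local-m variant drops the prediction step and simply sets pt to the current target position, keeping the same windowing and Gaussian reweighting.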