
Phrase-level Self-Attention Networks for Universal Sentence Encoding

Paper introduction
Nagaoka University of Technology, Tetsuhiro Katsuta

http://aclweb.org/anthology/D18-1408

katsutan

January 28, 2019


Transcript

  1. Phrase-level Self-Attention Networks for Universal Sentence Encoding
     Wei Wu, Houfeng Wang, Tianyu Liu, Shuming Ma
     Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3729–3738, Brussels, Belgium, 2018.
     Paper introduction: Nagaoka University of Technology, Tetsuhiro Katsuta
  2. Abstract
     • Proposes Phrase-level Self-Attention Networks (PSAN)
     • Self-attention is computed within phrases, so memory consumption stays low
     • A gated memory updating mechanism incorporates the tree structure, allowing word representations to be learned hierarchically
     • Achieves state-of-the-art results on a range of tasks with a small memory footprint (a sketch of both mechanisms follows below)
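The two mechanisms the abstract names can be made concrete with a minimal NumPy sketch. This is not the paper's exact formulation: the function names, the sigmoid gate parameters `W_g`/`b_g`, and the hand-picked phrase spans are all illustrative assumptions. It shows why restricting attention to phrases saves memory (score matrices are per-phrase rather than sentence-wide) and how a gate can blend old word states with new phrase-level context.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def phrase_self_attention(H, spans):
    """Scaled dot-product self-attention restricted to each phrase.

    H: (n, d) word vectors; spans: list of (start, end) phrase boundaries.
    Cost is O(sum(len_i^2) * d) instead of O(n^2 * d) for full attention.
    """
    d = H.shape[1]
    out = np.empty_like(H)
    for start, end in spans:
        P = H[start:end]                      # words of one phrase
        scores = P @ P.T / np.sqrt(d)         # intra-phrase similarities only
        out[start:end] = softmax(scores) @ P  # phrase-local context mixing
    return out

def gated_memory_update(memory, candidate, W_g, b_g):
    """Gated memory updating (sketch): a sigmoid gate decides, per
    dimension, how much of the new phrase-level representation to keep."""
    z = np.concatenate([memory, candidate], axis=-1) @ W_g + b_g
    g = 1.0 / (1.0 + np.exp(-z))              # sigmoid gate in (0, 1)
    return g * candidate + (1.0 - g) * memory

# Toy run: a 6-word sentence split into two phrases at one tree level.
rng = np.random.default_rng(0)
n, d = 6, 8
H = rng.standard_normal((n, d))
attended = phrase_self_attention(H, [(0, 3), (3, 6)])
H = gated_memory_update(H, attended,
                        rng.standard_normal((2 * d, d)), np.zeros(d))
print(H.shape)  # (6, 8): updated word representations
```

In PSAN the phrase boundaries presumably come from the sentence's syntactic structure, and the gated update is applied level by level up the tree; repeating these two steps over progressively coarser spans is what would yield the hierarchical word representations the slide mentions.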