Paper Introduction: Multi-Task Learning for Multiple Language Translation


Presented at the paper-reading session on 2018/04/20

Yumeto Inaoka

April 20, 2018

Transcript

  1. Multi-Task Learning for Multiple Language Translation. Daxiang Dong, Hua Wu,
     Wei He, Dianhai Yu and Haifeng Wang. In Proceedings of the 53rd Annual Meeting
     of the Association for Computational Linguistics, pages 1723–1732, 2015.
     Paper Introduction ('18/04/20), Natural Language Processing Laboratory,
     Nagaoka University of Technology. Yumeto Inaoka
  2. Objective Function
     Θ_src : parameters of the shared encoder
     Θ_trg^(T_p) : parameters of the decoder for the T_p-th target language
     N_p : number of parallel sentence pairs for the p-th language pair
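From these definitions, the training objective can be written roughly as follows. This is a sketch consistent with the slide's notation, not the paper's exact equation (the indexing may differ): the encoder parameters Θ_src are shared across all P language pairs, while each target language keeps its own decoder.

```latex
% Sketch: maximize the per-pair average log-likelihood over all P language
% pairs, with a shared encoder \Theta_{src} and per-target decoders.
\Theta^{*} = \arg\max_{\Theta}
  \sum_{p=1}^{P} \frac{1}{N_p} \sum_{i=1}^{N_p}
  \log p\!\left( y_i^{(p)} \,\middle|\, x_i^{(p)};\;
                 \Theta_{\mathrm{src}},\, \Theta_{\mathrm{trg}}^{T_p} \right)
```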
  3. Training Details
     • Initialization of all parameters is from a uniform distribution between −0.01 and 0.01.
     • We use stochastic gradient descent with the recently proposed learning-rate decay strategy AdaDelta (Zeiler, 2012).
     • The mini-batch size in our model is set to 50 so that the convergence speed is fast.
     • We train 1000 mini-batches of data in one language pair before switching to the next language pair.
     • For word-representation dimensionality, we use 1000 for both the source language and the target language.
     • The size of the hidden layer is set to 1000.
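The schedule above amounts to round-robin training: a shared encoder, one decoder per target language, AdaDelta updates, and 1000 mini-batches per language pair before switching. A minimal PyTorch-style sketch under those assumptions; the module layout, language set, vocabulary size, and batch iterator are illustrative, not the authors' code:

```python
from torch import nn, optim

EMB, HID, VOCAB = 1000, 1000, 30000   # dims from the slide; vocab size is an assumption
LANGS = ["fr", "es", "nl", "pt"]      # example target languages (assumption)

shared_encoder = nn.GRU(EMB, HID)                               # shared across all pairs
decoders = nn.ModuleDict({l: nn.GRU(EMB, HID) for l in LANGS})  # one decoder per target
projections = nn.ModuleDict({l: nn.Linear(HID, VOCAB) for l in LANGS})

params = (list(shared_encoder.parameters())
          + list(decoders.parameters())
          + list(projections.parameters()))
optimizer = optim.Adadelta(params)      # AdaDelta, as on the slide
criterion = nn.CrossEntropyLoss()       # negative log-likelihood over target words

def train_pair(lang, batch_iter, n_batches=1000):
    """Train n_batches mini-batches (size 50 in the paper) on one language pair."""
    for _ in range(n_batches):
        src, trg_in, trg_out = next(batch_iter)   # hypothetical iterator yielding
        optimizer.zero_grad()                     # src/trg_in: (T, 50, EMB) floats,
        _, h = shared_encoder(src)                # trg_out: (T, 50) word ids
        out, _ = decoders[lang](trg_in, h)        # pair-specific decoder, shared state
        logits = projections[lang](out)           # (T, 50, VOCAB)
        loss = criterion(logits.view(-1, VOCAB), trg_out.view(-1))
        loss.backward()
        optimizer.step()

# Round-robin over the language pairs, 1000 mini-batches at a time:
# for epoch in range(n_epochs):
#     for lang in LANGS:
#         train_pair(lang, batches[lang])         # `batches` is hypothetical
```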
  4. Model Analysis and Discussion
     • Compute the cosine similarity of word embeddings between a model trained on a single language pair and the multi-task model.
     • High-frequency words such as numbers are learned well by both, but overall the multi-task model's embeddings are of higher quality.
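This comparison boils down to ranking neighbor words by cosine similarity in each model's embedding space. A minimal sketch, assuming each model's embeddings are available as word-to-vector dicts; `single_pair_emb` and `multitask_emb` are hypothetical names:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def nearest(word, emb, k=5):
    """Top-k neighbors of `word` under cosine similarity.
    `emb` is a dict mapping words to numpy vectors (hypothetical)."""
    sims = {w: cosine(emb[word], v) for w, v in emb.items() if w != word}
    return sorted(sims, key=sims.get, reverse=True)[:k]

# Compare the neighbor lists produced by the two models' embeddings, e.g.:
#   nearest("2015", single_pair_emb)  vs.  nearest("2015", multitask_emb)
```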