Word substitution & unsupervised MT combined further improve performance

Word-level substitution is remarkably effective.

Training Data (BLEU for X→ENG)                                  AZE     BEL     GLG     SLK
                                                                (TUR)   (RUS)   (POR)   (CES)
Results from literature
   SDE (Wang et al., 2019)                                      12.89   18.71   31.16   29.16
   many-to-many (Aharoni et al., 2019)                          12.78   21.73   30.65   29.54
Standard NMT
 1  {S_LE ∪ S_HE, T_LE ∪ T_HE} (supervised MT)                  11.83   16.34   29.51   28.12
 2  {M_L, M_E} (unsupervised MT)                                 0.47    0.18    1.15    0.75
Standard Supervised Back-translation
 3  + {Ŝ^s_E→L, M_E}                                            11.84   15.72   29.19   29.79
 4  + {Ŝ^s_E→H, M_E}                                            12.46   16.40   30.07   30.60
Augmentation from HRL-ENG
 5  + {Ŝ^s_H→L, T_HE} (supervised MT)                           11.92   15.79   29.91   28.52
 6  + {Ŝ^u_H→L, T_HE} (unsupervised MT)                         11.86   13.83   29.80   28.69
 7  + {Ŝ^w_H→L, T_HE} (word subst.)                             14.87   23.56   32.02   29.60
 8  + {Ŝ^m_H→L, T_HE} (modified UMT)                            14.72   23.31   32.27   29.55
 9  + {Ŝ^w_H→L ∪ Ŝ^m_H→L, T_HE ∪ T_HE}                          15.24   24.25   32.30   30.00
Augmentation from ENG by pivoting
10  + {Ŝ^w_E→H→L, M_E} (word subst.)                            14.18   21.74   31.72   30.90
11  + {Ŝ^m_E→H→L, M_E} (modified UMT)                           13.71   19.94   31.39   30.22
Combinations
12  + {Ŝ^w_H→L ∪ Ŝ^w_E→H→L, T_HE ∪ M_E} (word subst.)           15.74   24.51   33.16   32.07
13  + {Ŝ^w_H→L ∪ Ŝ^m_H→L, T_HE ∪ T_HE}
    + {Ŝ^w_E→H→L ∪ Ŝ^m_E→H→L, M_E ∪ M_E}                        15.91   23.69   32.55   31.58

Table 2: Evaluation of translation performance over four language pairs. Rows 1 and 2 show pre-training BLEU
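For reference, the "word subst." rows (7, 9, 12, 13) all rely on dictionary-based word substitution: the HRL side of an HRL-ENG parallel corpus is rewritten word by word into pseudo-LRL using an induced bilingual lexicon, and the resulting pseudo LRL-ENG pairs are added as training data. The sketch below illustrates this idea; the lexicon, whitespace tokenization, and copy-through fallback for out-of-lexicon words are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of dictionary-based word substitution for data augmentation
# (the Ŝ^w_H→L rows): replace each HRL token with an LRL translation from an
# induced bilingual lexicon, keeping the ENG side unchanged.
# The lexicon contents and tokenization here are assumptions for illustration.

from typing import Dict, List, Tuple


def substitute_words(hrl_sentence: str, lexicon: Dict[str, str]) -> str:
    """Map an HRL sentence to a pseudo-LRL sentence, word by word.

    Tokens missing from the lexicon are copied through unchanged, a common
    fallback when the HRL and LRL are closely related.
    """
    tokens = hrl_sentence.split()
    return " ".join(lexicon.get(tok, tok) for tok in tokens)


def augment_corpus(
    hrl_eng_pairs: List[Tuple[str, str]],
    lexicon: Dict[str, str],
) -> List[Tuple[str, str]]:
    """Turn an HRL-ENG parallel corpus into pseudo LRL-ENG training pairs."""
    return [(substitute_words(hrl, lexicon), eng) for hrl, eng in hrl_eng_pairs]


if __name__ == "__main__":
    # Toy Turkish→Azerbaijani lexicon and corpus, for illustration only.
    toy_lexicon = {"ben": "mən", "kitap": "kitab", "okudum": "oxudum"}
    toy_corpus = [("ben kitap okudum", "i read a book")]
    print(augment_corpus(toy_corpus, toy_lexicon))
```

The pseudo LRL-ENG pairs produced this way can then be concatenated with the original training sets (and with the modified-UMT outputs), which is how the combined rows 9, 12, and 13 are formed.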