
WAT: The Workshop on Asian Translation
Evaluation Results


BLEU


# | Team | Task | Date/Time | DataID | BLEU (moses-tokenizer) | Method | Other Resources | System Description
1 | FBAI | ALT2my-en | 2019/07/27 14:36:58 | 3201 | 38.59 | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
2 | NICT | ALT2my-en | 2019/07/22 17:06:59 | 2816 | 30.15 | NMT | Yes | Single model + language model pre-training + back-translation
3 | FBAI | ALT2my-en | 2019/07/27 08:05:36 | 3148 | 26.75 | NMT | No | Ensemble of 5 models, (BT iter1 + Self-Training)
4 | NICT-4 | ALT2my-en | 2019/07/26 11:31:52 | 2977 | 24.75 | Other | Yes | Same as last year but with cleaner monolingual data
5 | sakura | ALT2my-en | 2021/04/20 23:35:56 | 5230 | 19.75 | NMT | No | Marian NMT with Attention
6 | UCSYNLP | ALT2my-en | 2019/07/29 13:51:40 | 3252 | 19.64 | NMT | No | NMT with Attention
7 | sakura | ALT2my-en | 2021/05/03 11:22:26 | 5990 | 18.70 | NMT | No | Marian NMT with Attention
8 | NICT | ALT2my-en | 2019/07/23 13:40:55 | 2854 | 18.51 | NMT | No | Single model + language model pre-training
9 | ORGANIZER | ALT2my-en | 2019/07/22 19:08:03 | 2826 | 14.85 | NMT | No | NMT with Attention
10 | ORGANIZER | ALT2my-en | 2019/07/24 16:17:22 | 2899 | 14.59 | Other | Yes | Online A
11 | YCC-MT2 | ALT2my-en | 2021/06/10 17:29:43 | 6457 | 13.01 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word, weights: 0.6, 0.4
12 | YCC-MT2 | ALT2my-en | 2021/06/10 16:51:57 | 6450 | 12.85 | NMT | No | UCSY+ALT, s2s-plus-transformer, my-en, syl-word, weights: 0.6, 0.4
13 | YCC-MT2 | ALT2my-en | 2021/06/10 17:15:49 | 6456 | 12.80 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word, weights: 0.5, 0.5
14 | YCC-MT2 | ALT2my-en | 2021/06/10 17:10:38 | 6455 | 12.77 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word
15 | YCC-MT2 | ALT2my-en | 2021/06/10 16:48:01 | 6449 | 12.72 | NMT | No | UCSY+ALT, s2s-plus-transformer, my-en, syl-word, weights: 0.5, 0.5
16 | YCC-MT2 | ALT2my-en | 2021/06/10 16:42:51 | 6448 | 12.48 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, syl-word
17 | YCC-MT2 | ALT2my-en | 2021/06/10 15:56:14 | 6439 | 11.86 | NMT | No | UCSY+ALT, s2s, my-en, syl-word
18 | YCC-MT2 | ALT2my-en | 2021/06/10 16:15:48 | 6443 | 11.50 | NMT | No | UCSY+ALT, s2s, my-en, word-word
19 | YCC-MT2 | ALT2my-en | 2021/06/10 16:03:28 | 6440 | 10.80 | NMT | No | UCSY+ALT, transformer, my-en, syl-word
20 | UCSMNLP | ALT2my-en | 2019/07/26 17:01:16 | 3022 | 10.70 | SMT | Yes |
21 | YCC-MT2 | ALT2my-en | 2021/06/10 16:20:49 | 6444 | 10.37 | NMT | No | UCSY+ALT, transformer, my-en, word-word
22 | YCC-MT2 | ALT2my-en | 2021/05/04 08:44:59 | 6181 | 8.04 | NMT | No | 2 model ensemble (RNN Attention + Transformer), --weights 0.6 0.4 (best BLEU score), Myanmar syllable to English translation
23 | YCC-MT2 | ALT2my-en | 2021/05/06 20:47:22 | 6398 | 7.92 | NMT | No | 2 model ensemble (RNN Attention + Transformer), --weights 0.5 0.5 (best BLEU score), Myanmar word to English translation, used in-house myWord segmenter (to be released)
24 | NECTEC | ALT2my-en | 2021/05/22 19:29:10 | 6400 | 6.72 | NMT | No | s2s (RNN-based attention), pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
25 | NECTEC | ALT2my-en | 2021/05/04 09:01:27 | 6188 | 6.24 | NMT | No | Transformer, pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
26 | NECTEC | ALT2my-en | 2021/05/22 19:45:14 | 6402 | 6.13 | NMT | No | Shared multi-source s2s: source1: string, source2: POS, target: string, used 2 GPUs
27 | NECTEC | ALT2my-en | 2021/05/22 19:37:14 | 6401 | 4.73 | NMT | No | Multi-source s2s: source1: string, source2: POS, target: string, used 2 GPUs
28 | NECTEC | ALT2my-en | 2021/05/04 09:31:57 | 6192 | 4.62 | NMT | No | Shared multi-source Transformer: source1: string, source2: POS, target: string, used 2 GPUs
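The BLEU scores above are corpus-level, computed on English output tokenized with the Moses tokenizer. As a rough illustration, the sketch below scores a hypothetical output file with the sacrebleu library, whose default "13a" tokenization approximates Moses tokenization; the file names are placeholders, not part of the WAT pipeline.

```python
# Minimal sketch: corpus-level BLEU with sacrebleu (approximate, since
# WAT's official pipeline applies the Moses tokenizer before scoring).
import sacrebleu

with open("system_output.en") as f:   # hypothetical file: one hypothesis per line
    hypotheses = [line.strip() for line in f]
with open("reference.en") as f:       # hypothetical file: one reference per line
    references = [line.strip() for line in f]

# corpus_bleu takes the hypotheses and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```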


RIBES


# | Team | Task | Date/Time | DataID | RIBES (moses-tokenizer) | Method | Other Resources | System Description
1 | FBAI | ALT2my-en | 2019/07/27 14:36:58 | 3201 | 0.840001 | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
2 | NICT | ALT2my-en | 2019/07/22 17:06:59 | 2816 | 0.791705 | NMT | Yes | Single model + language model pre-training + back-translation
3 | FBAI | ALT2my-en | 2019/07/27 08:05:36 | 3148 | 0.783571 | NMT | No | Ensemble of 5 models, (BT iter1 + Self-Training)
4 | NICT-4 | ALT2my-en | 2019/07/26 11:31:52 | 2977 | 0.760394 | Other | Yes | Same as last year but with cleaner monolingual data
5 | NICT | ALT2my-en | 2019/07/23 13:40:55 | 2854 | 0.744808 | NMT | No | Single model + language model pre-training
6 | sakura | ALT2my-en | 2021/04/20 23:35:56 | 5230 | 0.742698 | NMT | No | Marian NMT with Attention
7 | sakura | ALT2my-en | 2021/05/03 11:22:26 | 5990 | 0.736523 | NMT | No | Marian NMT with Attention
8 | UCSYNLP | ALT2my-en | 2019/07/29 13:51:40 | 3252 | 0.707789 | NMT | No | NMT with Attention
9 | ORGANIZER | ALT2my-en | 2019/07/22 19:08:03 | 2826 | 0.700166 | NMT | No | NMT with Attention
10 | YCC-MT2 | ALT2my-en | 2021/06/10 16:42:51 | 6448 | 0.692376 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, syl-word
11 | YCC-MT2 | ALT2my-en | 2021/06/10 16:48:01 | 6449 | 0.691281 | NMT | No | UCSY+ALT, s2s-plus-transformer, my-en, syl-word, weights: 0.5, 0.5
12 | YCC-MT2 | ALT2my-en | 2021/06/10 16:51:57 | 6450 | 0.689418 | NMT | No | UCSY+ALT, s2s-plus-transformer, my-en, syl-word, weights: 0.6, 0.4
13 | YCC-MT2 | ALT2my-en | 2021/06/10 17:15:49 | 6456 | 0.688797 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word, weights: 0.5, 0.5
14 | YCC-MT2 | ALT2my-en | 2021/06/10 17:29:43 | 6457 | 0.686109 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word, weights: 0.6, 0.4
15 | YCC-MT2 | ALT2my-en | 2021/06/10 17:10:38 | 6455 | 0.685502 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word
16 | YCC-MT2 | ALT2my-en | 2021/06/10 16:03:28 | 6440 | 0.673755 | NMT | No | UCSY+ALT, transformer, my-en, syl-word
17 | YCC-MT2 | ALT2my-en | 2021/06/10 15:56:14 | 6439 | 0.673532 | NMT | No | UCSY+ALT, s2s, my-en, syl-word
18 | YCC-MT2 | ALT2my-en | 2021/06/10 16:15:48 | 6443 | 0.670478 | NMT | No | UCSY+ALT, s2s, my-en, word-word
19 | YCC-MT2 | ALT2my-en | 2021/06/10 16:20:49 | 6444 | 0.664105 | NMT | No | UCSY+ALT, transformer, my-en, word-word
20 | YCC-MT2 | ALT2my-en | 2021/05/04 08:44:59 | 6181 | 0.630357 | NMT | No | 2 model ensemble (RNN Attention + Transformer), --weights 0.6 0.4 (best BLEU score), Myanmar syllable to English translation
21 | YCC-MT2 | ALT2my-en | 2021/05/06 20:47:22 | 6398 | 0.629755 | NMT | No | 2 model ensemble (RNN Attention + Transformer), --weights 0.5 0.5 (best BLEU score), Myanmar word to English translation, used in-house myWord segmenter (to be released)
22 | NECTEC | ALT2my-en | 2021/05/04 09:01:27 | 6188 | 0.620840 | NMT | No | Transformer, pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
23 | NECTEC | ALT2my-en | 2021/05/22 19:29:10 | 6400 | 0.616469 | NMT | No | s2s (RNN-based attention), pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
24 | NECTEC | ALT2my-en | 2021/05/22 19:45:14 | 6402 | 0.609560 | NMT | No | Shared multi-source s2s: source1: string, source2: POS, target: string, used 2 GPUs
25 | ORGANIZER | ALT2my-en | 2019/07/24 16:17:22 | 2899 | 0.602267 | Other | Yes | Online A
26 | NECTEC | ALT2my-en | 2021/05/04 09:31:57 | 6192 | 0.587155 | NMT | No | Shared multi-source Transformer: source1: string, source2: POS, target: string, used 2 GPUs
27 | NECTEC | ALT2my-en | 2021/05/22 19:37:14 | 6401 | 0.578146 | NMT | No | Multi-source s2s: source1: string, source2: POS, target: string, used 2 GPUs
28 | UCSMNLP | ALT2my-en | 2019/07/26 17:01:16 | 3022 | 0.570835 | SMT | Yes |
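RIBES (Isozaki et al., 2010) rewards correct word order, which matters for distant language pairs such as Myanmar-English; the official scorer is NTT's RIBES script. The sketch below is a simplified single-reference version, assuming the common simplification that only words occurring exactly once in both hypothesis and reference are aligned; the official script also disambiguates repeated words using n-gram context.

```python
import math
from itertools import combinations

def ribes(hypothesis: str, reference: str,
          alpha: float = 0.25, beta: float = 0.10) -> float:
    """Simplified sentence-level RIBES:
    NKT * precision**alpha * brevity_penalty**beta."""
    hyp, ref = hypothesis.split(), reference.split()
    # Align only words that occur exactly once in both sentences.
    positions = [ref.index(w) for w in hyp
                 if hyp.count(w) == 1 and ref.count(w) == 1]
    if len(positions) < 2:
        return 0.0
    pairs = list(combinations(positions, 2))
    # Fraction of aligned word pairs in reference order; since the
    # positions are distinct, this equals Kendall's tau rescaled to [0, 1].
    nkt = sum(a < b for a, b in pairs) / len(pairs)
    precision = len(positions) / len(hyp)               # unigram precision
    bp = min(1.0, math.exp(1.0 - len(ref) / len(hyp)))  # brevity penalty
    return nkt * precision ** alpha * bp ** beta

print(ribes("the cat sat on the mat", "on the mat the cat sat"))
```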


AMFM


# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | FBAI | ALT2my-en | 2019/07/27 14:36:58 | 3201 | 0.685200 | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
2 | NICT | ALT2my-en | 2019/07/22 17:06:59 | 2816 | 0.649050 | NMT | Yes | Single model + language model pre-training + back-translation
3 | FBAI | ALT2my-en | 2019/07/27 08:05:36 | 3148 | 0.627530 | NMT | No | Ensemble of 5 models, (BT iter1 + Self-Training)
4 | NICT-4 | ALT2my-en | 2019/07/26 11:31:52 | 2977 | 0.579570 | Other | Yes | Same as last year but with cleaner monolingual data
5 | NICT | ALT2my-en | 2019/07/23 13:40:55 | 2854 | 0.565430 | NMT | No | Single model + language model pre-training
6 | sakura | ALT2my-en | 2021/04/20 23:35:56 | 5230 | 0.562680 | NMT | No | Marian NMT with Attention
7 | sakura | ALT2my-en | 2021/05/03 11:22:26 | 5990 | 0.550430 | NMT | No | Marian NMT with Attention
8 | ORGANIZER | ALT2my-en | 2019/07/24 16:17:22 | 2899 | 0.549380 | Other | Yes | Online A
9 | UCSMNLP | ALT2my-en | 2019/07/26 17:01:16 | 3022 | 0.538280 | SMT | Yes |
10 | UCSYNLP | ALT2my-en | 2019/07/29 13:51:40 | 3252 | 0.532640 | NMT | No | NMT with Attention
11 | ORGANIZER | ALT2my-en | 2019/07/22 19:08:03 | 2826 | 0.467460 | NMT | No | NMT with Attention
12 | YCC-MT2 | ALT2my-en | 2021/06/10 16:03:28 | 6440 | 0.462440 | NMT | No | UCSY+ALT, transformer, my-en, syl-word
13 | YCC-MT2 | ALT2my-en | 2021/06/10 16:20:49 | 6444 | 0.461980 | NMT | No | UCSY+ALT, transformer, my-en, word-word
14 | YCC-MT2 | ALT2my-en | 2021/06/10 17:10:38 | 6455 | 0.439350 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word
15 | YCC-MT2 | ALT2my-en | 2021/06/10 16:48:01 | 6449 | 0.438520 | NMT | No | UCSY+ALT, s2s-plus-transformer, my-en, syl-word, weights: 0.5, 0.5
16 | YCC-MT2 | ALT2my-en | 2021/06/10 16:42:51 | 6448 | 0.437760 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, syl-word
17 | YCC-MT2 | ALT2my-en | 2021/06/10 17:15:49 | 6456 | 0.437300 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word, weights: 0.5, 0.5
18 | YCC-MT2 | ALT2my-en | 2021/06/10 16:51:57 | 6450 | 0.434960 | NMT | No | UCSY+ALT, s2s-plus-transformer, my-en, syl-word, weights: 0.6, 0.4
19 | YCC-MT2 | ALT2my-en | 2021/06/10 17:29:43 | 6457 | 0.432530 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, my-en, word-word, weights: 0.6, 0.4
20 | YCC-MT2 | ALT2my-en | 2021/06/10 15:56:14 | 6439 | 0.430120 | NMT | No | UCSY+ALT, s2s, my-en, syl-word
21 | YCC-MT2 | ALT2my-en | 2021/06/10 16:15:48 | 6443 | 0.425310 | NMT | No | UCSY+ALT, s2s, my-en, word-word
22 | NECTEC | ALT2my-en | 2021/05/04 09:01:27 | 6188 | 0.424640 | NMT | No | Transformer, pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
23 | YCC-MT2 | ALT2my-en | 2021/05/06 20:47:22 | 6398 | 0.420200 | NMT | No | 2 model ensemble (RNN Attention + Transformer), --weights 0.5 0.5 (best BLEU score), Myanmar word to English translation, used in-house myWord segmenter (to be released)
24 | YCC-MT2 | ALT2my-en | 2021/05/04 08:44:59 | 6181 | 0.407880 | NMT | No | 2 model ensemble (RNN Attention + Transformer), --weights 0.6 0.4 (best BLEU score), Myanmar syllable to English translation
25 | NECTEC | ALT2my-en | 2021/05/22 19:29:10 | 6400 | 0.395310 | NMT | No | s2s (RNN-based attention), pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
26 | NECTEC | ALT2my-en | 2021/05/04 09:31:57 | 6192 | 0.391710 | NMT | No | Shared multi-source Transformer: source1: string, source2: POS, target: string, used 2 GPUs
27 | NECTEC | ALT2my-en | 2021/05/22 19:45:14 | 6402 | 0.376140 | NMT | No | Shared multi-source s2s: source1: string, source2: POS, target: string, used 2 GPUs
28 | NECTEC | ALT2my-en | 2021/05/22 19:37:14 | 6401 | 0.357150 | NMT | No | Multi-source s2s: source1: string, source2: POS, target: string, used 2 GPUs
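AM-FM (Banchs et al., 2015) interpolates an adequacy metric (AM), the similarity between hypothesis and reference in a reduced latent semantic space, with a fluency metric (FM) derived from a target-side n-gram language model. The sketch below only illustrates that interpolation: the toy corpus, the SVD dimensionality, the weight lambda, and the fluency stub are all placeholder assumptions, not WAT's official implementation.

```python
# Toy sketch of the AM-FM combination; every concrete choice here
# (corpus, dimensionality, lambda, fluency stub) is a placeholder.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import CountVectorizer

corpus = ["the committee approved the draft law",
          "parliament debated the new budget",
          "the minister answered questions from reporters"]
vectorizer = CountVectorizer().fit(corpus)
svd = TruncatedSVD(n_components=2).fit(vectorizer.transform(corpus))

def am(hyp: str, ref: str) -> float:
    """Adequacy: cosine similarity in the SVD-reduced bag-of-words space."""
    h, r = svd.transform(vectorizer.transform([hyp, ref]))
    return float(np.dot(h, r) / (np.linalg.norm(h) * np.linalg.norm(r) + 1e-9))

def fm(hyp: str) -> float:
    """Fluency stub: a real system would use a normalized n-gram LM score."""
    return 0.5

def am_fm(hyp: str, ref: str, lam: float = 0.5) -> float:
    return lam * am(hyp, ref) + (1.0 - lam) * fm(hyp)

print(am_fm("parliament approved the budget",
            "the parliament passed the budget"))
```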


HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | sakura | ALT2my-en | 2021/04/20 23:35:56 | 5230 | Underway | NMT | No | Marian NMT with Attention
2 | sakura | ALT2my-en | 2021/05/03 11:22:26 | 5990 | Underway | NMT | No | Marian NMT with Attention
3 | NECTEC | ALT2my-en | 2021/05/04 09:01:27 | 6188 | Underway | NMT | No | Transformer, pos2string Myanmar-English, used myPOS ver. 2.0 POS tagger, used 2 GPUs
4 | NECTEC | ALT2my-en | 2021/05/04 09:31:57 | 6192 | Underway | NMT | No | Shared multi-source Transformer: source1: string, source2: POS, target: string, used 2 GPUs


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NICT | ALT2my-en | 2019/07/22 17:06:59 | 2816 | Underway | NMT | Yes | Single model + language model pre-training + back-translation
2 | NICT-4 | ALT2my-en | 2019/07/26 11:31:52 | 2977 | Underway | Other | Yes | Same as last year but with cleaner monolingual data
3 | UCSMNLP | ALT2my-en | 2019/07/26 17:01:16 | 3022 | Underway | SMT | Yes |
4 | FBAI | ALT2my-en | 2019/07/27 08:05:36 | 3148 | Underway | NMT | No | Ensemble of 5 models, (BT iter1 + Self-Training)
5 | FBAI | ALT2my-en | 2019/07/27 14:36:58 | 3201 | Underway | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
6 | UCSYNLP | ALT2my-en | 2019/07/29 13:51:40 | 3252 | Underway | NMT | No | NMT with Attention


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02