
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


(All entries were scored under the myseg segmentation; the juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, and kmseg columns were empty and are omitted.)

# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description
1 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | 39.25 | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
2 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | 36.77 | NMT | No | ensemble 5 models, (BT iter1 + Self-Training)
3 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | 31.33 | Other | Yes | Same as last year but with cleaner monolingual data
4 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | 30.84 | NMT | Yes | Single model + language model pre-training
5 | NICT-4 | ALT2en-my | 2019/07/23 14:09:56 | 2855 | 30.79 | NMT | Yes |
6 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | 29.62 | NMT | No | Marian NMT with Attention
7 | sakura | ALT2en-my | 2021/04/24 17:04:17 | 5464 | 29.25 | NMT | No | Marian NMT with Attention
8 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | 28.20 | SMT | Yes |
9 | NICT | ALT2en-my | 2019/07/22 17:16:44 | 2819 | 25.88 | NMT | No | Single model + language model pre-training
10 | YCC-MT1 | ALT2en-my | 2021/05/25 11:19:17 | 6420 | 25.48 | SMT | No | UCSY+ALT, Hybrid Hiero, -xml-input exclusive result, en-my
11 | YCC-MT1 | ALT2en-my | 2021/05/25 08:52:37 | 6408 | 25.11 | SMT | No | UCSY+ALT, Hybrid SMT, -xml-input inclusive result
12 | ORGANIZER | ALT2en-my | 2019/07/22 19:12:36 | 2827 | 22.53 | NMT | No | NMT with Attention
13 | YCC-MT1 | ALT2en-my | 2021/05/05 21:00:09 | 6394 | 21.02 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid Hiero with XML markup technique, -xml-input exclusive result
14 | YCC-MT1 | ALT2en-my | 2021/05/05 21:07:48 | 6395 | 21.02 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid Hiero with XML markup technique, -xml-input inclusive result
15 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | 20.88 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid PBSMT with XML markup technique, -xml-input inclusive result
16 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | 20.86 | NMT | No | NMT with attention
17 | ORGANIZER | ALT2en-my | 2019/07/24 16:18:02 | 2900 | 20.63 | Other | Yes | Online A
18 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | 20.13 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid OSM with XML markup technique, -xml-input inclusive result
19 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | 19.94 | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 0 + 1
20 | YCC-MT2 | ALT2en-my | 2021/06/10 16:31:31 | 6446 | 19.92 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weight: 0.5 0.5
21 | sarah | ALT2en-my | 2019/07/27 08:05:39 | 3147 | 19.75 | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 1
22 | YCC-MT2 | ALT2en-my | 2021/06/10 16:27:49 | 6445 | 19.75 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, en-my, word-syl, weight: 0.5 0.5
23 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | 19.64 | NMT | No | Single model. Transformer. 5 layers, 2 heads, random 0
24 | YCC-MT2 | ALT2en-my | 2021/06/10 16:36:46 | 6447 | 19.57 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weight: 0.6 0.4
25 | YCC-MT2 | ALT2en-my | 2021/06/10 15:51:14 | 6438 | 18.71 | NMT | No | UCSY+ALT, transformer, en-my, word-syl
26 | YCC-MT2 | ALT2en-my | 2021/06/10 15:40:57 | 6437 | 17.24 | NMT | No | UCSY+ALT, s2s, en-my, word-syl
27 | YCC-MT2 | ALT2en-my | 2021/06/10 17:00:39 | 6452 | 16.46 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.5 0.5
28 | YCC-MT2 | ALT2en-my | 2021/06/10 16:56:45 | 6451 | 16.41 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.4 0.6
29 | YCC-MT2 | ALT2en-my | 2021/06/10 17:07:11 | 6454 | 16.23 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.6 0.4
30 | YCC-MT2 | ALT2en-my | 2021/06/10 16:07:54 | 6441 | 15.38 | NMT | No | UCSY+ALT, s2s, en-my, word-word
31 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | 14.82 | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.5 0.5 result (i.e. best BLEU score), English-Myanmar (Syllable Unit)
32 | YCC-MT2 | ALT2en-my | 2021/06/10 16:11:44 | 6442 | 14.66 | NMT | No | UCSY+ALT, transformer, en-my, word-word
33 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | 14.02 | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.4 0.6 result (i.e. best BLEU score), English-tree to Myanmar-Syllable Translation
34 | NECTEC | ALT2en-my | 2021/05/04 08:54:38 | 6186 | 13.90 | NMT | No | Shared-multi-transformer Archi: Transformer, source1: string, source2: tree, target: string, used 2 GPUs
35 | NECTEC | ALT2en-my | 2021/05/04 08:50:27 | 6182 | 12.94 | NMT | No | Multi-source Archi: Transformer, source1: string, source2: tree, target: string, used 2 GPUs
36 | NECTEC | ALT2en-my | 2021/05/04 09:06:13 | 6189 | 12.82 | NMT | No | Multi-source Archi: s2s, source1: string, source2: tree, target: string, used 2 GPUs
37 | NECTEC | ALT2en-my | 2021/05/04 08:52:28 | 6183 | 12.72 | NMT | No | Transformer tree2string, used NLTK tree parser for English, used 2 GPUs
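The BLEU column above comes from WAT's official evaluation scripts. As a rough illustration of what a (smoothed, sentence-level) BLEU computation looks like — not the official WAT scorer, and the function name and smoothing choice here are our own — a minimal sketch:

```python
# Illustrative sentence-level BLEU with add-one smoothing.
# This is a simplified sketch, NOT WAT's official scoring pipeline,
# which applies its own tokenization/segmentation (e.g. myseg) first.
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(hypothesis, reference, max_n=4):
    hyp, ref = hypothesis.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        hyp_counts = Counter(ngrams(hyp, n))
        ref_counts = Counter(ngrams(ref, n))
        # clipped n-gram matches against the reference
        overlap = sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        # add-one smoothing so a zero n-gram match does not zero the score
        log_prec += math.log((overlap + 1) / (total + 1))
    # brevity penalty punishes hypotheses shorter than the reference
    bp = min(1.0, math.exp(1 - len(ref) / max(len(hyp), 1)))
    return bp * math.exp(log_prec / max_n)
```

The leaderboard reports corpus-level scores scaled to 0–100; this sketch returns a 0–1 value for a single sentence pair.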


RIBES


(All entries were scored under the myseg segmentation; the juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, and kmseg columns were empty and are omitted.)

# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description
1 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | 0.782472 | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
2 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | 0.769902 | NMT | No | ensemble 5 models, (BT iter1 + Self-Training)
3 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | 0.760247 | NMT | Yes | Single model + language model pre-training
4 | sakura | ALT2en-my | 2021/04/24 17:04:17 | 5464 | 0.751248 | NMT | No | Marian NMT with Attention
5 | NICT-4 | ALT2en-my | 2019/07/23 14:09:56 | 2855 | 0.750353 | NMT | Yes |
6 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | 0.739320 | NMT | No | Marian NMT with Attention
7 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | 0.732998 | Other | Yes | Same as last year but with cleaner monolingual data
8 | NICT | ALT2en-my | 2019/07/22 17:16:44 | 2819 | 0.727299 | NMT | No | Single model + language model pre-training
9 | sarah | ALT2en-my | 2019/07/27 08:05:39 | 3147 | 0.719219 | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 1
10 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | 0.716243 | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 0 + 1
11 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | 0.711348 | NMT | No | Single model. Transformer. 5 layers, 2 heads, random 0
12 | YCC-MT2 | ALT2en-my | 2021/06/10 16:36:46 | 6447 | 0.702318 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weight: 0.6 0.4
13 | YCC-MT2 | ALT2en-my | 2021/06/10 16:31:31 | 6446 | 0.701542 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weight: 0.5 0.5
14 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | 0.698507 | NMT | No | NMT with attention
15 | YCC-MT2 | ALT2en-my | 2021/06/10 16:27:49 | 6445 | 0.698334 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, en-my, word-syl, weight: 0.5 0.5
16 | YCC-MT2 | ALT2en-my | 2021/06/10 16:56:45 | 6451 | 0.687596 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.4 0.6
17 | YCC-MT2 | ALT2en-my | 2021/06/10 17:00:39 | 6452 | 0.685076 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.5 0.5
18 | ORGANIZER | ALT2en-my | 2019/07/24 16:18:02 | 2900 | 0.680398 | Other | Yes | Online A
19 | YCC-MT2 | ALT2en-my | 2021/06/10 15:51:14 | 6438 | 0.680358 | NMT | No | UCSY+ALT, transformer, en-my, word-syl
20 | YCC-MT2 | ALT2en-my | 2021/06/10 17:07:11 | 6454 | 0.679639 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.6 0.4
21 | YCC-MT2 | ALT2en-my | 2021/06/10 15:40:57 | 6437 | 0.675465 | NMT | No | UCSY+ALT, s2s, en-my, word-syl
22 | YCC-MT2 | ALT2en-my | 2021/06/10 16:11:44 | 6442 | 0.674845 | NMT | No | UCSY+ALT, transformer, en-my, word-word
23 | ORGANIZER | ALT2en-my | 2019/07/22 19:12:36 | 2827 | 0.670375 | NMT | No | NMT with Attention
24 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | 0.659582 | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.5 0.5 result (i.e. best BLEU score), English-Myanmar (Syllable Unit)
25 | YCC-MT2 | ALT2en-my | 2021/06/10 16:07:54 | 6441 | 0.659550 | NMT | No | UCSY+ALT, s2s, en-my, word-word
26 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | 0.639593 | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.4 0.6 result (i.e. best BLEU score), English-tree to Myanmar-Syllable Translation
27 | NECTEC | ALT2en-my | 2021/05/04 09:06:13 | 6189 | 0.625476 | NMT | No | Multi-source Archi: s2s, source1: string, source2: tree, target: string, used 2 GPUs
28 | NECTEC | ALT2en-my | 2021/05/04 08:52:28 | 6183 | 0.610951 | NMT | No | Transformer tree2string, used NLTK tree parser for English, used 2 GPUs
29 | NECTEC | ALT2en-my | 2021/05/04 08:54:38 | 6186 | 0.608810 | NMT | No | Shared-multi-transformer Archi: Transformer, source1: string, source2: tree, target: string, used 2 GPUs
30 | YCC-MT1 | ALT2en-my | 2021/05/25 11:19:17 | 6420 | 0.607339 | SMT | No | UCSY+ALT, Hybrid Hiero, -xml-input exclusive result, en-my
31 | NECTEC | ALT2en-my | 2021/05/04 08:50:27 | 6182 | 0.598012 | NMT | No | Multi-source Archi: Transformer, source1: string, source2: tree, target: string, used 2 GPUs
32 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | 0.596811 | SMT | Yes |
33 | YCC-MT1 | ALT2en-my | 2021/05/05 21:00:09 | 6394 | 0.588198 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid Hiero with XML markup technique, -xml-input exclusive result
34 | YCC-MT1 | ALT2en-my | 2021/05/05 21:07:48 | 6395 | 0.588198 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid Hiero with XML markup technique, -xml-input inclusive result
35 | YCC-MT1 | ALT2en-my | 2021/05/25 08:52:37 | 6408 | 0.567187 | SMT | No | UCSY+ALT, Hybrid SMT, -xml-input inclusive result
36 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | 0.553319 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid PBSMT with XML markup technique, -xml-input inclusive result
37 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | 0.545962 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid OSM with XML markup technique, -xml-input inclusive result
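RIBES rewards translations that preserve the reference's word order: it multiplies a rank-correlation term (normalized Kendall's tau over aligned word positions) by a unigram-precision term and a brevity penalty. A simplified sketch, assuming the hypothesis-to-reference word alignment (the hard part, which the official tool computes itself) is already given; the alpha=0.25 and beta=0.10 defaults follow common usage, but this is not the official implementation:

```python
# Simplified RIBES-style score over a precomputed one-to-one alignment.
# ref_ranks[i] is the reference position aligned to the i-th aligned
# hypothesis word, in hypothesis order.
import itertools
import math

def ribes(ref_ranks, hyp_len, ref_len, alpha=0.25, beta=0.10):
    n = len(ref_ranks)
    if n <= 1 or hyp_len == 0:
        return 0.0
    pairs = list(itertools.combinations(range(n), 2))
    # normalized Kendall's tau: fraction of word pairs in the same order
    concordant = sum(1 for i, j in pairs if ref_ranks[i] < ref_ranks[j])
    nkt = concordant / len(pairs)
    precision = n / hyp_len                  # unigram precision over aligned words
    bp = min(1.0, math.exp(1 - ref_len / hyp_len))
    return nkt * (precision ** alpha) * (bp ** beta)
```

A perfectly monotone alignment of equal-length sentences scores 1.0; a fully reversed word order scores 0.0, matching the intuition behind the 0–1 RIBES values in the table above.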


AMFM


(The tokenizer columns were unused for AMFM; only the score is shown.)

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | 0.789410 | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel
2 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | 0.782200 | NMT | No | ensemble 5 models, (BT iter1 + Self-Training)
3 | sakura | ALT2en-my | 2021/04/24 17:04:17 | 5464 | 0.757860 | NMT | No | Marian NMT with Attention
4 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | 0.757590 | NMT | Yes | Single model + language model pre-training
5 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | 0.756290 | Other | Yes | Same as last year but with cleaner monolingual data
6 | NICT-4 | ALT2en-my | 2019/07/23 14:09:56 | 2855 | 0.753200 | NMT | Yes |
7 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | 0.752340 | NMT | No | Marian NMT with Attention
8 | NICT | ALT2en-my | 2019/07/22 17:16:44 | 2819 | 0.746480 | NMT | No | Single model + language model pre-training
9 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | 0.714740 | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 0 + 1
10 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | 0.713620 | NMT | No | Single model. Transformer. 5 layers, 2 heads, random 0
11 | sarah | ALT2en-my | 2019/07/27 08:05:39 | 3147 | 0.712580 | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 1
12 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | 0.693420 | SMT | Yes |
13 | YCC-MT1 | ALT2en-my | 2021/05/25 08:52:37 | 6408 | 0.689400 | SMT | No | UCSY+ALT, Hybrid SMT, -xml-input inclusive result
14 | YCC-MT2 | ALT2en-my | 2021/06/10 17:00:39 | 6452 | 0.688820 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.5 0.5
15 | YCC-MT2 | ALT2en-my | 2021/06/10 16:31:31 | 6446 | 0.688340 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weight: 0.5 0.5
16 | YCC-MT2 | ALT2en-my | 2021/06/10 16:56:45 | 6451 | 0.688240 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.4 0.6
17 | YCC-MT2 | ALT2en-my | 2021/06/10 16:36:46 | 6447 | 0.687470 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weight: 0.6 0.4
18 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | 0.687260 | NMT | No | NMT with attention
19 | YCC-MT1 | ALT2en-my | 2021/05/25 11:19:17 | 6420 | 0.684110 | SMT | No | UCSY+ALT, Hybrid Hiero, -xml-input exclusive result, en-my
20 | YCC-MT2 | ALT2en-my | 2021/06/10 17:07:11 | 6454 | 0.681180 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weight: 0.6 0.4
21 | YCC-MT2 | ALT2en-my | 2021/06/10 16:27:49 | 6445 | 0.681020 | NMT | No | UCSY+ALT, 2 model ensembling, s2s-plus-transformer, en-my, word-syl, weight: 0.5 0.5
22 | YCC-MT2 | ALT2en-my | 2021/06/10 16:11:44 | 6442 | 0.679630 | NMT | No | UCSY+ALT, transformer, en-my, word-word
23 | YCC-MT2 | ALT2en-my | 2021/06/10 15:51:14 | 6438 | 0.678260 | NMT | No | UCSY+ALT, transformer, en-my, word-syl
24 | YCC-MT2 | ALT2en-my | 2021/06/10 15:40:57 | 6437 | 0.674100 | NMT | No | UCSY+ALT, s2s, en-my, word-syl
25 | YCC-MT2 | ALT2en-my | 2021/06/10 16:07:54 | 6441 | 0.672950 | NMT | No | UCSY+ALT, s2s, en-my, word-word
26 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | 0.663840 | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.5 0.5 result (i.e. best BLEU score), English-Myanmar (Syllable Unit)
27 | ORGANIZER | ALT2en-my | 2019/07/22 19:12:36 | 2827 | 0.661920 | NMT | No | NMT with Attention
28 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | 0.655310 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid PBSMT with XML markup technique, -xml-input inclusive result
29 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | 0.654820 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid OSM with XML markup technique, -xml-input inclusive result
30 | NECTEC | ALT2en-my | 2021/05/04 08:50:27 | 6182 | 0.654780 | NMT | No | Multi-source Archi: Transformer, source1: string, source2: tree, target: string, used 2 GPUs
31 | YCC-MT1 | ALT2en-my | 2021/05/05 21:00:09 | 6394 | 0.653840 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid Hiero with XML markup technique, -xml-input exclusive result
32 | YCC-MT1 | ALT2en-my | 2021/05/05 21:07:48 | 6395 | 0.653840 | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid Hiero with XML markup technique, -xml-input inclusive result
33 | NECTEC | ALT2en-my | 2021/05/04 08:52:28 | 6183 | 0.645760 | NMT | No | Transformer tree2string, used NLTK tree parser for English, used 2 GPUs
34 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | 0.645470 | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.4 0.6 result (i.e. best BLEU score), English-tree to Myanmar-Syllable Translation
35 | NECTEC | ALT2en-my | 2021/05/04 08:54:38 | 6186 | 0.645260 | NMT | No | Shared-multi-transformer Archi: Transformer, source1: string, source2: tree, target: string, used 2 GPUs
36 | NECTEC | ALT2en-my | 2021/05/04 09:06:13 | 6189 | 0.638870 | NMT | No | Multi-source Archi: s2s, source1: string, source2: tree, target: string, used 2 GPUs
37 | ORGANIZER | ALT2en-my | 2019/07/24 16:18:02 | 2900 | 0.622430 | Other | Yes | Online A


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | Underway | NMT | No | Marian NMT with Attention
2 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | Underway | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.5 0.5 result (i.e. best BLEU score), English-Myanmar (Syllable Unit)
3 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | Underway | NMT | No | 2 Model Ensemble (RNN Attention + Transformer), --weights 0.4 0.6 result (i.e. best BLEU score), English-tree to Myanmar-Syllable Translation
4 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | Underway | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid PBSMT with XML markup technique, -xml-input inclusive result
5 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | Underway | SMT | No | UCSY only, manually extracted transliterated Burmese word (including combined words) and English word pairs from the whole ALT Myanmar corpus, run Hybrid OSM with XML markup technique, -xml-input inclusive result


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | Underway | NMT | Yes | Single model + language model pre-training
2 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | Underway | NMT | No | NMT with attention
3 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | Underway | Other | Yes | Same as last year but with cleaner monolingual data
4 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | Underway | SMT | Yes |
5 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | Underway | NMT | No | Single model. Transformer. 5 layers, 2 heads, random 0
6 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | Underway | NMT | No | Ensemble. Transformer. 5 layers, 2 heads, random 0 + 0 + 1
7 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | Underway | NMT | No | ensemble 5 models, (BT iter1 + Self-Training)
8 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | Underway | NMT | Yes | 5 model ensemble, (BT iter1 + Self-Training) + (BT+ST) iter2 + fine tuning + noisy channel


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, scores (including both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02