
WAT

The Workshop on Asian Translation
Evaluation Results

[EVALUATION RESULTS TOP] | [BLEU] | [RIBES] | [AMFM] | [HUMAN (WAT2022)] | [HUMAN (WAT2021)] | [HUMAN (WAT2020)] | [HUMAN (WAT2019)] | [HUMAN (WAT2018)] | [HUMAN (WAT2017)] | [HUMAN (WAT2016)] | [HUMAN (WAT2015)] | [HUMAN (WAT2014)] | [EVALUATION RESULTS USAGE POLICY]

BLEU


(Scores are reported under the myseg segmentation; the other segmentation columns of the original table (juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, kmseg) are empty for this task and have been omitted.)

# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description
1 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | 30.84 | NMT | Yes | Single model + language model pre-training
2 | NICT | ALT2en-my | 2019/07/22 17:16:44 | 2819 | 25.88 | NMT | No | Single model + language model pre-training
3 | ORGANIZER | ALT2en-my | 2019/07/22 19:12:36 | 2827 | 22.53 | NMT | No | NMT with Attention
4 | NICT-4 | ALT2en-my | 2019/07/23 14:09:56 | 2855 | 30.79 | NMT | Yes |
5 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | 20.86 | NMT | No | NMT with attention
6 | ORGANIZER | ALT2en-my | 2019/07/24 16:18:02 | 2900 | 20.63 | Other | Yes | Online A
7 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | 31.33 | Other | Yes | Same as last year but with cleaner monolingual data
8 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | 28.20 | SMT | Yes |
9 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | 19.64 | NMT | No | Single model. Transformer, 5 layers, 2 heads, random 0
10 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | 19.94 | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 0 + 1
11 | sarah | ALT2en-my | 2019/07/27 08:05:39 | 3147 | 19.75 | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 1
12 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | 36.77 | NMT | No | Ensemble of 5 models (BT iter1 + Self-Training)
13 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | 39.25 | NMT | Yes | 5-model ensemble: (BT iter1 + Self-Training) + (BT+ST) iter2 + fine-tuning + noisy channel
14 | sakura | ALT2en-my | 2021/04/24 17:04:17 | 5464 | 29.25 | NMT | No | Marian NMT with Attention
15 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | 29.62 | NMT | No | Marian NMT with Attention
16 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | 14.82 | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.5 0.5 result (i.e. the best BLEU score), English-Myanmar (syllable unit)
17 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | 14.02 | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.4 0.6 result (i.e. the best BLEU score), English-tree to Myanmar-syllable translation
18 | NECTEC | ALT2en-my | 2021/05/04 08:50:27 | 6182 | 12.94 | NMT | No | Multi-source architecture (Transformer): source1 = string, source2 = tree, target = string; used 2 GPUs
19 | NECTEC | ALT2en-my | 2021/05/04 08:52:28 | 6183 | 12.72 | NMT | No | Transformer tree2string; used the NLTK tree parser for English; used 2 GPUs
20 | NECTEC | ALT2en-my | 2021/05/04 08:54:38 | 6186 | 13.90 | NMT | No | Shared multi-transformer architecture (Transformer): source1 = string, source2 = tree, target = string; used 2 GPUs
21 | NECTEC | ALT2en-my | 2021/05/04 09:06:13 | 6189 | 12.82 | NMT | No | Multi-source architecture (s2s): source1 = string, source2 = tree, target = string; used 2 GPUs
22 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | 20.88 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid PBSMT with the XML markup technique; -xml-input inclusive result
23 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | 20.13 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid OSM with the XML markup technique; -xml-input inclusive result
24 | YCC-MT1 | ALT2en-my | 2021/05/05 21:00:09 | 6394 | 21.02 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid Hiero with the XML markup technique; -xml-input exclusive result
25 | YCC-MT1 | ALT2en-my | 2021/05/05 21:07:48 | 6395 | 21.02 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid Hiero with the XML markup technique; -xml-input inclusive result
26 | YCC-MT1 | ALT2en-my | 2021/05/25 08:52:37 | 6408 | 25.11 | SMT | No | UCSY+ALT; hybrid SMT; -xml-input inclusive result
27 | YCC-MT1 | ALT2en-my | 2021/05/25 11:19:17 | 6420 | 25.48 | SMT | No | UCSY+ALT; hybrid Hiero; -xml-input exclusive result; en-my
28 | YCC-MT2 | ALT2en-my | 2021/06/10 15:40:57 | 6437 | 17.24 | NMT | No | UCSY+ALT, s2s, en-my, word-syl
29 | YCC-MT2 | ALT2en-my | 2021/06/10 15:51:14 | 6438 | 18.71 | NMT | No | UCSY+ALT, transformer, en-my, word-syl
30 | YCC-MT2 | ALT2en-my | 2021/06/10 16:07:54 | 6441 | 15.38 | NMT | No | UCSY+ALT, s2s, en-my, word-word
31 | YCC-MT2 | ALT2en-my | 2021/06/10 16:11:44 | 6442 | 14.66 | NMT | No | UCSY+ALT, transformer, en-my, word-word
32 | YCC-MT2 | ALT2en-my | 2021/06/10 16:27:49 | 6445 | 19.75 | NMT | No | UCSY+ALT, 2-model ensembling, s2s-plus-transformer, en-my, word-syl, weights 0.5 0.5
33 | YCC-MT2 | ALT2en-my | 2021/06/10 16:31:31 | 6446 | 19.92 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weights 0.5 0.5
34 | YCC-MT2 | ALT2en-my | 2021/06/10 16:36:46 | 6447 | 19.57 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weights 0.6 0.4
35 | YCC-MT2 | ALT2en-my | 2021/06/10 16:56:45 | 6451 | 16.41 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.4 0.6
36 | YCC-MT2 | ALT2en-my | 2021/06/10 17:00:39 | 6452 | 16.46 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.5 0.5
37 | YCC-MT2 | ALT2en-my | 2021/06/10 17:07:11 | 6454 | 16.23 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.6 0.4
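
For readers who want to compute a comparable number for their own output, the sketch below shows a corpus-level BLEU computation with the sacrebleu Python library. It is a minimal illustration, not WAT's official pipeline: the file names are placeholders, and the official evaluation applies its own preprocessing (for this task, the myseg Myanmar segmentation) before scoring.

```python
# Minimal sketch: corpus-level BLEU with sacrebleu (pip install sacrebleu).
# File names are hypothetical placeholders; WAT's official pipeline applies
# its own segmentation (e.g. the "myseg" scheme for Myanmar) before scoring.
import sacrebleu

# One hypothesis and one reference per line, already segmented the same way.
with open("hypotheses.my", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("reference.my", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# sacrebleu takes a list of reference streams; there is a single reference
# here. tokenize="none" keeps the pre-segmented text untouched.
bleu = sacrebleu.corpus_bleu(hypotheses, [references], tokenize="none")
print(f"BLEU = {bleu.score:.2f}")  # 0-100 scale, as in the table above
```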


RIBES


(Scores are reported under the myseg segmentation; the other segmentation columns of the original table (juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, kmseg) are empty for this task and have been omitted.)

# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description
1 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | 0.760247 | NMT | Yes | Single model + language model pre-training
2 | NICT | ALT2en-my | 2019/07/22 17:16:44 | 2819 | 0.727299 | NMT | No | Single model + language model pre-training
3 | ORGANIZER | ALT2en-my | 2019/07/22 19:12:36 | 2827 | 0.670375 | NMT | No | NMT with Attention
4 | NICT-4 | ALT2en-my | 2019/07/23 14:09:56 | 2855 | 0.750353 | NMT | Yes |
5 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | 0.698507 | NMT | No | NMT with attention
6 | ORGANIZER | ALT2en-my | 2019/07/24 16:18:02 | 2900 | 0.680398 | Other | Yes | Online A
7 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | 0.732998 | Other | Yes | Same as last year but with cleaner monolingual data
8 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | 0.596811 | SMT | Yes |
9 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | 0.711348 | NMT | No | Single model. Transformer, 5 layers, 2 heads, random 0
10 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | 0.716243 | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 0 + 1
11 | sarah | ALT2en-my | 2019/07/27 08:05:39 | 3147 | 0.719219 | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 1
12 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | 0.769902 | NMT | No | Ensemble of 5 models (BT iter1 + Self-Training)
13 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | 0.782472 | NMT | Yes | 5-model ensemble: (BT iter1 + Self-Training) + (BT+ST) iter2 + fine-tuning + noisy channel
14 | sakura | ALT2en-my | 2021/04/24 17:04:17 | 5464 | 0.751248 | NMT | No | Marian NMT with Attention
15 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | 0.739320 | NMT | No | Marian NMT with Attention
16 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | 0.659582 | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.5 0.5 result (i.e. the best BLEU score), English-Myanmar (syllable unit)
17 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | 0.639593 | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.4 0.6 result (i.e. the best BLEU score), English-tree to Myanmar-syllable translation
18 | NECTEC | ALT2en-my | 2021/05/04 08:50:27 | 6182 | 0.598012 | NMT | No | Multi-source architecture (Transformer): source1 = string, source2 = tree, target = string; used 2 GPUs
19 | NECTEC | ALT2en-my | 2021/05/04 08:52:28 | 6183 | 0.610951 | NMT | No | Transformer tree2string; used the NLTK tree parser for English; used 2 GPUs
20 | NECTEC | ALT2en-my | 2021/05/04 08:54:38 | 6186 | 0.608810 | NMT | No | Shared multi-transformer architecture (Transformer): source1 = string, source2 = tree, target = string; used 2 GPUs
21 | NECTEC | ALT2en-my | 2021/05/04 09:06:13 | 6189 | 0.625476 | NMT | No | Multi-source architecture (s2s): source1 = string, source2 = tree, target = string; used 2 GPUs
22 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | 0.553319 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid PBSMT with the XML markup technique; -xml-input inclusive result
23 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | 0.545962 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid OSM with the XML markup technique; -xml-input inclusive result
24 | YCC-MT1 | ALT2en-my | 2021/05/05 21:00:09 | 6394 | 0.588198 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid Hiero with the XML markup technique; -xml-input exclusive result
25 | YCC-MT1 | ALT2en-my | 2021/05/05 21:07:48 | 6395 | 0.588198 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid Hiero with the XML markup technique; -xml-input inclusive result
26 | YCC-MT1 | ALT2en-my | 2021/05/25 08:52:37 | 6408 | 0.567187 | SMT | No | UCSY+ALT; hybrid SMT; -xml-input inclusive result
27 | YCC-MT1 | ALT2en-my | 2021/05/25 11:19:17 | 6420 | 0.607339 | SMT | No | UCSY+ALT; hybrid Hiero; -xml-input exclusive result; en-my
28 | YCC-MT2 | ALT2en-my | 2021/06/10 15:40:57 | 6437 | 0.675465 | NMT | No | UCSY+ALT, s2s, en-my, word-syl
29 | YCC-MT2 | ALT2en-my | 2021/06/10 15:51:14 | 6438 | 0.680358 | NMT | No | UCSY+ALT, transformer, en-my, word-syl
30 | YCC-MT2 | ALT2en-my | 2021/06/10 16:07:54 | 6441 | 0.659550 | NMT | No | UCSY+ALT, s2s, en-my, word-word
31 | YCC-MT2 | ALT2en-my | 2021/06/10 16:11:44 | 6442 | 0.674845 | NMT | No | UCSY+ALT, transformer, en-my, word-word
32 | YCC-MT2 | ALT2en-my | 2021/06/10 16:27:49 | 6445 | 0.698334 | NMT | No | UCSY+ALT, 2-model ensembling, s2s-plus-transformer, en-my, word-syl, weights 0.5 0.5
33 | YCC-MT2 | ALT2en-my | 2021/06/10 16:31:31 | 6446 | 0.701542 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weights 0.5 0.5
34 | YCC-MT2 | ALT2en-my | 2021/06/10 16:36:46 | 6447 | 0.702318 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weights 0.6 0.4
35 | YCC-MT2 | ALT2en-my | 2021/06/10 16:56:45 | 6451 | 0.687596 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.4 0.6
36 | YCC-MT2 | ALT2en-my | 2021/06/10 17:00:39 | 6452 | 0.685076 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.5 0.5
37 | YCC-MT2 | ALT2en-my | 2021/06/10 17:07:11 | 6454 | 0.679639 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.6 0.4
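
RIBES evaluates word-order similarity: it computes a rank correlation between the word positions of the hypothesis and the reference, scaled by a precision term, which gives a score between 0 and 1. The sketch below uses NLTK's implementation on toy token lists; it is an illustration only, and the organizers' official RIBES tool may differ in parameter settings.

```python
# Minimal sketch: corpus-level RIBES with NLTK (pip install nltk).
# The token lists are toy examples, not data from this evaluation.
from nltk.translate.ribes_score import corpus_ribes

hypotheses = [["this", "is", "a", "test"],
              ["another", "sentence", "to", "score"]]
# One list of references per hypothesis (a single reference each here).
references = [[["this", "is", "a", "test"]],
              [["another", "sentence", "for", "scoring"]]]

# corpus_ribes averages the per-sentence RIBES scores.
score = corpus_ribes(references, hypotheses)
print(f"RIBES = {score:.6f}")  # 0-1 scale, as in the table above
```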


AMFM


(Only the populated score column from the original layout is shown; the remaining columns are unused for this task and have been omitted.)

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | 0.757590 | NMT | Yes | Single model + language model pre-training
2 | NICT | ALT2en-my | 2019/07/22 17:16:44 | 2819 | 0.746480 | NMT | No | Single model + language model pre-training
3 | ORGANIZER | ALT2en-my | 2019/07/22 19:12:36 | 2827 | 0.661920 | NMT | No | NMT with Attention
4 | NICT-4 | ALT2en-my | 2019/07/23 14:09:56 | 2855 | 0.753200 | NMT | Yes |
5 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | 0.687260 | NMT | No | NMT with attention
6 | ORGANIZER | ALT2en-my | 2019/07/24 16:18:02 | 2900 | 0.622430 | Other | Yes | Online A
7 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | 0.756290 | Other | Yes | Same as last year but with cleaner monolingual data
8 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | 0.693420 | SMT | Yes |
9 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | 0.713620 | NMT | No | Single model. Transformer, 5 layers, 2 heads, random 0
10 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | 0.714740 | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 0 + 1
11 | sarah | ALT2en-my | 2019/07/27 08:05:39 | 3147 | 0.712580 | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 1
12 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | 0.782200 | NMT | No | Ensemble of 5 models (BT iter1 + Self-Training)
13 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | 0.789410 | NMT | Yes | 5-model ensemble: (BT iter1 + Self-Training) + (BT+ST) iter2 + fine-tuning + noisy channel
14 | sakura | ALT2en-my | 2021/04/24 17:04:17 | 5464 | 0.757860 | NMT | No | Marian NMT with Attention
15 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | 0.752340 | NMT | No | Marian NMT with Attention
16 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | 0.663840 | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.5 0.5 result (i.e. the best BLEU score), English-Myanmar (syllable unit)
17 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | 0.645470 | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.4 0.6 result (i.e. the best BLEU score), English-tree to Myanmar-syllable translation
18 | NECTEC | ALT2en-my | 2021/05/04 08:50:27 | 6182 | 0.654780 | NMT | No | Multi-source architecture (Transformer): source1 = string, source2 = tree, target = string; used 2 GPUs
19 | NECTEC | ALT2en-my | 2021/05/04 08:52:28 | 6183 | 0.645760 | NMT | No | Transformer tree2string; used the NLTK tree parser for English; used 2 GPUs
20 | NECTEC | ALT2en-my | 2021/05/04 08:54:38 | 6186 | 0.645260 | NMT | No | Shared multi-transformer architecture (Transformer): source1 = string, source2 = tree, target = string; used 2 GPUs
21 | NECTEC | ALT2en-my | 2021/05/04 09:06:13 | 6189 | 0.638870 | NMT | No | Multi-source architecture (s2s): source1 = string, source2 = tree, target = string; used 2 GPUs
22 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | 0.655310 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid PBSMT with the XML markup technique; -xml-input inclusive result
23 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | 0.654820 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid OSM with the XML markup technique; -xml-input inclusive result
24 | YCC-MT1 | ALT2en-my | 2021/05/05 21:00:09 | 6394 | 0.653840 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid Hiero with the XML markup technique; -xml-input exclusive result
25 | YCC-MT1 | ALT2en-my | 2021/05/05 21:07:48 | 6395 | 0.653840 | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid Hiero with the XML markup technique; -xml-input inclusive result
26 | YCC-MT1 | ALT2en-my | 2021/05/25 08:52:37 | 6408 | 0.689400 | SMT | No | UCSY+ALT; hybrid SMT; -xml-input inclusive result
27 | YCC-MT1 | ALT2en-my | 2021/05/25 11:19:17 | 6420 | 0.684110 | SMT | No | UCSY+ALT; hybrid Hiero; -xml-input exclusive result; en-my
28 | YCC-MT2 | ALT2en-my | 2021/06/10 15:40:57 | 6437 | 0.674100 | NMT | No | UCSY+ALT, s2s, en-my, word-syl
29 | YCC-MT2 | ALT2en-my | 2021/06/10 15:51:14 | 6438 | 0.678260 | NMT | No | UCSY+ALT, transformer, en-my, word-syl
30 | YCC-MT2 | ALT2en-my | 2021/06/10 16:07:54 | 6441 | 0.672950 | NMT | No | UCSY+ALT, s2s, en-my, word-word
31 | YCC-MT2 | ALT2en-my | 2021/06/10 16:11:44 | 6442 | 0.679630 | NMT | No | UCSY+ALT, transformer, en-my, word-word
32 | YCC-MT2 | ALT2en-my | 2021/06/10 16:27:49 | 6445 | 0.681020 | NMT | No | UCSY+ALT, 2-model ensembling, s2s-plus-transformer, en-my, word-syl, weights 0.5 0.5
33 | YCC-MT2 | ALT2en-my | 2021/06/10 16:31:31 | 6446 | 0.688340 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weights 0.5 0.5
34 | YCC-MT2 | ALT2en-my | 2021/06/10 16:36:46 | 6447 | 0.687470 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-syl, weights 0.6 0.4
35 | YCC-MT2 | ALT2en-my | 2021/06/10 16:56:45 | 6451 | 0.688240 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.4 0.6
36 | YCC-MT2 | ALT2en-my | 2021/06/10 17:00:39 | 6452 | 0.688820 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.5 0.5
37 | YCC-MT2 | ALT2en-my | 2021/06/10 17:07:11 | 6454 | 0.681180 | NMT | No | UCSY+ALT, s2s-plus-transformer, en-my, word-word, weights 0.6 0.4
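
AMFM (Adequacy-Fluency Metrics) combines an adequacy component (AM), which compares the hypothesis and reference in a continuous vector space, with a fluency component (FM) from a target-side language model. The sketch below illustrates only the final interpolation step, under the assumption that both components are precomputed and normalized to [0, 1]; the component models and the weight actually used by the organizers are not specified on this page.

```python
# Illustrative sketch of the AM-FM interpolation step only. The adequacy (am)
# and fluency (fm) inputs are assumed to be precomputed and normalized to
# [0, 1]; the 0.5 weight is a placeholder, not the organizers' setting.
def amfm_score(am: float, fm: float, lam: float = 0.5) -> float:
    """Weighted combination of adequacy (AM) and fluency (FM) components."""
    return lam * am + (1.0 - lam) * fm

print(f"AMFM = {amfm_score(0.81, 0.70):.6f}")  # toy values -> 0.755000
```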


HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | sakura | ALT2en-my | 2021/05/03 21:39:24 | 6031 | Underway | NMT | No | Marian NMT with Attention
2 | YCC-MT2 | ALT2en-my | 2021/05/04 08:06:57 | 6175 | Underway | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.5 0.5 result (i.e. the best BLEU score), English-Myanmar (syllable unit)
3 | YCC-MT2 | ALT2en-my | 2021/05/04 08:27:40 | 6178 | Underway | NMT | No | 2-model ensemble (RNN attention + Transformer), --weights 0.4 0.6 result (i.e. the best BLEU score), English-tree to Myanmar-syllable translation
4 | YCC-MT1 | ALT2en-my | 2021/05/04 10:06:25 | 6195 | Underway | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid PBSMT with the XML markup technique; -xml-input inclusive result
5 | YCC-MT1 | ALT2en-my | 2021/05/04 11:08:40 | 6201 | Underway | SMT | No | UCSY only; manually extracted transliterated Burmese words (including combined words) and English word pairs from the whole ALT Myanmar corpus; hybrid OSM with the XML markup technique; -xml-input inclusive result


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NICT | ALT2en-my | 2019/07/22 17:15:42 | 2818 | Underway | NMT | Yes | Single model + language model pre-training
2 | UCSYNLP | ALT2en-my | 2019/07/23 15:28:07 | 2858 | Underway | NMT | No | NMT with attention
3 | NICT-4 | ALT2en-my | 2019/07/26 11:33:55 | 2979 | Underway | Other | Yes | Same as last year but with cleaner monolingual data
4 | UCSMNLP | ALT2en-my | 2019/07/26 17:00:18 | 3021 | Underway | SMT | Yes |
5 | sarah | ALT2en-my | 2019/07/27 07:28:01 | 3137 | Underway | NMT | No | Single model. Transformer, 5 layers, 2 heads, random 0
6 | sarah | ALT2en-my | 2019/07/27 08:01:01 | 3146 | Underway | NMT | No | Ensemble. Transformer, 5 layers, 2 heads, random 0 + 0 + 1
7 | FBAI | ALT2en-my | 2019/07/27 08:10:07 | 3149 | Underway | NMT | No | Ensemble of 5 models (BT iter1 + Self-Training)
8 | FBAI | ALT2en-my | 2019/07/27 14:38:04 | 3203 | Underway | NMT | Yes | 5-model ensemble: (BT iter1 + Self-Training) + (BT+ST) iter2 + fine-tuning + noisy channel


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,

you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02