
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


Scores are from the moses-tokenizer column; the juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, and kmseg columns carry no scores for this task and are omitted.

# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description
1 | ORGANIZER | JPCja-en | 2016/07/13 16:54:09 | 977 | 30.80 | SMT | No | Phrase-based SMT
2 | ORGANIZER | JPCja-en | 2016/07/13 17:00:31 | 979 | 32.23 | SMT | No | Hierarchical Phrase-based SMT
3 | ORGANIZER | JPCja-en | 2016/07/13 17:12:09 | 980 | 34.40 | SMT | No | String-to-Tree SMT
4 | ORGANIZER | JPCja-en | 2016/07/26 10:15:25 | 1035 | 35.77 | Other | Yes | Online A (2016)
5 | ORGANIZER | JPCja-en | 2016/07/26 13:43:21 | 1051 | 16.00 | Other | Yes | Online B (2016)
6 | Kyoto-U | JPCja-en | 2016/07/27 17:15:10 | 1057 | 33.85 | EBMT | No | KyotoEBMT 2016 w/o reranking
7 | NICT-2 | JPCja-en | 2016/08/04 17:26:27 | 1080 | 35.68 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
8 | ORGANIZER | JPCja-en | 2016/08/05 14:51:27 | 1088 | 21.00 | Other | Yes | RBMT C (2016)
9 | ORGANIZER | JPCja-en | 2016/08/05 15:18:46 | 1090 | 21.57 | Other | Yes | RBMT A (2016)
10 | ORGANIZER | JPCja-en | 2016/08/05 15:59:14 | 1095 | 18.38 | Other | Yes | RBMT B (2016)
11 | NICT-2 | JPCja-en | 2016/08/05 17:58:40 | 1103 | 36.06 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
12 | bjtu_nlp | JPCja-en | 2016/08/16 12:34:36 | 1149 | 41.62 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
13 | ORGANIZER | JPCja-en | 2016/11/16 11:06:50 | 1338 | 49.35 | NMT | Yes | Online A (2016/11/14)
14 | JAPIO | JPCja-en | 2017/07/25 18:17:30 | 1455 | 44.07 | NMT | No | OpenNMT(dbrnn)
15 | u-tkb | JPCja-en | 2017/07/26 12:53:50 | 1472 | 37.31 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
16 | JAPIO | JPCja-en | 2017/07/28 22:22:15 | 1574 | 49.00 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
17 | JAPIO | JPCja-en | 2017/07/29 10:49:01 | 1578 | 48.08 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
18 | CUNI | JPCja-en | 2017/07/31 22:34:51 | 1666 | 38.29 | SMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding
19 | ORGANIZER | JPCja-en | 2018/08/15 18:38:51 | 1965 | 44.08 | NMT | No | NMT with Attention
20 | sarah | JPCja-en | 2019/07/26 11:18:48 | 2969 | 47.07 | NMT | No | Transformer, ensemble of 4 models
21 | KNU_Hyundai | JPCja-en | 2019/07/27 12:27:32 | 3189 | 48.71 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble
22 | goku20 | JPCja-en | 2020/09/21 12:15:47 | 4094 | 46.71 | NMT | No | mBART pre-training transformer, single model
23 | goku20 | JPCja-en | 2020/09/22 00:08:48 | 4108 | 47.34 | NMT | No | mBART pre-training transformer, ensemble of 3 models
24 | Bering Lab | JPCja-en | 2021/04/23 13:05:24 | 5419 | 48.44 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
25 | tpt_wat | JPCja-en | 2021/04/27 02:32:27 | 5710 | 45.86 | NMT | No | Base Transformer model with separate vocab, size 8k
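For reference, the BLEU figures above are corpus-level modified n-gram precision with a brevity penalty, reported on a 0-100 scale. A minimal sketch with the sacrebleu package follows; the example sentences are invented for illustration, and the "13a" tokenizer is an assumption standing in for the Moses-style tokenization used for the scores in this table.

    # Minimal corpus-level BLEU sketch (pip install sacrebleu).
    # The sentences are illustrative; "13a" approximates Moses-style tokenization.
    import sacrebleu

    hypotheses = ["the battery is fixed to the housing with screws"]
    references = ["the battery is fixed to the housing by screws"]

    bleu = sacrebleu.corpus_bleu(hypotheses, [references], tokenize="13a")
    print(f"BLEU = {bleu.score:.2f}")  # 0-100 scale, as in the table above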


RIBES


Scores are from the moses-tokenizer column; the juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, and kmseg columns carry no scores for this task and are omitted.

# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description
1 | ORGANIZER | JPCja-en | 2016/07/13 16:54:09 | 977 | 0.730056 | SMT | No | Phrase-based SMT
2 | ORGANIZER | JPCja-en | 2016/07/13 17:00:31 | 979 | 0.763030 | SMT | No | Hierarchical Phrase-based SMT
3 | ORGANIZER | JPCja-en | 2016/07/13 17:12:09 | 980 | 0.793483 | SMT | No | String-to-Tree SMT
4 | ORGANIZER | JPCja-en | 2016/07/26 10:15:25 | 1035 | 0.803661 | Other | Yes | Online A (2016)
5 | ORGANIZER | JPCja-en | 2016/07/26 13:43:21 | 1051 | 0.688004 | Other | Yes | Online B (2016)
6 | Kyoto-U | JPCja-en | 2016/07/27 17:15:10 | 1057 | 0.800841 | EBMT | No | KyotoEBMT 2016 w/o reranking
7 | NICT-2 | JPCja-en | 2016/08/04 17:26:27 | 1080 | 0.824398 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
8 | ORGANIZER | JPCja-en | 2016/08/05 14:51:27 | 1088 | 0.755017 | Other | Yes | RBMT C (2016)
9 | ORGANIZER | JPCja-en | 2016/08/05 15:18:46 | 1090 | 0.750381 | Other | Yes | RBMT A (2016)
10 | ORGANIZER | JPCja-en | 2016/08/05 15:59:14 | 1095 | 0.710992 | Other | Yes | RBMT B (2016)
11 | NICT-2 | JPCja-en | 2016/08/05 17:58:40 | 1103 | 0.825420 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
12 | bjtu_nlp | JPCja-en | 2016/08/16 12:34:36 | 1149 | 0.851975 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
13 | ORGANIZER | JPCja-en | 2016/11/16 11:06:50 | 1338 | 0.878342 | NMT | Yes | Online A (2016/11/14)
14 | JAPIO | JPCja-en | 2017/07/25 18:17:30 | 1455 | 0.863385 | NMT | No | OpenNMT(dbrnn)
15 | u-tkb | JPCja-en | 2017/07/26 12:53:50 | 1472 | 0.841136 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
16 | JAPIO | JPCja-en | 2017/07/28 22:22:15 | 1574 | 0.878298 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
17 | JAPIO | JPCja-en | 2017/07/29 10:49:01 | 1578 | 0.873093 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
18 | CUNI | JPCja-en | 2017/07/31 22:34:51 | 1666 | 0.837425 | SMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding
19 | ORGANIZER | JPCja-en | 2018/08/15 18:38:51 | 1965 | 0.859168 | NMT | No | NMT with Attention
20 | sarah | JPCja-en | 2019/07/26 11:18:48 | 2969 | 0.870997 | NMT | No | Transformer, ensemble of 4 models
21 | KNU_Hyundai | JPCja-en | 2019/07/27 12:27:32 | 3189 | 0.879063 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble
22 | goku20 | JPCja-en | 2020/09/21 12:15:47 | 4094 | 0.869896 | NMT | No | mBART pre-training transformer, single model
23 | goku20 | JPCja-en | 2020/09/22 00:08:48 | 4108 | 0.872474 | NMT | No | mBART pre-training transformer, ensemble of 3 models
24 | Bering Lab | JPCja-en | 2021/04/23 13:05:24 | 5419 | 0.879638 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
25 | tpt_wat | JPCja-en | 2021/04/27 02:32:27 | 5710 | 0.866360 | NMT | No | Base Transformer model with separate vocab, size 8k
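RIBES (Rank-based Intuitive Bilingual Evaluation Score) rewards correct word order: it computes Kendall's tau over the reference positions of the hypothesis words, normalizes it to [0, 1], and scales it by a unigram-precision term and a brevity penalty. The sketch below is a simplified version, not the official WAT scorer: it aligns only words that occur exactly once in both sentences, while the official tool also disambiguates repeated words by context; alpha = 0.25 and beta = 0.10 are the usual defaults.

    # Simplified RIBES sketch: normalized Kendall's tau * precision^alpha * BP^beta.
    # Only words unique in both sentences are aligned; the official scorer does more.
    import math

    def ribes(hypothesis: str, reference: str,
              alpha: float = 0.25, beta: float = 0.10) -> float:
        hyp, ref = hypothesis.split(), reference.split()
        # Reference positions of hypothesis words, in hypothesis order.
        ranks = [ref.index(w) for w in hyp
                 if hyp.count(w) == 1 and ref.count(w) == 1]
        if len(ranks) < 2:
            return 0.0
        pairs = [(i, j) for i in range(len(ranks)) for j in range(i + 1, len(ranks))]
        concordant = sum(1 for i, j in pairs if ranks[i] < ranks[j])
        tau = 2.0 * concordant / len(pairs) - 1.0
        nkt = (tau + 1.0) / 2.0                      # normalized Kendall's tau
        precision = len(ranks) / len(hyp)            # unigram precision over aligned words
        bp = min(1.0, math.exp(1.0 - len(ref) / len(hyp)))  # brevity penalty
        return nkt * (precision ** alpha) * (bp ** beta)

    print(f"{ribes('a cat sat on mat', 'on mat a cat sat'):.6f}")  # 0.400000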


AMFM


The original table's per-tokenizer columns are all unused for AMFM; the single populated score column is shown.

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | ORGANIZER | JPCja-en | 2016/07/13 16:54:09 | 977 | 0.664830 | SMT | No | Phrase-based SMT
2 | ORGANIZER | JPCja-en | 2016/07/13 17:00:31 | 979 | 0.672500 | SMT | No | Hierarchical Phrase-based SMT
3 | ORGANIZER | JPCja-en | 2016/07/13 17:12:09 | 980 | 0.672760 | SMT | No | String-to-Tree SMT
4 | ORGANIZER | JPCja-en | 2016/07/26 10:15:25 | 1035 | 0.673950 | Other | Yes | Online A (2016)
5 | ORGANIZER | JPCja-en | 2016/07/26 13:43:21 | 1051 | 0.486450 | Other | Yes | Online B (2016)
6 | Kyoto-U | JPCja-en | 2016/07/27 17:15:10 | 1057 | 0.672040 | EBMT | No | KyotoEBMT 2016 w/o reranking
7 | NICT-2 | JPCja-en | 2016/08/04 17:26:27 | 1080 | 0.667540 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
8 | ORGANIZER | JPCja-en | 2016/08/05 14:51:27 | 1088 | 0.519210 | Other | Yes | RBMT C (2016)
9 | ORGANIZER | JPCja-en | 2016/08/05 15:18:46 | 1090 | 0.521230 | Other | Yes | RBMT A (2016)
10 | ORGANIZER | JPCja-en | 2016/08/05 15:59:14 | 1095 | 0.518110 | Other | Yes | RBMT B (2016)
11 | NICT-2 | JPCja-en | 2016/08/05 17:58:40 | 1103 | 0.672890 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
12 | bjtu_nlp | JPCja-en | 2016/08/16 12:34:36 | 1149 | 0.690750 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
13 | ORGANIZER | JPCja-en | 2016/11/16 11:06:50 | 1338 | 0.722590 | NMT | Yes | Online A (2016/11/14)
14 | JAPIO | JPCja-en | 2017/07/25 18:17:30 | 1455 | 0.699930 | NMT | No | OpenNMT(dbrnn)
15 | u-tkb | JPCja-en | 2017/07/26 12:53:50 | 1472 | 0.697290 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
16 | JAPIO | JPCja-en | 2017/07/28 22:22:15 | 1574 | 0.724710 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
17 | JAPIO | JPCja-en | 2017/07/29 10:49:01 | 1578 | 0.715560 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
18 | CUNI | JPCja-en | 2017/07/31 22:34:51 | 1666 | 0.681520 | SMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding
19 | ORGANIZER | JPCja-en | 2018/08/15 18:38:51 | 1965 | 0.699460 | NMT | No | NMT with Attention
20 | sarah | JPCja-en | 2019/07/26 11:18:48 | 2969 | 0.000000 | NMT | No | Transformer, ensemble of 4 models
21 | KNU_Hyundai | JPCja-en | 2019/07/27 12:27:32 | 3189 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble
22 | goku20 | JPCja-en | 2020/09/21 12:15:47 | 4094 | 0.000000 | NMT | No | mBART pre-training transformer, single model
23 | goku20 | JPCja-en | 2020/09/22 00:08:48 | 4108 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models
24 | Bering Lab | JPCja-en | 2021/04/23 13:05:24 | 5419 | 0.576681 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
25 | tpt_wat | JPCja-en | 2021/04/27 02:32:27 | 5710 | 0.569295 | NMT | No | Base Transformer model with separate vocab, size 8k
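AM-FM (Adequacy-Fluency Metrics) scores a translation on two axes: an adequacy model (AM) compares the source and the translation in a cross-lingual semantic space, and a fluency model (FM) scores the translation with a target-side language model; the two are merged into a single 0-1 score. The sketch below shows only a generic weighted-mean combination; the weight lam = 0.5 and the component values are illustrative assumptions, not WAT's actual configuration.

    # Generic AM-FM combination sketch. lam, am, and fm are illustrative values;
    # the official metric defines its own components and combination weights.
    def amfm(am: float, fm: float, lam: float = 0.5) -> float:
        return lam * am + (1.0 - lam) * fm

    print(amfm(am=0.70, fm=0.65))  # 0.675, on the 0-1 scale used above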


HUMAN (WAT2022)


(no entries)


HUMAN (WAT2021)


(no entries)


HUMAN (WAT2020)


(no entries)


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | sarah | JPCja-en | 2019/07/26 11:18:48 | 2969 | Underway | NMT | No | Transformer, ensemble of 4 models
2 | KNU_Hyundai | JPCja-en | 2019/07/27 12:27:32 | 3189 | Underway | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble


HUMAN (WAT2018)


(no entries)


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | JAPIO | JPCja-en | 2017/07/28 22:22:15 | 1574 | 68.500 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
2 | JAPIO | JPCja-en | 2017/07/29 10:49:01 | 1578 | 67.000 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
3 | CUNI | JPCja-en | 2017/07/31 22:34:51 | 1666 | 58.000 | SMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding
4 | u-tkb | JPCja-en | 2017/07/26 12:53:50 | 1472 | 51.500 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | ORGANIZER | JPCja-en | 2016/11/16 11:06:50 | 1338 | 71.500 | NMT | Yes | Online A (2016/11/14)
2 | bjtu_nlp | JPCja-en | 2016/08/16 12:34:36 | 1149 | 41.500 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
3 | ORGANIZER | JPCja-en | 2016/07/26 10:15:25 | 1035 | 32.250 | Other | Yes | Online A (2016)
4 | NICT-2 | JPCja-en | 2016/08/04 17:26:27 | 1080 | 25.000 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
5 | NICT-2 | JPCja-en | 2016/08/05 17:58:40 | 1103 | 24.250 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
6 | ORGANIZER | JPCja-en | 2016/08/05 15:18:46 | 1090 | 23.750 | Other | Yes | RBMT A (2016)
7 | ORGANIZER | JPCja-en | 2016/07/13 17:12:09 | 980 | 23.000 | SMT | No | String-to-Tree SMT
8 | ORGANIZER | JPCja-en | 2016/07/13 17:00:31 | 979 | 8.750 | SMT | No | Hierarchical Phrase-based SMT


HUMAN (WAT2015)


(no entries)


HUMAN (WAT2014)


(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02