
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


All systems below were scored on English output tokenized with the Moses tokenizer. The remaining tokenizer columns of the original table (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) contain no scores (empty or 0.00) for this task and are omitted.

| # | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | cvit | HINDEN hi-en | 2019/05/27 16:04:36 | 2681 | 22.91 | NMT | Yes | massive-multi + bt |
| 2 | cvit | HINDEN hi-en | 2019/03/15 01:23:17 | 2643 | 22.62 | NMT | Yes | massive-multi |
| 3 | XMUNLP | HINDEN hi-en | 2017/07/27 23:00:46 | 1511 | 22.44 | NMT | No | ensemble of 4 nmt models + monolingual data |
| 4 | ORGANIZER | HINDEN hi-en | 2016/07/26 10:04:53 | 1031 | 21.37 | Other | Yes | Online A (2016) |
| 5 | cvit | HINDEN hi-en | 2018/11/06 15:51:54 | 2563 | 21.04 | NMT | Yes | |
| 6 | cvit | HINDEN hi-en | 2019/03/15 01:33:22 | 2645 | 20.66 | NMT | Yes | massive-multi + ft |
| 7 | cvit | HINDEN hi-en | 2018/09/14 13:21:46 | 2331 | 20.63 | NMT | Yes | ConvS2S Model Uses External Data |
| 8 | XMUNLP | HINDEN hi-en | 2017/07/26 22:54:46 | 1488 | 20.61 | NMT | No | single nmt model + monolingual data |
| 9 | NICT-5 | HINDEN hi-en | 2019/07/23 17:36:36 | 2865 | 19.06 | NMT | No | HiEn and TaEn mixed training NMT model. Transformer on t2t (Hi-En is external data) |
| 10 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:49:36 | 3119 | 18.64 | NMT | No | Transformer Model with Backtranslation |
| 11 | CUNI | HINDEN hi-en | 2018/09/15 03:10:30 | 2381 | 17.80 | NMT | No | Transformer big, transfer learning from CS-EN 1M steps, only original HI-EN, beam=8; alpha=0.8; averaging of last 8 models after 230k steps |
| 12 | LTRC-MT | HINDEN hi-en | 2019/07/27 05:34:14 | 3121 | 17.44 | NMT | No | LSTM with attention, Backtranslation, Reinforcement Learning for 1 epoch |
| 13 | LTRC-MT | HINDEN hi-en | 2019/07/27 05:58:33 | 3124 | 17.07 | NMT | No | LSTM with global attention & Backtranslation |
| 14 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:04:09 | 3117 | 16.32 | NMT | No | Transformer Baseline, Only IIT-B data |
| 15 | ORGANIZER | HINDEN hi-en | 2016/07/26 13:25:18 | 1048 | 15.58 | Other | Yes | Online B (2016) |
| 16 | ORGANIZER | HINDEN hi-en | 2018/11/13 14:57:12 | 2567 | 15.44 | NMT | No | NMT with Attention |
| 17 | XMUNLP | HINDEN hi-en | 2017/07/24 08:47:29 | 1427 | 13.30 | NMT | No | single nmt model |
| 18 | IITB-MTG | HINDEN hi-en | 2017/08/01 15:10:09 | 1726 | 11.55 | NMT | No | NMT with ensemble (last 3 + best validation) |
| 19 | cvit | HINDEN hi-en | 2019/03/22 05:52:47 | 2658 | 10.76 | NMT | Yes | many to en (Transformer model) trained on WAT2018 data. Detokenized! |
| 20 | ORGANIZER | HINDEN hi-en | 2016/07/26 15:44:20 | 1054 | 10.32 | SMT | No | Phrase-based SMT |
| 21 | IITP-MT | HINDEN hi-en | 2016/08/29 15:10:41 | 1289 | 9.62 | SMT | No | Hierarchical SMT |
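For reference, BLEU (Papineni et al., 2002) is the geometric mean of modified 1- to 4-gram precisions over the whole test set, multiplied by a brevity penalty. The minimal single-reference Python sketch below illustrates that definition; it is not the official WAT evaluation pipeline, which first tokenizes the English output (the moses-tokenizer scoring shown above) before computing the score.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level, single-reference BLEU:
    BP * exp(mean of log modified n-gram precisions, n = 1..max_n)."""
    clipped = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # total hypothesis n-grams per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ng, r_ng = ngrams(h, n), ngrams(r, n)
            clipped[n - 1] += sum(min(c, r_ng[g]) for g, c in h_ng.items())
            totals[n - 1] += max(len(h) - n + 1, 0)
    if min(clipped) == 0:   # any zero n-gram precision -> BLEU = 0
        return 0.0
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)

print(round(corpus_bleu(["the cat sat on the mat"],
                        ["the cat sat on a mat"]), 2))  # -> 53.73
```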


RIBES


All systems below were scored on English output tokenized with the Moses tokenizer. The remaining tokenizer columns of the original table (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) contain no scores (empty or 0.000000) for this task and are omitted.

| # | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|
| 1 | cvit | HINDEN hi-en | 2019/05/27 16:04:36 | 2681 | 0.768324 | NMT | Yes | massive-multi + bt |
| 2 | cvit | HINDEN hi-en | 2019/03/15 01:23:17 | 2643 | 0.766180 | NMT | Yes | massive-multi |
| 3 | cvit | HINDEN hi-en | 2019/03/15 01:33:22 | 2645 | 0.758910 | NMT | Yes | massive-multi + ft |
| 4 | cvit | HINDEN hi-en | 2018/11/06 15:51:54 | 2563 | 0.755941 | NMT | Yes | |
| 5 | cvit | HINDEN hi-en | 2018/09/14 13:21:46 | 2331 | 0.751883 | NMT | Yes | ConvS2S Model Uses External Data |
| 6 | XMUNLP | HINDEN hi-en | 2017/07/27 23:00:46 | 1511 | 0.750921 | NMT | No | ensemble of 4 nmt models + monolingual data |
| 7 | XMUNLP | HINDEN hi-en | 2017/07/26 22:54:46 | 1488 | 0.743656 | NMT | No | single nmt model + monolingual data |
| 8 | NICT-5 | HINDEN hi-en | 2019/07/23 17:36:36 | 2865 | 0.741197 | NMT | No | HiEn and TaEn mixed training NMT model. Transformer on t2t (Hi-En is external data) |
| 9 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:49:36 | 3119 | 0.735358 | NMT | No | Transformer Model with Backtranslation |
| 10 | LTRC-MT | HINDEN hi-en | 2019/07/27 05:34:14 | 3121 | 0.735357 | NMT | No | LSTM with attention, Backtranslation, Reinforcement Learning for 1 epoch |
| 11 | CUNI | HINDEN hi-en | 2018/09/15 03:10:30 | 2381 | 0.731727 | NMT | No | Transformer big, transfer learning from CS-EN 1M steps, only original HI-EN, beam=8; alpha=0.8; averaging of last 8 models after 230k steps |
| 12 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:04:09 | 3117 | 0.729072 | NMT | No | Transformer Baseline, Only IIT-B data |
| 13 | LTRC-MT | HINDEN hi-en | 2019/07/27 05:58:33 | 3124 | 0.729059 | NMT | No | LSTM with global attention & Backtranslation |
| 14 | ORGANIZER | HINDEN hi-en | 2018/11/13 14:57:12 | 2567 | 0.718751 | NMT | No | NMT with Attention |
| 15 | ORGANIZER | HINDEN hi-en | 2016/07/26 10:04:53 | 1031 | 0.714537 | Other | Yes | Online A (2016) |
| 16 | XMUNLP | HINDEN hi-en | 2017/07/24 08:47:29 | 1427 | 0.697707 | NMT | No | single nmt model |
| 17 | ORGANIZER | HINDEN hi-en | 2016/07/26 13:25:18 | 1048 | 0.683214 | Other | Yes | Online B (2016) |
| 18 | IITB-MTG | HINDEN hi-en | 2017/08/01 15:10:09 | 1726 | 0.682902 | NMT | No | NMT with ensemble (last 3 + best validation) |
| 19 | cvit | HINDEN hi-en | 2019/03/22 05:52:47 | 2658 | 0.667353 | NMT | Yes | many to en (Transformer model) trained on WAT2018 data. Detokenized! |
| 20 | ORGANIZER | HINDEN hi-en | 2016/07/26 15:44:20 | 1054 | 0.638090 | SMT | No | Phrase-based SMT |
| 21 | IITP-MT | HINDEN hi-en | 2016/08/29 15:10:41 | 1289 | 0.628666 | SMT | No | Hierarchical SMT |
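RIBES (Isozaki et al., 2010) targets word-order quality, which BLEU captures poorly for distant language pairs such as Hindi and English. It is a normalized Kendall's tau over the reference positions of aligned hypothesis words, damped by unigram precision raised to alpha and a brevity penalty raised to beta, averaged over sentences. The sketch below is deliberately simplified: the official script uses a more careful word alignment, whereas this version only aligns words that occur exactly once in both sentences.

```python
import math
from collections import Counter

def ribes_sentence(hyp, ref, alpha=0.25, beta=0.10):
    """Simplified sentence-level RIBES:
    NKT * precision**alpha * BP**beta, where NKT = (Kendall's tau + 1) / 2."""
    h, r = hyp.split(), ref.split()
    if not h or not r:
        return 0.0
    h_cnt, r_cnt = Counter(h), Counter(r)
    # reference positions of hypothesis words occurring exactly once in both
    order = [r.index(w) for w in h if h_cnt[w] == 1 and r_cnt[w] == 1]
    if len(order) < 2:
        return 0.0
    pairs = [(i, j) for i in range(len(order)) for j in range(i + 1, len(order))]
    concordant = sum(order[i] < order[j] for i, j in pairs)
    nkt = concordant / len(pairs)   # equals (tau + 1) / 2
    prec = len(order) / len(h)      # proxy for unigram precision
    bp = min(1.0, math.exp(1 - len(r) / len(h)))
    return nkt * prec ** alpha * bp ** beta

# identical word order scores higher than a scrambled hypothesis
print(ribes_sentence("the cat sat on the mat", "the cat sat on the mat"))
print(ribes_sentence("on the mat sat the cat", "the cat sat on the mat"))
```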


AMFM


All systems below were scored on English output tokenized with the Moses tokenizer. The remaining tokenizer columns of the original table (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) contain no scores (empty or 0.000000) for this task and are omitted.

| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | cvit | HINDEN hi-en | 2019/05/27 16:04:36 | 2681 | 0.641730 | NMT | Yes | massive-multi + bt |
| 2 | cvit | HINDEN hi-en | 2019/03/15 01:23:17 | 2643 | 0.637230 | NMT | Yes | massive-multi |
| 3 | cvit | HINDEN hi-en | 2019/03/15 01:33:22 | 2645 | 0.631250 | NMT | Yes | massive-multi + ft |
| 4 | XMUNLP | HINDEN hi-en | 2017/07/27 23:00:46 | 1511 | 0.629530 | NMT | No | ensemble of 4 nmt models + monolingual data |
| 5 | cvit | HINDEN hi-en | 2018/11/06 15:51:54 | 2563 | 0.628600 | NMT | Yes | |
| 6 | XMUNLP | HINDEN hi-en | 2017/07/26 22:54:46 | 1488 | 0.627190 | NMT | No | single nmt model + monolingual data |
| 7 | cvit | HINDEN hi-en | 2018/09/14 13:21:46 | 2331 | 0.623240 | NMT | Yes | ConvS2S Model Uses External Data |
| 8 | ORGANIZER | HINDEN hi-en | 2016/07/26 10:04:53 | 1031 | 0.621100 | Other | Yes | Online A (2016) |
| 9 | CUNI | HINDEN hi-en | 2018/09/15 03:10:30 | 2381 | 0.611090 | NMT | No | Transformer big, transfer learning from CS-EN 1M steps, only original HI-EN, beam=8; alpha=0.8; averaging of last 8 models after 230k steps |
| 10 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:49:36 | 3119 | 0.594770 | NMT | No | Transformer Model with Backtranslation |
| 11 | LTRC-MT | HINDEN hi-en | 2019/07/27 05:34:14 | 3121 | 0.594550 | NMT | No | LSTM with attention, Backtranslation, Reinforcement Learning for 1 epoch |
| 12 | ORGANIZER | HINDEN hi-en | 2016/07/26 13:25:18 | 1048 | 0.590520 | Other | Yes | Online B (2016) |
| 13 | LTRC-MT | HINDEN hi-en | 2019/07/27 05:58:33 | 3124 | 0.587060 | NMT | No | LSTM with global attention & Backtranslation |
| 14 | ORGANIZER | HINDEN hi-en | 2018/11/13 14:57:12 | 2567 | 0.586360 | NMT | No | NMT with Attention |
| 15 | ORGANIZER | HINDEN hi-en | 2016/07/26 15:44:20 | 1054 | 0.574850 | SMT | No | Phrase-based SMT |
| 16 | XMUNLP | HINDEN hi-en | 2017/07/24 08:47:29 | 1427 | 0.568010 | NMT | No | single nmt model |
| 17 | IITP-MT | HINDEN hi-en | 2016/08/29 15:10:41 | 1289 | 0.567370 | SMT | No | Hierarchical SMT |
| 18 | NICT-5 | HINDEN hi-en | 2019/07/23 17:36:36 | 2865 | 0.566490 | NMT | No | HiEn and TaEn mixed training NMT model. Transformer on t2t (Hi-En is external data) |
| 19 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:04:09 | 3117 | 0.563590 | NMT | No | Transformer Baseline, Only IIT-B data |
| 20 | IITB-MTG | HINDEN hi-en | 2017/08/01 15:10:09 | 1726 | 0.557040 | NMT | No | NMT with ensemble (last 3 + best validation) |
| 21 | cvit | HINDEN hi-en | 2019/03/22 05:52:47 | 2658 | 0.554700 | NMT | Yes | many to en (Transformer model) trained on WAT2018 data. Detokenized! |
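AM-FM (Banchs et al., 2015) interpolates an Adequacy Metric (AM), the similarity between hypothesis and reference in a latent semantic space learned from monolingual data, with a Fluency Metric (FM), an n-gram language-model score of the hypothesis. The sketch below is only a loose approximation under two substitutions: a raw bag-of-words cosine stands in for the SVD-projected semantic space of the real metric, and `lm_prob` is a hypothetical callable standing in for whatever language model supplies the fluency term.

```python
import math
from collections import Counter

def am_fm(hyp, ref, lm_prob, lam=0.5):
    """Loose AM-FM sketch: lam * AM + (1 - lam) * FM.
    AM: cosine similarity of bag-of-words vectors (a stand-in for the
        latent-semantic-space projection used by the real metric).
    FM: per-word geometric mean of a language-model probability;
        lm_prob is a placeholder mapping a sentence to P(sentence)."""
    h, r = Counter(hyp.split()), Counter(ref.split())
    dot = sum(c * r[w] for w, c in h.items())
    norm = math.sqrt(sum(c * c for c in h.values())) * \
           math.sqrt(sum(c * c for c in r.values()))
    am = dot / norm if norm else 0.0
    n = max(len(hyp.split()), 1)
    fm = lm_prob(hyp) ** (1.0 / n)   # keep the fluency term in [0, 1]
    return lam * am + (1 - lam) * fm

# toy usage with a hypothetical uniform language model over 10k words
toy_lm = lambda s: (1.0 / 10000) ** len(s.split())
print(am_fm("the cat sat on the mat", "the cat is on the mat", toy_lm))
```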


HUMAN (WAT2019)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|
| 1 | cvit | HINDEN hi-en | 2019/05/27 16:04:36 | 2681 | Underway | NMT | Yes | massive-multi + bt |
| 2 | NICT-5 | HINDEN hi-en | 2019/07/23 17:36:36 | 2865 | Underway | NMT | No | HiEn and TaEn mixed training NMT model. Transformer on t2t (Hi-En is external data) |
| 3 | LTRC-MT | HINDEN hi-en | 2019/07/27 04:49:36 | 3119 | Underway | NMT | No | Transformer Model with Backtranslation |


HUMAN (WAT2018)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|
| 1 | cvit | HINDEN hi-en | 2018/09/14 13:21:46 | 2331 | 72.250 | NMT | Yes | ConvS2S Model Uses External Data |
| 2 | CUNI | HINDEN hi-en | 2018/09/15 03:10:30 | 2381 | 67.250 | NMT | No | Transformer big, transfer learning from CS-EN 1M steps, only original HI-EN, beam=8; alpha=0.8; averaging of last 8 models after 230k steps |
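The numeric HUMAN scores here and in the WAT2017/WAT2016 tables below come from WAT's pairwise crowdsourcing evaluation: each system's translations are compared with a baseline's by several judges, and wins (W), losses (L), and ties (T) are aggregated as 100 * (W - L) / (W + L + T), giving a score between -100 and +100. A small sketch, with invented counts for illustration only:

```python
def pairwise_score(wins, losses, ties):
    """WAT pairwise crowdsourcing score in [-100, 100]."""
    return 100.0 * (wins - losses) / (wins + losses + ties)

# hypothetical judgment counts, for illustration only
print(pairwise_score(wins=450, losses=161, ties=189))  # -> 36.125
```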


HUMAN (WAT2017)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|
| 1 | XMUNLP | HINDEN hi-en | 2017/07/27 23:00:46 | 1511 | 68.250 | NMT | No | ensemble of 4 nmt models + monolingual data |
| 2 | IITB-MTG | HINDEN hi-en | 2017/08/01 15:10:09 | 1726 | 21.000 | NMT | No | NMT with ensemble (last 3 + best validation) |


HUMAN (WAT2016)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|
| 1 | ORGANIZER | HINDEN hi-en | 2016/07/26 10:04:53 | 1031 | 44.750 | Other | Yes | Online A (2016) |
| 2 | ORGANIZER | HINDEN hi-en | 2016/07/26 13:25:18 | 1048 | 14.000 | Other | Yes | Online B (2016) |


HUMAN (WAT2015)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|

(no entries for this task)


HUMAN (WAT2014)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|

(no entries for this task)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,
you may use the information about the translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02