
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description
1 | NICT-5 | INDICte-en | 2018/08/24 15:00:50 | 2135 | 33.23 | NMT | No | XX-XX transformer model
2 | cvit | INDICte-en | 2019/03/23 18:11:42 | 2668 | 32.88 | NMT | Yes | massive-multi e270 + only-bn
3 | cvit | INDICte-en | 2019/03/23 12:42:27 | 2663 | 32.44 | NMT | Yes | massive-multi e270
4 | IITP-MT | INDICte-en | 2018/09/14 19:55:56 | 2350 | 30.96 | NMT | No | Transformer multilingual XX-En
5 | cvit | INDICte-en | 2019/03/14 22:08:58 | 2626 | 30.91 | NMT | Yes | massive-multi + ft
6 | NICT-5 | INDICte-en | 2018/09/07 14:32:11 | 2242 | 30.23 | NMT | No | Unified Source vocabulary by orthography mapping. XX-EN model.
7 | NICT-5 | INDICte-en | 2018/08/24 14:51:32 | 2115 | 29.85 | SMT | No | XX-En transformer model
8 | cvit | INDICte-en | 2019/03/22 05:46:18 | 2656 | 29.70 | NMT | Yes | may to en (Transformer) - detokenized
9 | cvit | INDICte-en | 2019/03/22 05:24:27 | 2648 | 29.18 | NMT | No | many to en (Transformer)
10 | ORGANIZER | INDICte-en | 2018/08/24 14:39:49 | 2102 | 27.86 | NMT | No | multi2one multilingual NMT with Attention
11 | ORGANIZER | INDICte-en | 2018/08/29 14:33:07 | 2195 | 27.10 | NMT | No | multi2multi multilingual NMT with Attention
12 | cvit | INDICte-en | 2019/03/14 21:52:37 | 2619 | 25.56 | NMT | Yes | massive-multi
13 | Anuvaad | INDICte-en | 2018/09/15 17:53:34 | 2402 | 24.05 | SMT | No | SMT with KenLM
14 | Anuvaad | INDICte-en | 2018/09/15 20:14:55 | 2421 | 22.13 | SMT | No | SMT XX-En
15 | RGNLP | INDICte-en | 2018/09/15 02:52:26 | 2377 | 20.39 | SMT | No | SMT system with SRILM Language model
16 | RGNLP | INDICte-en | 2018/09/15 02:41:12 | 2371 | 20.36 | SMT | No | SMT system with KENLM Language model
17 | RGNLP | INDICte-en | 2018/09/15 03:21:00 | 2387 | 16.42 | NMT | No | NMT system with a 2-layer LSTM method
18 | NICT-5 | INDICte-en | 2018/08/24 14:51:21 | 2114 | 15.76 | NMT | No | Bilingual transformer model
19 | ORGANIZER | INDICte-en | 2018/08/20 11:22:39 | 2010 | 13.69 | NMT | No | NMT with Attention

Note: BLEU is reported under the moses-tokenizer column for every entry; the juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, and unuse columns are empty ("-"), and the myseg and kmseg columns are 0.00 throughout.
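The BLEU column above is reported under the moses-tokenizer setting, i.e. the English output is tokenized before corpus-level BLEU is computed. As a rough, unofficial sanity check of such a score, a standard corpus-BLEU implementation can be used; the sketch below uses NLTK and assumes one pre-tokenized hypothesis and one reference per line in the hypothetical files hyp.txt and ref.txt. It is not the WAT evaluation pipeline.

    # Unofficial sanity check only -- not the WAT evaluation scripts.
    # hyp.txt / ref.txt are hypothetical file names: one pre-tokenized
    # sentence per line, hypotheses and references aligned line by line.
    from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

    with open("hyp.txt", encoding="utf-8") as f:
        hyps = [line.split() for line in f]    # each hypothesis as a token list
    with open("ref.txt", encoding="utf-8") as f:
        refs = [[line.split()] for line in f]  # one reference per sentence, wrapped in a list

    score = corpus_bleu(refs, hyps, smoothing_function=SmoothingFunction().method3)
    print("BLEU = %.2f" % (100 * score))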


RIBES


# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description
1 | cvit | INDICte-en | 2019/03/23 18:11:42 | 2668 | 0.820281 | NMT | Yes | massive-multi e270 + only-bn
2 | cvit | INDICte-en | 2019/03/14 22:08:58 | 2626 | 0.816129 | NMT | Yes | massive-multi + ft
3 | cvit | INDICte-en | 2019/03/23 12:42:27 | 2663 | 0.813025 | NMT | Yes | massive-multi e270
4 | NICT-5 | INDICte-en | 2018/08/24 15:00:50 | 2135 | 0.810584 | NMT | No | XX-XX transformer model
5 | cvit | INDICte-en | 2019/03/22 05:46:18 | 2656 | 0.808263 | NMT | Yes | may to en (Transformer) - detokenized
6 | cvit | INDICte-en | 2019/03/22 05:24:27 | 2648 | 0.804106 | NMT | No | many to en (Transformer)
7 | NICT-5 | INDICte-en | 2018/09/07 14:32:11 | 2242 | 0.802574 | NMT | No | Unified Source vocabulary by orthography mapping. XX-EN model.
8 | NICT-5 | INDICte-en | 2018/08/24 14:51:32 | 2115 | 0.800101 | SMT | No | XX-En transformer model
9 | IITP-MT | INDICte-en | 2018/09/14 19:55:56 | 2350 | 0.797167 | NMT | No | Transformer multilingual XX-En
10 | ORGANIZER | INDICte-en | 2018/08/24 14:39:49 | 2102 | 0.796554 | NMT | No | multi2one multilingual NMT with Attention
11 | cvit | INDICte-en | 2019/03/14 21:52:37 | 2619 | 0.787013 | NMT | Yes | massive-multi
12 | ORGANIZER | INDICte-en | 2018/08/29 14:33:07 | 2195 | 0.785251 | NMT | No | multi2multi multilingual NMT with Attention
13 | Anuvaad | INDICte-en | 2018/09/15 17:53:34 | 2402 | 0.729178 | SMT | No | SMT with KenLM
14 | Anuvaad | INDICte-en | 2018/09/15 20:14:55 | 2421 | 0.714266 | SMT | No | SMT XX-En
15 | RGNLP | INDICte-en | 2018/09/15 03:21:00 | 2387 | 0.712523 | NMT | No | NMT system with a 2-layer LSTM method
16 | NICT-5 | INDICte-en | 2018/08/24 14:51:21 | 2114 | 0.708651 | NMT | No | Bilingual transformer model
17 | ORGANIZER | INDICte-en | 2018/08/20 11:22:39 | 2010 | 0.698511 | NMT | No | NMT with Attention
18 | RGNLP | INDICte-en | 2018/09/15 02:52:26 | 2377 | 0.698461 | SMT | No | SMT system with SRILM Language model
19 | RGNLP | INDICte-en | 2018/09/15 02:41:12 | 2371 | 0.695340 | SMT | No | SMT system with KENLM Language model

Note: RIBES is reported under the moses-tokenizer column for every entry; all other tokenizer and segmentation columns are empty ("-") or 0.000000 throughout.


AMFM


# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | cvit | INDICte-en | 2019/03/23 12:42:27 | 2663 | 0.679500 | NMT | Yes | massive-multi e270
2 | cvit | INDICte-en | 2019/03/23 18:11:42 | 2668 | 0.677030 | NMT | Yes | massive-multi e270 + only-bn
3 | cvit | INDICte-en | 2019/03/14 22:08:58 | 2626 | 0.667160 | NMT | Yes | massive-multi + ft
4 | IITP-MT | INDICte-en | 2018/09/14 19:55:56 | 2350 | 0.663260 | NMT | No | Transformer multilingual XX-En
5 | NICT-5 | INDICte-en | 2018/08/24 15:00:50 | 2135 | 0.655490 | NMT | No | XX-XX transformer model
6 | cvit | INDICte-en | 2019/03/22 05:24:27 | 2648 | 0.650580 | NMT | No | many to en (Transformer)
7 | cvit | INDICte-en | 2019/03/22 05:46:18 | 2656 | 0.650580 | NMT | Yes | may to en (Transformer) - detokenized
8 | cvit | INDICte-en | 2019/03/14 21:52:37 | 2619 | 0.629430 | NMT | Yes | massive-multi
9 | ORGANIZER | INDICte-en | 2018/08/24 14:39:49 | 2102 | 0.624500 | NMT | No | multi2one multilingual NMT with Attention
10 | NICT-5 | INDICte-en | 2018/08/24 14:51:32 | 2115 | 0.624020 | SMT | No | XX-En transformer model
11 | NICT-5 | INDICte-en | 2018/09/07 14:32:11 | 2242 | 0.620200 | NMT | No | Unified Source vocabulary by orthography mapping. XX-EN model.
12 | ORGANIZER | INDICte-en | 2018/08/29 14:33:07 | 2195 | 0.617900 | NMT | No | multi2multi multilingual NMT with Attention
13 | RGNLP | INDICte-en | 2018/09/15 02:52:26 | 2377 | 0.616760 | SMT | No | SMT system with SRILM Language model
14 | RGNLP | INDICte-en | 2018/09/15 02:41:12 | 2371 | 0.615020 | SMT | No | SMT system with KENLM Language model
15 | Anuvaad | INDICte-en | 2018/09/15 17:53:34 | 2402 | 0.606850 | SMT | No | SMT with KenLM
16 | Anuvaad | INDICte-en | 2018/09/15 20:14:55 | 2421 | 0.569170 | SMT | No | SMT XX-En
17 | RGNLP | INDICte-en | 2018/09/15 03:21:00 | 2387 | 0.567870 | NMT | No | NMT system with a 2-layer LSTM method
18 | NICT-5 | INDICte-en | 2018/08/24 14:51:21 | 2114 | 0.533540 | NMT | No | Bilingual transformer model
19 | ORGANIZER | INDICte-en | 2018/08/20 11:22:39 | 2010 | 0.526920 | NMT | No | NMT with Attention

Note: AMFM is reported under the moses-tokenizer column for every entry; all other tokenizer and segmentation columns are empty ("-") or 0.000000 throughout.


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02