
WAT

The Workshop on Asian Translation
Evaluation Results

[EVALUATION RESULTS TOP] | [BLEU] | [RIBES] | [AMFM] | [HUMAN (WAT2022)] | [HUMAN (WAT2021)] | [HUMAN (WAT2020)] | [HUMAN (WAT2019)] | [HUMAN (WAT2018)] | [HUMAN (WAT2017)] | [HUMAN (WAT2016)] | [HUMAN (WAT2015)] | [HUMAN (WAT2014)] | [EVALUATION RESULTS USAGE POLICY]

BLEU


BLEU scores for this task are read from the moses-tokenizer column, i.e. they are computed on English output tokenized with the Moses tokenizer. The remaining segmenter columns of the original table (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) contain no scores for this task and are omitted below.

# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description
1 | NICT-5 | INDICml-en | 2018/08/24 14:46:27 | 2106 | 22.87 | NMT | No | Bilingual transformer model
2 | IITP-MT | INDICml-en | 2018/09/14 19:51:53 | 2348 | 19.94 | NMT | No | Transformer multilingual XX-En
3 | NICT-5 | INDICml-en | 2018/09/07 14:30:02 | 2239 | 15.91 | NMT | No | Unified Source vocabulary by orthography mapping. XX-EN model.
4 | cvit | INDICml-en | 2019/03/22 05:42:39 | 2654 | 15.52 | NMT | Yes | may to en (Transformer) - detokenized
5 | cvit | INDICml-en | 2019/03/22 04:46:02 | 2646 | 15.15 | NMT | Yes | many to en model (Transformer)
6 | cvit | INDICml-en | 2019/03/23 12:40:10 | 2662 | 15.10 | NMT | Yes | massive-multi e270
7 | cvit | INDICml-en | 2019/03/14 22:03:45 | 2625 | 14.77 | NMT | Yes | massive-multi + ft
8 | NICT-5 | INDICml-en | 2018/08/24 14:47:03 | 2107 | 14.06 | NMT | No | XX-En transformer model
9 | cvit | INDICml-en | 2019/03/14 21:50:11 | 2617 | 12.45 | NMT | Yes | massive-multi
10 | ORGANIZER | INDICml-en | 2018/08/24 14:37:21 | 2100 | 12.32 | NMT | No | multi2one multilingual NMT with Attention
11 | ORGANIZER | INDICml-en | 2018/08/29 14:26:57 | 2191 | 11.62 | NMT | No | multi2multi multilingual NMT with Attention
12 | Anuvaad | INDICml-en | 2018/09/15 18:01:53 | 2407 | 11.51 | SMT | No | SMT XX-En
13 | Anuvaad | INDICml-en | 2018/09/15 18:48:33 | 2415 | 11.25 | SMT | No | SMT with KenLM
14 | NICT-5 | INDICml-en | 2018/08/24 14:59:49 | 2131 | 10.90 | NMT | No | XX-XX transformer model
15 | ORGANIZER | INDICml-en | 2018/08/20 11:14:38 | 2006 | 10.56 | NMT | No | NMT with Attention
16 | RGNLP | INDICml-en | 2018/09/15 02:29:28 | 2368 | 8.78 | SMT | No | SMT system with KENLM Language model
17 | RGNLP | INDICml-en | 2018/09/15 02:49:57 | 2375 | 8.44 | SMT | No | SMT system with SRILM Language model
18 | RGNLP | INDICml-en | 2018/09/15 03:16:46 | 2384 | 6.90 | NMT | No | NMT system with a 2-layer LSTM method
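
Since the scores above are BLEU on Moses-tokenized English output, a similar corpus-level score could be reproduced roughly along the lines of the sketch below. This is not the official WAT scoring pipeline, only a minimal approximation of that setup; the file names hyp.en and ref.en and the use of the sacremoses and sacrebleu packages are illustrative assumptions.

```python
# Hypothetical sketch: corpus BLEU on Moses-tokenized English output.
# Not the official WAT scorer; hyp.en / ref.en are placeholder file names.
from sacremoses import MosesTokenizer
import sacrebleu

mt = MosesTokenizer(lang="en")

def load_tokenized(path):
    """Read one segment per line and apply Moses tokenization."""
    with open(path, encoding="utf-8") as f:
        return [mt.tokenize(line.strip(), return_str=True) for line in f]

hyps = load_tokenized("hyp.en")   # system output, one English sentence per line
refs = load_tokenized("ref.en")   # reference translations, same line order

# Both sides are already Moses-tokenized, so disable sacrebleu's own tokenizer.
bleu = sacrebleu.corpus_bleu(hyps, [refs], tokenize="none")
print(f"BLEU = {bleu.score:.2f}")
```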

Notice:

Back to top

RIBES


RIBES scores for this task are likewise read from the moses-tokenizer column; the remaining segmenter columns of the original table contain no scores for this task and are omitted below.

# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description
1 | cvit | INDICml-en | 2019/03/22 05:42:39 | 2654 | 0.737442 | NMT | Yes | may to en (Transformer) - detokenized
2 | cvit | INDICml-en | 2019/03/23 12:40:10 | 2662 | 0.732838 | NMT | Yes | massive-multi e270
3 | IITP-MT | INDICml-en | 2018/09/14 19:51:53 | 2348 | 0.731530 | NMT | No | Transformer multilingual XX-En
4 | cvit | INDICml-en | 2019/03/14 22:03:45 | 2625 | 0.727629 | NMT | Yes | massive-multi + ft
5 | cvit | INDICml-en | 2019/03/22 04:46:02 | 2646 | 0.726010 | NMT | Yes | many to en model (Transformer)
6 | NICT-5 | INDICml-en | 2018/08/24 14:46:27 | 2106 | 0.711117 | NMT | No | Bilingual transformer model
7 | cvit | INDICml-en | 2019/03/14 21:50:11 | 2617 | 0.710320 | NMT | Yes | massive-multi
8 | NICT-5 | INDICml-en | 2018/08/24 14:47:03 | 2107 | 0.707105 | NMT | No | XX-En transformer model
9 | NICT-5 | INDICml-en | 2018/09/07 14:30:02 | 2239 | 0.704745 | NMT | No | Unified Source vocabulary by orthography mapping. XX-EN model.
10 | NICT-5 | INDICml-en | 2018/08/24 14:59:49 | 2131 | 0.695721 | NMT | No | XX-XX transformer model
11 | ORGANIZER | INDICml-en | 2018/08/29 14:26:57 | 2191 | 0.689418 | NMT | No | multi2multi multilingual NMT with Attention
12 | ORGANIZER | INDICml-en | 2018/08/24 14:37:21 | 2100 | 0.689298 | NMT | No | multi2one multilingual NMT with Attention
13 | ORGANIZER | INDICml-en | 2018/08/20 11:14:38 | 2006 | 0.673579 | NMT | No | NMT with Attention
14 | RGNLP | INDICml-en | 2018/09/15 03:16:46 | 2384 | 0.605902 | NMT | No | NMT system with a 2-layer LSTM method
15 | Anuvaad | INDICml-en | 2018/09/15 18:01:53 | 2407 | 0.600102 | SMT | No | SMT XX-En
16 | Anuvaad | INDICml-en | 2018/09/15 18:48:33 | 2415 | 0.566812 | SMT | No | SMT with KenLM
17 | RGNLP | INDICml-en | 2018/09/15 02:29:28 | 2368 | 0.542151 | SMT | No | SMT system with KENLM Language model
18 | RGNLP | INDICml-en | 2018/09/15 02:49:57 | 2375 | 0.528814 | SMT | No | SMT system with SRILM Language model
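
RIBES is a word-order-sensitive metric: it combines a rank correlation between the word positions of the system output and those of the reference with a unigram-precision penalty. The official results above were produced with the organizers' RIBES tool; the fragment below is only a simplified, hypothetical re-implementation of the core idea (one-to-one matching of words that occur exactly once in both sentences, normalized Kendall's tau, and a precision term with an assumed exponent of 0.25), so its numbers will not match the table.

```python
# Hypothetical, simplified RIBES-like score for a single sentence pair.
# Not the official RIBES implementation; for illustration only.
from itertools import combinations

def simple_ribes(hyp: str, ref: str, alpha: float = 0.25) -> float:
    hyp_words, ref_words = hyp.split(), ref.split()
    # Keep only words that occur exactly once in both sentences, a crude
    # stand-in for RIBES's context-based word alignment.
    ranks = [ref_words.index(w) for w in hyp_words
             if hyp_words.count(w) == 1 and ref_words.count(w) == 1]
    if not hyp_words or len(ranks) < 2:
        return 0.0
    # Normalized Kendall's tau: fraction of aligned word pairs that appear
    # in the same relative order in hypothesis and reference.
    pairs = list(combinations(range(len(ranks)), 2))
    concordant = sum(1 for i, j in pairs if ranks[i] < ranks[j])
    nkt = concordant / len(pairs)
    precision = len(ranks) / len(hyp_words)   # unigram precision over aligned words
    return nkt * precision ** alpha

# Identical word order scores 1.0; scrambled order is penalized heavily.
print(simple_ribes("john hit the ball", "john hit the ball"))   # -> 1.0
print(simple_ribes("the ball hit john", "john hit the ball"))   # -> ~0.17
```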

Notice:

Back to top

AMFM


The AMFM score for each submission appears in a single populated sub-column of the original table (all sub-columns are labelled "unuse" for this task); the empty sub-columns are omitted below.

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | IITP-MT | INDICml-en | 2018/09/14 19:51:53 | 2348 | 0.537780 | NMT | No | Transformer multilingual XX-En
2 | NICT-5 | INDICml-en | 2018/08/24 14:46:27 | 2106 | 0.528730 | NMT | No | Bilingual transformer model
3 | cvit | INDICml-en | 2019/03/22 04:46:02 | 2646 | 0.521060 | NMT | Yes | many to en model (Transformer)
4 | cvit | INDICml-en | 2019/03/22 05:42:39 | 2654 | 0.521060 | NMT | Yes | may to en (Transformer) - detokenized
5 | cvit | INDICml-en | 2019/03/14 22:03:45 | 2625 | 0.521020 | NMT | Yes | massive-multi + ft
6 | cvit | INDICml-en | 2019/03/23 12:40:10 | 2662 | 0.518530 | NMT | Yes | massive-multi e270
7 | cvit | INDICml-en | 2019/03/14 21:50:11 | 2617 | 0.507150 | NMT | Yes | massive-multi
8 | NICT-5 | INDICml-en | 2018/09/07 14:30:02 | 2239 | 0.503940 | NMT | No | Unified Source vocabulary by orthography mapping. XX-EN model.
9 | NICT-5 | INDICml-en | 2018/08/24 14:47:03 | 2107 | 0.486070 | NMT | No | XX-En transformer model
10 | NICT-5 | INDICml-en | 2018/08/24 14:59:49 | 2131 | 0.481320 | NMT | No | XX-XX transformer model
11 | ORGANIZER | INDICml-en | 2018/08/24 14:37:21 | 2100 | 0.481100 | NMT | No | multi2one multilingual NMT with Attention
12 | ORGANIZER | INDICml-en | 2018/08/29 14:26:57 | 2191 | 0.476920 | NMT | No | multi2multi multilingual NMT with Attention
13 | ORGANIZER | INDICml-en | 2018/08/20 11:14:38 | 2006 | 0.457220 | NMT | No | NMT with Attention
14 | RGNLP | INDICml-en | 2018/09/15 03:16:46 | 2384 | 0.416980 | NMT | No | NMT system with a 2-layer LSTM method
15 | Anuvaad | INDICml-en | 2018/09/15 18:48:33 | 2415 | 0.376260 | SMT | No | SMT with KenLM
16 | Anuvaad | INDICml-en | 2018/09/15 18:01:53 | 2407 | 0.365770 | SMT | No | SMT XX-En
17 | RGNLP | INDICml-en | 2018/09/15 02:49:57 | 2375 | 0.359220 | SMT | No | SMT system with SRILM Language model
18 | RGNLP | INDICml-en | 2018/09/15 02:29:28 | 2368 | 0.356470 | SMT | No | SMT system with KENLM Language model

Notice:

Back to top

HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)

Notice:
Back to top

EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about the translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02