
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


All entries are for the INDIC multilingual task, Urdu-to-English (ur-en). BLEU is computed on English output tokenized with the Moses tokenizer; the remaining tokenizer columns of the original table (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, myseg, kmseg) held only "-" or 0.00 for this task and are omitted.

#  | Team      | Task        | Date/Time           | DataID | BLEU  | Method | Other Resources | System Description
---|-----------|-------------|---------------------|--------|-------|--------|-----------------|--------------------
1  | ORGANIZER | INDIC ur-en | 2018/08/20 11:25:27 | 2012   |  9.29 | NMT    | No  | NMT with Attention
2  | ORGANIZER | INDIC ur-en | 2018/08/24 14:40:53 | 2103   | 19.60 | NMT    | No  | multi2one multilingual NMT with Attention
3  | NICT-5    | INDIC ur-en | 2018/08/24 14:52:20 | 2118   | 20.65 | NMT    | No  | Bilingual transformer model
4  | NICT-5    | INDIC ur-en | 2018/08/24 14:52:35 | 2119   | 27.88 | NMT    | No  | XX-En transformer model
5  | NICT-5    | INDIC ur-en | 2018/08/24 15:01:22 | 2137   | 30.84 | NMT    | No  | XX-XX transformer model
6  | ORGANIZER | INDIC ur-en | 2018/08/29 14:35:46 | 2197   | 18.69 | NMT    | No  | multi2multi multilingual NMT with Attention
7  | NICT-5    | INDIC ur-en | 2018/09/07 14:33:02 | 2243   | 26.73 | NMT    | No  | Unified source vocabulary by orthography mapping; XX-En model
8  | IITP-MT   | INDIC ur-en | 2018/09/14 19:57:55 | 2351   | 26.56 | NMT    | No  | Transformer multilingual XX-En
9  | RGNLP     | INDIC ur-en | 2018/09/15 02:42:14 | 2372   | 14.88 | SMT    | No  | SMT system with KenLM language model
10 | RGNLP     | INDIC ur-en | 2018/09/15 02:54:01 | 2379   | 15.36 | SMT    | No  | SMT system with SRILM language model
11 | RGNLP     | INDIC ur-en | 2018/09/15 03:19:36 | 2386   | 10.65 | NMT    | No  | NMT system with a 2-layer LSTM method
12 | Anuvaad   | INDIC ur-en | 2018/09/15 17:51:20 | 2401   | 18.03 | SMT    | No  | SMT with KenLM
13 | Anuvaad   | INDIC ur-en | 2018/09/15 18:06:08 | 2411   | 18.31 | SMT    | No  | SMT XX-En
14 | cvit      | INDIC ur-en | 2019/03/14 21:55:15 | 2621   | 21.03 | NMT    | Yes | massive-multi
15 | cvit      | INDIC ur-en | 2019/03/14 22:12:49 | 2628   | 24.60 | NMT    | Yes | massive-multi + ft
16 | cvit      | INDIC ur-en | 2019/03/22 05:32:54 | 2651   | 24.36 | NMT    | No  | many to en (Transformer)
17 | cvit      | INDIC ur-en | 2019/03/22 05:47:10 | 2657   | 24.59 | NMT    | Yes | many to en (Transformer), detokenized
18 | cvit      | INDIC ur-en | 2019/03/23 12:53:22 | 2665   | 25.57 | NMT    | Yes | massive-multi e270

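The BLEU column above reports corpus-level BLEU on a 0-100 scale. For orientation, here is a minimal sketch of computing such a score with the sacrebleu library; this is an illustration, not the organizers' exact pipeline (which tokenizes English output with the Moses tokenizer before scoring), and the file names are hypothetical.

```python
# Minimal corpus-BLEU sketch using sacrebleu (pip install sacrebleu).
# NOT the official WAT scoring pipeline; file names are hypothetical.
import sacrebleu

with open("system_output.en", encoding="utf-8") as f:   # hypothetical path
    hypotheses = [line.strip() for line in f]
with open("reference.en", encoding="utf-8") as f:       # hypothetical path
    references = [line.strip() for line in f]

# corpus_bleu takes the hypothesis list and a list of reference streams
# (one inner list per reference set; here a single reference).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")   # same 0-100 scale as the table above
```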

RIBES


As with BLEU, only the moses-tokenizer column contained scores for this task; the other tokenizer columns are omitted.

#  | Team      | Task        | Date/Time           | DataID | RIBES    | Method | Other Resources | System Description
---|-----------|-------------|---------------------|--------|----------|--------|-----------------|--------------------
1  | ORGANIZER | INDIC ur-en | 2018/08/20 11:25:27 | 2012   | 0.611354 | NMT    | No  | NMT with Attention
2  | ORGANIZER | INDIC ur-en | 2018/08/24 14:40:53 | 2103   | 0.714075 | NMT    | No  | multi2one multilingual NMT with Attention
3  | NICT-5    | INDIC ur-en | 2018/08/24 14:52:20 | 2118   | 0.631710 | NMT    | No  | Bilingual transformer model
4  | NICT-5    | INDIC ur-en | 2018/08/24 14:52:35 | 2119   | 0.705470 | NMT    | No  | XX-En transformer model
5  | NICT-5    | INDIC ur-en | 2018/08/24 15:01:22 | 2137   | 0.742376 | NMT    | No  | XX-XX transformer model
6  | ORGANIZER | INDIC ur-en | 2018/08/29 14:35:46 | 2197   | 0.695015 | NMT    | No  | multi2multi multilingual NMT with Attention
7  | NICT-5    | INDIC ur-en | 2018/09/07 14:33:02 | 2243   | 0.696801 | NMT    | No  | Unified source vocabulary by orthography mapping; XX-En model
8  | IITP-MT   | INDIC ur-en | 2018/09/14 19:57:55 | 2351   | 0.733161 | NMT    | No  | Transformer multilingual XX-En
9  | RGNLP     | INDIC ur-en | 2018/09/15 02:42:14 | 2372   | 0.605664 | SMT    | No  | SMT system with KenLM language model
10 | RGNLP     | INDIC ur-en | 2018/09/15 02:54:01 | 2379   | 0.617035 | SMT    | No  | SMT system with SRILM language model
11 | RGNLP     | INDIC ur-en | 2018/09/15 03:19:36 | 2386   | 0.619906 | NMT    | No  | NMT system with a 2-layer LSTM method
12 | Anuvaad   | INDIC ur-en | 2018/09/15 17:51:20 | 2401   | 0.630810 | SMT    | No  | SMT with KenLM
13 | Anuvaad   | INDIC ur-en | 2018/09/15 18:06:08 | 2411   | 0.635688 | SMT    | No  | SMT XX-En
14 | cvit      | INDIC ur-en | 2019/03/14 21:55:15 | 2621   | 0.719253 | NMT    | Yes | massive-multi
15 | cvit      | INDIC ur-en | 2019/03/14 22:12:49 | 2628   | 0.741251 | NMT    | Yes | massive-multi + ft
16 | cvit      | INDIC ur-en | 2019/03/22 05:32:54 | 2651   | 0.746742 | NMT    | No  | many to en (Transformer)
17 | cvit      | INDIC ur-en | 2019/03/22 05:47:10 | 2657   | 0.751459 | NMT    | Yes | many to en (Transformer), detokenized
18 | cvit      | INDIC ur-en | 2019/03/23 12:53:22 | 2665   | 0.744835 | NMT    | Yes | massive-multi e270

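RIBES (Rank-based Intuitive Bilingual Evaluation Score) rewards correct global word order: it multiplies a normalized Kendall's tau over aligned word positions by unigram-precision and brevity penalties, RIBES = NKT * P1^alpha * BP^beta, with alpha = 0.25 and beta = 0.10 in the standard scorer. The official tool is NTT's RIBES script; the toy sketch below illustrates only the core idea, assuming a simplified one-to-one alignment over words that occur exactly once in both sentences.

```python
# Toy RIBES sketch: NOT the official NTT RIBES script, which handles
# multiple references and repeated words more carefully.
import math

def toy_ribes(hyp: str, ref: str, alpha: float = 0.25, beta: float = 0.10) -> float:
    hyp_words, ref_words = hyp.split(), ref.split()
    # Simplified one-to-one alignment: keep only words occurring
    # exactly once in both hypothesis and reference.
    ranks = [ref_words.index(w) for w in hyp_words
             if hyp_words.count(w) == 1 and ref_words.count(w) == 1]
    if len(ranks) < 2:
        return 0.0
    # Normalized Kendall's tau over the aligned reference positions:
    # (tau + 1) / 2 simplifies to concordant_pairs / total_pairs.
    n = len(ranks)
    pairs = n * (n - 1) // 2
    concordant = sum(1 for i in range(n) for j in range(i + 1, n)
                     if ranks[i] < ranks[j])
    nkt = concordant / pairs
    p1 = len(ranks) / len(hyp_words)  # unigram precision over aligned words
    bp = min(1.0, math.exp(1 - len(ref_words) / len(hyp_words)))  # brevity penalty
    return nkt * (p1 ** alpha) * (bp ** beta)

print(toy_ribes("the cat sat on the mat", "on the mat the cat sat"))
```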

AMFM


Only one score column was populated; the placeholder columns of the original table (all labeled "unuse") are omitted.

#  | Team      | Task        | Date/Time           | DataID | AMFM     | Method | Other Resources | System Description
---|-----------|-------------|---------------------|--------|----------|--------|-----------------|--------------------
1  | ORGANIZER | INDIC ur-en | 2018/08/20 11:25:27 | 2012   | 0.427350 | NMT    | No  | NMT with Attention
2  | ORGANIZER | INDIC ur-en | 2018/08/24 14:40:53 | 2103   | 0.532100 | NMT    | No  | multi2one multilingual NMT with Attention
3  | NICT-5    | INDIC ur-en | 2018/08/24 14:52:20 | 2118   | 0.481800 | NMT    | No  | Bilingual transformer model
4  | NICT-5    | INDIC ur-en | 2018/08/24 14:52:35 | 2119   | 0.555430 | NMT    | No  | XX-En transformer model
5  | NICT-5    | INDIC ur-en | 2018/08/24 15:01:22 | 2137   | 0.595450 | NMT    | No  | XX-XX transformer model
6  | ORGANIZER | INDIC ur-en | 2018/08/29 14:35:46 | 2197   | 0.523780 | NMT    | No  | multi2multi multilingual NMT with Attention
7  | NICT-5    | INDIC ur-en | 2018/09/07 14:33:02 | 2243   | 0.561620 | NMT    | No  | Unified source vocabulary by orthography mapping; XX-En model
8  | IITP-MT   | INDIC ur-en | 2018/09/14 19:57:55 | 2351   | 0.583910 | NMT    | No  | Transformer multilingual XX-En
9  | RGNLP     | INDIC ur-en | 2018/09/15 02:42:14 | 2372   | 0.537230 | SMT    | No  | SMT system with KenLM language model
10 | RGNLP     | INDIC ur-en | 2018/09/15 02:54:01 | 2379   | 0.541650 | SMT    | No  | SMT system with SRILM language model
11 | RGNLP     | INDIC ur-en | 2018/09/15 03:19:36 | 2386   | 0.468360 | NMT    | No  | NMT system with a 2-layer LSTM method
12 | Anuvaad   | INDIC ur-en | 2018/09/15 17:51:20 | 2401   | 0.541890 | SMT    | No  | SMT with KenLM
13 | Anuvaad   | INDIC ur-en | 2018/09/15 18:06:08 | 2411   | 0.519810 | SMT    | No  | SMT XX-En
14 | cvit      | INDIC ur-en | 2019/03/14 21:55:15 | 2621   | 0.544170 | NMT    | Yes | massive-multi
15 | cvit      | INDIC ur-en | 2019/03/14 22:12:49 | 2628   | 0.566190 | NMT    | Yes | massive-multi + ft
16 | cvit      | INDIC ur-en | 2019/03/22 05:32:54 | 2651   | 0.571530 | NMT    | No  | many to en (Transformer)
17 | cvit      | INDIC ur-en | 2019/03/22 05:47:10 | 2657   | 0.571530 | NMT    | Yes | many to en (Transformer), detokenized
18 | cvit      | INDIC ur-en | 2019/03/23 12:53:22 | 2665   | 0.578870 | NMT    | Yes | massive-multi e270

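AM-FM combines an adequacy component AM (a similarity between hypothesis and reference in a latent semantic vector space) with a fluency component FM (a target-side n-gram language-model score), linearly interpolated as score = lambda * AM + (1 - lambda) * FM. The sketch below illustrates only the interpolation, with stand-ins that are assumptions (TF-IDF cosine for AM, a length-ratio proxy for FM, lambda = 0.5); it is not the official implementation.

```python
# Toy AM-FM sketch. The real metric uses a latent semantic space for
# adequacy (AM) and an n-gram language model for fluency (FM); the
# stand-ins below are assumptions, for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def toy_amfm(hyp: str, ref: str, lam: float = 0.5) -> float:
    # AM stand-in: TF-IDF bag-of-words cosine between hyp and ref.
    vecs = TfidfVectorizer().fit_transform([hyp, ref])
    am = float(cosine_similarity(vecs[0], vecs[1])[0, 0])
    # FM stand-in: the real FM scores the hypothesis with a language
    # model; a length ratio is used here only to complete the formula.
    h, r = len(hyp.split()), len(ref.split())
    fm = min(h, r) / max(h, r)
    return lam * am + (1 - lam) * fm

print(toy_amfm("the cat sat on the mat", "a cat sat on a mat"))
```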

HUMAN (WAT2014–WAT2022)

No human evaluation results were submitted for this task in any WAT edition from WAT2014 through WAT2022.

EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02