
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


BLEU scores for this task were computed with the indic-tokenizer only; the other tokenizer columns of the original table (juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, unuse, myseg, kmseg) are empty and omitted here.

# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description
1 | SRPOL | INDIC21 en-ml | 2021/05/04 15:20:32 | 6236 | 15.49 | NMT | No | Ensemble of one-to-many models on all data. Pretrained on back-translated data, fine-tuned on PMI.
2 | SRPOL | INDIC21 en-ml | 2021/05/04 16:26:16 | 6262 | 15.43 | NMT | No | One-to-many model on all data. Pretrained on back-translated data, fine-tuned on PMI.
3 | CFILT | INDIC21 en-ml | 2021/05/04 01:01:47 | 6046 | 12.79 | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on the Transformer with a shared encoder and decoder.
4 | IIIT-H | INDIC21 en-ml | 2021/05/03 18:08:56 | 6009 | 12.76 | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning.
5 | SRPOL | INDIC21 en-ml | 2021/04/21 19:20:16 | 5319 | 11.93 | NMT | No | Base Transformer on all WAT21 data.
6 | sakura | INDIC21 en-ml | 2021/05/01 11:30:09 | 5886 | 10.94 | NMT | No | Fine-tuning of a multilingual mBART one-to-many model with the training corpus.
7 | sakura | INDIC21 en-ml | 2021/05/04 04:10:37 | 6154 | 8.13 | NMT | No | Pre-training of a multilingual mBART one-to-many model with the training corpus, followed by fine-tuning on PMI parallel data.
8 | NICT-5 | INDIC21 en-ml | 2021/06/25 11:38:07 | 6487 | 7.48 | NMT | No | PMI and PIB data used to fine-tune an mBART model trained for over 5 epochs. MNMT model.
9 | NICT-5 | INDIC21 en-ml | 2021/04/22 11:52:07 | 5356 | 6.51 | NMT | No | mBART+MNMT. Beam 4.
10 | coastal | INDIC21 en-ml | 2021/05/04 01:38:17 | 6081 | 6.27 | NMT | No | Seq2seq model trained on all WAT2021 data.
11 | mcairt | INDIC21 en-ml | 2021/05/03 17:52:33 | 6002 | 6.17 | NMT | No | Multilingual (one-to-many) model trained on all WAT2021 data using the base Transformer.
12 | NICT-5 | INDIC21 en-ml | 2021/04/21 15:43:18 | 5281 | 5.98 | NMT | No | mBART pretrained on IndicCorp, fine-tuned on bilingual PMI data. Beam search. The model is bilingual.
13 | NLPHut | INDIC21 en-ml | 2021/03/19 16:16:46 | 4590 | 4.57 | NMT | No | Transformer with target-language tag, trained on PMI data for all languages, then fine-tuned on en-ml PMI data.
14 | IITP-MT | INDIC21 en-ml | 2021/05/04 17:56:02 | 6287 | 3.79 | NMT | No | One-to-many model trained on all training data with the base Transformer. All Indic-language data is romanized. Fine-tuned on the back-translated PMI monolingual corpus.
15 | ORGANIZER | INDIC21 en-ml | 2021/04/08 17:22:55 | 4796 | 3.34 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3.
16 | gaurvar | INDIC21 en-ml | 2021/04/25 19:58:41 | 5582 | 1.79 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
17 | gaurvar | INDIC21 en-ml | 2021/05/01 19:30:52 | 5930 | 1.48 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
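
For reference, a minimal sketch of how corpus-level BLEU in the style of this table could be computed. It assumes sacreBLEU for scoring and the Indic NLP Library's trivial tokenizer standing in for the indic-tokenizer column; the file names are placeholders, and the exact WAT scoring pipeline may differ.

```python
# Minimal sketch: corpus-level BLEU as reported in the table above.
# Assumptions: sacreBLEU for scoring, the Indic NLP Library's trivial
# tokenizer as the "indic-tokenizer", and placeholder file names
# (hyp.ml = system output, ref.ml = reference, one sentence per line).
from sacrebleu.metrics import BLEU
from indicnlp.tokenize.indic_tokenize import trivial_tokenize

def indic_tok(line: str, lang: str = "ml") -> str:
    # Tokenize one Malayalam line and rejoin the tokens with spaces.
    return " ".join(trivial_tokenize(line, lang))

with open("hyp.ml", encoding="utf-8") as f:
    hyps = [indic_tok(line.strip()) for line in f]
with open("ref.ml", encoding="utf-8") as f:
    refs = [indic_tok(line.strip()) for line in f]

# tokenize="none": only the Indic tokenization applied above is used.
bleu = BLEU(tokenize="none")
print(bleu.corpus_score(hyps, [refs]))
```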


RIBES


RIBES scores were likewise computed with the indic-tokenizer only; the other tokenizer columns are empty and omitted here.

# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description
1 | SRPOL | INDIC21 en-ml | 2021/05/04 15:20:32 | 6236 | 0.736915 | NMT | No | Ensemble of one-to-many models on all data. Pretrained on back-translated data, fine-tuned on PMI.
2 | SRPOL | INDIC21 en-ml | 2021/05/04 16:26:16 | 6262 | 0.734111 | NMT | No | One-to-many model on all data. Pretrained on back-translated data, fine-tuned on PMI.
3 | CFILT | INDIC21 en-ml | 2021/05/04 01:01:47 | 6046 | 0.707437 | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on the Transformer with a shared encoder and decoder.
4 | SRPOL | INDIC21 en-ml | 2021/04/21 19:20:16 | 5319 | 0.687422 | NMT | No | Base Transformer on all WAT21 data.
5 | sakura | INDIC21 en-ml | 2021/05/01 11:30:09 | 5886 | 0.686534 | NMT | No | Fine-tuning of a multilingual mBART one-to-many model with the training corpus.
6 | IIIT-H | INDIC21 en-ml | 2021/05/03 18:08:56 | 6009 | 0.672331 | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning.
7 | sakura | INDIC21 en-ml | 2021/05/04 04:10:37 | 6154 | 0.663139 | NMT | No | Pre-training of a multilingual mBART one-to-many model with the training corpus, followed by fine-tuning on PMI parallel data.
8 | NICT-5 | INDIC21 en-ml | 2021/06/25 11:38:07 | 6487 | 0.641463 | NMT | No | PMI and PIB data used to fine-tune an mBART model trained for over 5 epochs. MNMT model.
9 | NICT-5 | INDIC21 en-ml | 2021/04/22 11:52:07 | 5356 | 0.623301 | NMT | No | mBART+MNMT. Beam 4.
10 | mcairt | INDIC21 en-ml | 2021/05/03 17:52:33 | 6002 | 0.622598 | NMT | No | Multilingual (one-to-many) model trained on all WAT2021 data using the base Transformer.
11 | coastal | INDIC21 en-ml | 2021/05/04 01:38:17 | 6081 | 0.619774 | NMT | No | Seq2seq model trained on all WAT2021 data.
12 | NICT-5 | INDIC21 en-ml | 2021/04/21 15:43:18 | 5281 | 0.605053 | NMT | No | mBART pretrained on IndicCorp, fine-tuned on bilingual PMI data. Beam search. The model is bilingual.
13 | NLPHut | INDIC21 en-ml | 2021/03/19 16:16:46 | 4590 | 0.554478 | NMT | No | Transformer with target-language tag, trained on PMI data for all languages, then fine-tuned on en-ml PMI data.
14 | ORGANIZER | INDIC21 en-ml | 2021/04/08 17:22:55 | 4796 | 0.475441 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3.
15 | IITP-MT | INDIC21 en-ml | 2021/05/04 17:56:02 | 6287 | 0.437679 | NMT | No | One-to-many model trained on all training data with the base Transformer. All Indic-language data is romanized. Fine-tuned on the back-translated PMI monolingual corpus.
16 | gaurvar | INDIC21 en-ml | 2021/04/25 19:58:41 | 5582 | 0.338533 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
17 | gaurvar | INDIC21 en-ml | 2021/05/01 19:30:52 | 5930 | 0.306966 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
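
RIBES (Isozaki et al., 2010) rewards word-order agreement: it combines a normalized Kendall's tau over hypothesis-reference word correspondences with unigram precision and a brevity penalty. Below is a toy sketch of that core under the simplifying assumption of a unique one-to-one word correspondence; the official RIBES script also resolves ambiguous and repeated matches, which this sketch skips.

```python
# Toy sketch of the RIBES core: normalized Kendall's tau over word-order
# correspondences, scaled by precision**alpha and brevity_penalty**beta
# (the paper's defaults are alpha=0.25, beta=0.10).
# Simplifying assumption: each hypothesis word matches at most one
# reference position (ref.index finds only the first occurrence).
import math
from itertools import combinations

def ribes_sentence(hyp, ref, alpha=0.25, beta=0.10):
    ranks = [ref.index(w) for w in hyp if w in ref]
    if len(ranks) < 2:
        return 0.0
    pairs = list(combinations(range(len(ranks)), 2))
    # Fraction of concordant pairs = (Kendall's tau + 1) / 2.
    nkt = sum(ranks[i] < ranks[j] for i, j in pairs) / len(pairs)
    precision = len(ranks) / len(hyp)                 # unigram precision
    bp = min(1.0, math.exp(1 - len(ref) / len(hyp)))  # brevity penalty
    return nkt * (precision ** alpha) * (bp ** beta)

print(ribes_sentence("a b c d".split(), "a c b d".split()))  # ~0.833
```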


AMFM


The ten unused tokenizer columns of the original AMFM header are omitted here.

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | SRPOL | INDIC21 en-ml | 2021/05/04 16:26:16 | 6262 | 0.808089 | NMT | No | One-to-many model on all data. Pretrained on back-translated data, fine-tuned on PMI.
2 | SRPOL | INDIC21 en-ml | 2021/05/04 15:20:32 | 6236 | 0.807998 | NMT | No | Ensemble of one-to-many models on all data. Pretrained on back-translated data, fine-tuned on PMI.
3 | CFILT | INDIC21 en-ml | 2021/05/04 01:01:47 | 6046 | 0.805291 | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on the Transformer with a shared encoder and decoder.
4 | sakura | INDIC21 en-ml | 2021/05/04 04:10:37 | 6154 | 0.801941 | NMT | No | Pre-training of a multilingual mBART one-to-many model with the training corpus, followed by fine-tuning on PMI parallel data.
5 | SRPOL | INDIC21 en-ml | 2021/04/21 19:20:16 | 5319 | 0.798649 | NMT | No | Base Transformer on all WAT21 data.
6 | sakura | INDIC21 en-ml | 2021/05/01 11:30:09 | 5886 | 0.794481 | NMT | No | Fine-tuning of a multilingual mBART one-to-many model with the training corpus.
7 | mcairt | INDIC21 en-ml | 2021/05/03 17:52:33 | 6002 | 0.793308 | NMT | No | Multilingual (one-to-many) model trained on all WAT2021 data using the base Transformer.
8 | NICT-5 | INDIC21 en-ml | 2021/04/22 11:52:07 | 5356 | 0.789337 | NMT | No | mBART+MNMT. Beam 4.
9 | coastal | INDIC21 en-ml | 2021/05/04 01:38:17 | 6081 | 0.784292 | NMT | No | Seq2seq model trained on all WAT2021 data.
10 | NICT-5 | INDIC21 en-ml | 2021/04/21 15:43:18 | 5281 | 0.764924 | NMT | No | mBART pretrained on IndicCorp, fine-tuned on bilingual PMI data. Beam search. The model is bilingual.
11 | IITP-MT | INDIC21 en-ml | 2021/05/04 17:56:02 | 6287 | 0.758960 | NMT | No | One-to-many model trained on all training data with the base Transformer. All Indic-language data is romanized. Fine-tuned on the back-translated PMI monolingual corpus.
12 | IIIT-H | INDIC21 en-ml | 2021/05/03 18:08:56 | 6009 | 0.745043 | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning.
13 | NLPHut | INDIC21 en-ml | 2021/03/19 16:16:46 | 4590 | 0.740136 | NMT | No | Transformer with target-language tag, trained on PMI data for all languages, then fine-tuned on en-ml PMI data.
14 | ORGANIZER | INDIC21 en-ml | 2021/04/08 17:22:55 | 4796 | 0.706782 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3.
15 | gaurvar | INDIC21 en-ml | 2021/04/25 19:58:41 | 5582 | 0.666547 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
16 | gaurvar | INDIC21 en-ml | 2021/05/01 19:30:52 | 5930 | 0.656847 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
17 | NICT-5 | INDIC21 en-ml | 2021/06/25 11:38:07 | 6487 | 0.000000 | NMT | No | PMI and PIB data used to fine-tune an mBART model trained for over 5 epochs. MNMT model.
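
AM-FM (Banchs et al.) scores adequacy (AM) as similarity between hypothesis and reference in a reduced vector space and fluency (FM) with a target-side language model, then interpolates the two. The sketch below is illustrative only: the TF-IDF+SVD vectorizer, the dimensionality, the stand-in FM value, and lambda are all assumptions, not WAT's actual configuration.

```python
# Illustrative sketch of the AM-FM idea: adequacy (AM) as cosine
# similarity in a latent space, fluency (FM) as a separately computed
# language-model score in [0, 1], combined by linear interpolation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def am_score(hyp: str, ref: str, corpus: list[str]) -> float:
    # Fit the latent space on a monolingual corpus (needs more distinct
    # terms than SVD components), then project hyp and ref into it.
    vec = TfidfVectorizer().fit(corpus)
    svd = TruncatedSVD(n_components=2).fit(vec.transform(corpus))
    h, r = svd.transform(vec.transform([hyp, ref]))
    return float(cosine_similarity([h], [r])[0, 0])

def amfm(am: float, fm: float, lam: float = 0.5) -> float:
    # Weighted interpolation of adequacy and fluency.
    return lam * am + (1 - lam) * fm

corpus = ["the cat sat on the mat", "a dog lay on the rug", "cats and dogs play"]
print(amfm(am_score("the cat sat on a mat", "the cat sat on the mat", corpus),
           fm=0.8))  # fm=0.8 is a stand-in value, not a real LM score
```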


HUMAN (WAT2022)


(no entries)

HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NLPHut | INDIC21 en-ml | 2021/03/19 16:16:46 | 4590 | Underway | NMT | No | Transformer with target-language tag, trained on PMI data for all languages, then fine-tuned on en-ml PMI data.
2 | NICT-5 | INDIC21 en-ml | 2021/04/21 15:43:18 | 5281 | Underway | NMT | No | mBART pretrained on IndicCorp, fine-tuned on bilingual PMI data. Beam search. The model is bilingual.
3 | NICT-5 | INDIC21 en-ml | 2021/04/22 11:52:07 | 5356 | Underway | NMT | No | mBART+MNMT. Beam 4.
4 | gaurvar | INDIC21 en-ml | 2021/04/25 19:58:41 | 5582 | Underway | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
5 | sakura | INDIC21 en-ml | 2021/05/01 11:30:09 | 5886 | Underway | NMT | No | Fine-tuning of a multilingual mBART one-to-many model with the training corpus.
6 | gaurvar | INDIC21 en-ml | 2021/05/01 19:30:52 | 5930 | Underway | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
7 | mcairt | INDIC21 en-ml | 2021/05/03 17:52:33 | 6002 | Underway | NMT | No | Multilingual (one-to-many) model trained on all WAT2021 data using the base Transformer.
8 | IIIT-H | INDIC21 en-ml | 2021/05/03 18:08:56 | 6009 | Underway | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning.
9 | CFILT | INDIC21 en-ml | 2021/05/04 01:01:47 | 6046 | Underway | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on the Transformer with a shared encoder and decoder.
10 | coastal | INDIC21 en-ml | 2021/05/04 01:38:17 | 6081 | Underway | NMT | No | Seq2seq model trained on all WAT2021 data.
11 | SRPOL | INDIC21 en-ml | 2021/05/04 15:20:32 | 6236 | Underway | NMT | No | Ensemble of one-to-many models on all data. Pretrained on back-translated data, fine-tuned on PMI.
12 | SRPOL | INDIC21 en-ml | 2021/05/04 16:26:16 | 6262 | Underway | NMT | No | One-to-many model on all data. Pretrained on back-translated data, fine-tuned on PMI.
13 | IITP-MT | INDIC21 en-ml | 2021/05/04 17:56:02 | 6287 | Underway | NMT | No | One-to-many model trained on all training data with the base Transformer. All Indic-language data is romanized. Fine-tuned on the back-translated PMI monolingual corpus.


HUMAN (WAT2020)


(no entries)

HUMAN (WAT2019)


(no entries)

HUMAN (WAT2018)


(no entries)

HUMAN (WAT2017)


(no entries)

HUMAN (WAT2016)


(no entries)

HUMAN (WAT2015)


(no entries)

HUMAN (WAT2014)


(no entries)

EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,

you may use the information about translation directions, scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02