
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


All BLEU scores below come from the moses-tokenizer column; the original table's other tokenizer columns (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) are empty for this task.

| # | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | HwTscSU | SOFTWAREms-en | 2022/07/11 13:19:39 | 6752 | 45.50 | NMT | No | XX-to-XX transformer model trained on GNOME, KDE4, and Ubuntu as well as other data from OPUS, fine-tuned on the dev set |
| 2 | sakura | SOFTWAREms-en | 2021/04/29 13:48:03 | 5823 | 40.97 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model, ensemble of 3 |
| 3 | Bering Lab | SOFTWAREms-en | 2021/04/16 12:15:31 | 5182 | 38.42 | NMT | No | Transformer trained on OPUS with GNOME, KDE4, and Ubuntu weighted |
| 4 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:16:10 | 5905 | 38.42 | NMT | Yes | Extended mBART model, mixed-domain training with domain fine-tuning |
| 5 | JBJBJB | SOFTWAREms-en | 2021/05/02 23:02:00 | 5980 | 36.19 | NMT | Yes | mBART50 with fairseq |
| 6 | NICT-5 | SOFTWAREms-en | 2020/09/18 20:39:09 | 3975 | 26.71 | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-unbalanced |
| 7 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:03:04 | 5897 | 26.48 | NMT | No | Transformer base model, multilingual plus mixed-domain training with domain fine-tuning |
| 8 | NICT-5 | SOFTWAREms-en | 2020/09/18 19:09:05 | 3949 | 26.33 | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-balanced |
| 9 | ORGANIZER | SOFTWAREms-en | 2020/09/01 15:49:38 | 3602 | 25.36 | SMT | No | Baseline MLNMT XX-to-En model using ALT, Ubuntu, GNOME, and KDE4 data from OPUS; Transformer big model, default settings |
| 10 | jyjy | SOFTWAREms-en | 2021/04/26 16:32:43 | 5642 | 15.37 | NMT | No | |
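
For reference, corpus-level BLEU like the scores above can be reproduced with the sacrebleu library, whose `13a` tokenizer approximates the Moses tokenizer used for this column. A minimal sketch, with placeholder sentences rather than WAT data:

```python
# Minimal sketch: corpus-level BLEU with sacrebleu (pip install sacrebleu).
# The "13a" tokenizer approximates the Moses tokenizer used in this table;
# the sentences below are illustrative placeholders, not WAT data.
import sacrebleu

hyps = ["open the file manager to browse your folders"]    # system outputs
refs = [["open the file manager to browse your folders"]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hyps, refs, tokenize="13a")
print(f"BLEU = {bleu.score:.2f}")  # 0-100 scale, as reported above
```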


RIBES


All RIBES scores below likewise come from the moses-tokenizer column; the other tokenizer columns are empty for this task.

| # | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|---------------------|
| 1 | sakura | SOFTWAREms-en | 2021/04/29 13:48:03 | 5823 | 0.819980 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model, ensemble of 3 |
| 2 | HwTscSU | SOFTWAREms-en | 2022/07/11 13:19:39 | 6752 | 0.819582 | NMT | No | XX-to-XX transformer model trained on GNOME, KDE4, and Ubuntu as well as other data from OPUS, fine-tuned on the dev set |
| 3 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:16:10 | 5905 | 0.818175 | NMT | Yes | Extended mBART model, mixed-domain training with domain fine-tuning |
| 4 | Bering Lab | SOFTWAREms-en | 2021/04/16 12:15:31 | 5182 | 0.807058 | NMT | No | Transformer trained on OPUS with GNOME, KDE4, and Ubuntu weighted |
| 5 | JBJBJB | SOFTWAREms-en | 2021/05/02 23:02:00 | 5980 | 0.790488 | NMT | Yes | mBART50 with fairseq |
| 6 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:03:04 | 5897 | 0.760911 | NMT | No | Transformer base model, multilingual plus mixed-domain training with domain fine-tuning |
| 7 | NICT-5 | SOFTWAREms-en | 2020/09/18 19:09:05 | 3949 | 0.752428 | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-balanced |
| 8 | NICT-5 | SOFTWAREms-en | 2020/09/18 20:39:09 | 3975 | 0.752213 | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-unbalanced |
| 9 | ORGANIZER | SOFTWAREms-en | 2020/09/01 15:49:38 | 3602 | 0.749811 | SMT | No | Baseline MLNMT XX-to-En model using ALT, Ubuntu, GNOME, and KDE4 data from OPUS; Transformer big model, default settings |
| 10 | jyjy | SOFTWAREms-en | 2021/04/26 16:32:43 | 5642 | 0.655340 | NMT | No | |
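
For reference, RIBES (Rank-based Intuitive Bilingual Evaluation Score) measures word-order agreement: a normalized Kendall's tau over the rank correlation between hypothesis and reference word positions, scaled by unigram precision and a brevity penalty. Below is a simplified sketch assuming no word repeats within a sentence (the official RIBES.py script disambiguates repeated words by context); alpha and beta are that script's defaults:

```python
# Simplified RIBES sketch. Assumes each word occurs at most once per
# sentence; the official RIBES.py resolves repeated words by context.
import math
from itertools import combinations

ALPHA, BETA = 0.25, 0.10  # defaults in the official RIBES script

def ribes(hyp: str, ref: str) -> float:
    h, r = hyp.split(), ref.split()
    # Position of each hypothesis word in the reference (skip unmatched words).
    ranks = [r.index(w) for w in h if w in r]
    if len(ranks) < 2:
        return 0.0
    # Normalized Kendall's tau: fraction of concordant (in-order) pairs,
    # which equals (tau + 1) / 2 when there are no ties.
    pairs = list(combinations(range(len(ranks)), 2))
    nkt = sum(1 for i, j in pairs if ranks[i] < ranks[j]) / len(pairs)
    precision = len(ranks) / len(h)               # unigram precision
    bp = min(1.0, math.exp(1 - len(r) / len(h)))  # brevity penalty
    return nkt * precision**ALPHA * bp**BETA

print(ribes("the file was saved", "the file was saved"))  # -> 1.0
```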


AMFM


Each AMFM score below comes from the single populated column of the original table (whose header labels all ten score columns "unuse").

| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | sakura | SOFTWAREms-en | 2021/04/29 13:48:03 | 5823 | 0.849354 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model, ensemble of 3 |
| 2 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:16:10 | 5905 | 0.843418 | NMT | Yes | Extended mBART model, mixed-domain training with domain fine-tuning |
| 3 | JBJBJB | SOFTWAREms-en | 2021/05/02 23:02:00 | 5980 | 0.820152 | NMT | Yes | mBART50 with fairseq |
| 4 | Bering Lab | SOFTWAREms-en | 2021/04/16 12:15:31 | 5182 | 0.810730 | NMT | No | Transformer trained on OPUS with GNOME, KDE4, and Ubuntu weighted |
| 5 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:03:04 | 5897 | 0.777501 | NMT | No | Transformer base model, multilingual plus mixed-domain training with domain fine-tuning |
| 6 | ORGANIZER | SOFTWAREms-en | 2020/09/01 15:49:38 | 3602 | 0.767758 | SMT | No | Baseline MLNMT XX-to-En model using ALT, Ubuntu, GNOME, and KDE4 data from OPUS; Transformer big model, default settings |
| 7 | jyjy | SOFTWAREms-en | 2021/04/26 16:32:43 | 5642 | 0.713430 | NMT | No | |
| 8 | NICT-5 | SOFTWAREms-en | 2020/09/18 19:09:05 | 3949 | 0.000000 | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-balanced |
| 9 | NICT-5 | SOFTWAREms-en | 2020/09/18 20:39:09 | 3975 | 0.000000 | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-unbalanced |
| 10 | HwTscSU | SOFTWAREms-en | 2022/07/11 13:19:39 | 6752 | 0.000000 | NMT | No | XX-to-XX transformer model trained on GNOME, KDE4, and Ubuntu as well as other data from OPUS, fine-tuned on the dev set |
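
AM-FM (adequacy-fluency metrics) pairs an adequacy component AM, a similarity between hypothesis and reference in a reduced latent space, with a fluency component FM, a target-side language-model score of the hypothesis alone, and interpolates the two. The sketch below shows only that interpolation under assumed inputs: the toy embedding vectors, the FM value, and the lambda = 0.5 weight are illustrative, not the WAT configuration:

```python
# Hedged AM-FM sketch: the vectors, FM value, and lambda are placeholders.
import numpy as np

def am_score(hyp_vec: np.ndarray, ref_vec: np.ndarray) -> float:
    """Adequacy: cosine similarity of sentence embeddings, clipped to [0, 1]."""
    cos = hyp_vec @ ref_vec / (np.linalg.norm(hyp_vec) * np.linalg.norm(ref_vec))
    return max(0.0, float(cos))

def amfm(am: float, fm: float, lam: float = 0.5) -> float:
    """Linear interpolation of the adequacy and fluency components."""
    return lam * am + (1.0 - lam) * fm

# Toy vectors standing in for latent-space sentence embeddings.
hyp_vec, ref_vec = np.array([0.9, 0.1, 0.4]), np.array([1.0, 0.0, 0.5])
print(f"AMFM = {amfm(am_score(hyp_vec, ref_vec), fm=0.82):.6f}")
```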


HUMAN (WAT2022)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|---------------------|
| 1 | HwTscSU | SOFTWAREms-en | 2022/07/11 13:19:39 | 6752 | 4.390 | NMT | No | XX-to-XX transformer model trained on GNOME, KDE4, and Ubuntu as well as other data from OPUS, fine-tuned on the dev set |


HUMAN (WAT2021)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|---------------------|
| 1 | sakura | SOFTWAREms-en | 2021/04/29 13:48:03 | 5823 | Underway | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model, ensemble of 3 |
| 2 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:16:10 | 5905 | Underway | NMT | Yes | Extended mBART model, mixed-domain training with domain fine-tuning |


HUMAN (WAT2020)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|---------------------|
| 1 | NICT-5 | SOFTWAREms-en | 2020/09/18 20:39:09 | 3975 | Underway | NMT | No | XX-to-XX transformer model trained on ALT plus KDE, GNOME, and Ubuntu data from OPUS; corpora were size-unbalanced |


HUMAN (WAT2019)


(no entries)


HUMAN (WAT2018)


(no entries)


HUMAN (WAT2017)


(no entries)


HUMAN (WAT2016)


(no entries)


HUMAN (WAT2015)


(no entries)


HUMAN (WAT2014)


(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02