
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


All entries are for the INDIC20 en-te (English-to-Telugu) task. Scores are reported under the indic-tokenizer; the remaining tokenizer columns (juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, myseg, kmseg) do not apply to this language pair and are omitted.

#  | Team                         | Date/Time           | DataID | BLEU | Method | Other Resources | System Description
1  | HW-TSC                       | 2020/09/19 17:28:48 | 4054   | 6.93 | NMT | No  | Transformer deep (pre-LN), en2xx; PMI data and filtered PIB data (fastText domain adaptation) + SentencePiece + 4-model ensemble + adapter.
2  | ORGANIZER                    | 2020/09/02 16:42:39 | 3630   | 5.20 | NMT | No  | Baseline MLNMT En-to-XX model using PIB and filtered PMI data. Transformer big model, default settings.
3  | cvit                         | 2020/09/19 18:57:57 | 4061   | 5.12 | NMT | No  | Transformer multilingual baseline model; encoder pre-training, then fine-tuning on English-Telugu parallel corpora.
4  | cvit                         | 2020/09/19 18:45:54 | 4058   | 5.05 | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.
5  | NICT-5                       | 2020/09/18 21:27:49 | 4009   | 4.73 | NMT | No  | XX-to-XX transformer model trained on the officially provided PMI and PIB data. Corpora were size-unbalanced.
6  | cvit                         | 2020/09/18 02:24:55 | 3859   | 4.39 | NMT | Yes | (no description provided)
7  | ODIANLP                      | 2020/09/17 01:51:37 | 3781   | 4.09 | NMT | No  | Transformer base with relative position representations + en-xx model + PMI data.
8  | cvit                         | 2020/09/19 18:29:16 | 4056   | 4.00 | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.
9  | NICT-5                       | 2020/09/18 21:04:54 | 3996   | 3.54 | NMT | No  | XX-to-XX transformer model trained on the officially provided PMI and PIB data. Corpora were size-balanced.
10 | Deterministic Algorithms Lab | 2020/09/18 18:04:54 | 3939   | 2.50 | NMT | No  | XLM model with DAE, MT, MLM, TLM, and back-translation losses.
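
For reference, a corpus-level BLEU score like those above can be approximated offline with sacreBLEU after pre-tokenizing the Telugu text with the Indic NLP Library (a plausible reading of the indic-tokenizer column). A minimal sketch, not the official WAT evaluation script; the file names are placeholders:

# Sketch: corpus-level BLEU over indic-tokenized Telugu output.
# Assumes the sacrebleu and indic-nlp-library packages are installed;
# hypothesis.te and reference.te are placeholder file names, one
# sentence per line.
import sacrebleu
from indicnlp.tokenize import indic_tokenize

def tok(line):
    # Re-join the tokenizer output with spaces so sacreBLEU can score
    # the text with its own tokenization disabled.
    return " ".join(indic_tokenize.trivial_tokenize(line.strip(), lang="te"))

with open("hypothesis.te", encoding="utf-8") as f:
    hyps = [tok(line) for line in f]
with open("reference.te", encoding="utf-8") as f:
    refs = [tok(line) for line in f]

# tokenize="none": the text is already tokenized above.
bleu = sacrebleu.corpus_bleu(hyps, [refs], tokenize="none")
print(f"BLEU = {bleu.score:.2f}")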


RIBES


All entries are for the INDIC20 en-te task. Scores are reported under the indic-tokenizer; the remaining tokenizer columns are omitted as in the BLEU table above.

#  | Team                         | Date/Time           | DataID | RIBES    | Method | Other Resources | System Description
1  | cvit                         | 2020/09/19 18:45:54 | 4058   | 0.592138 | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.
2  | cvit                         | 2020/09/18 02:24:55 | 3859   | 0.582817 | NMT | Yes | (no description provided)
3  | cvit                         | 2020/09/19 18:57:57 | 4061   | 0.580369 | NMT | No  | Transformer multilingual baseline model; encoder pre-training, then fine-tuning on English-Telugu parallel corpora.
4  | NICT-5                       | 2020/09/18 21:27:49 | 4009   | 0.576601 | NMT | No  | XX-to-XX transformer model trained on the officially provided PMI and PIB data. Corpora were size-unbalanced.
5  | HW-TSC                       | 2020/09/19 17:28:48 | 4054   | 0.573374 | NMT | No  | Transformer deep (pre-LN), en2xx; PMI data and filtered PIB data (fastText domain adaptation) + SentencePiece + 4-model ensemble + adapter.
6  | cvit                         | 2020/09/19 18:29:16 | 4056   | 0.572058 | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.
7  | ORGANIZER                    | 2020/09/02 16:42:39 | 3630   | 0.506301 | NMT | No  | Baseline MLNMT En-to-XX model using PIB and filtered PMI data. Transformer big model, default settings.
8  | NICT-5                       | 2020/09/18 21:04:54 | 3996   | 0.502605 | NMT | No  | XX-to-XX transformer model trained on the officially provided PMI and PIB data. Corpora were size-balanced.
9  | ODIANLP                      | 2020/09/17 01:51:37 | 3781   | 0.467728 | NMT | No  | Transformer base with relative position representations + en-xx model + PMI data.
10 | Deterministic Algorithms Lab | 2020/09/18 18:04:54 | 3939   | 0.464004 | NMT | No  | XLM model with DAE, MT, MLM, TLM, and back-translation losses.
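
RIBES (Isozaki et al., 2010) scores word-order agreement: the normalized Kendall's tau of aligned word ranks, scaled by unigram precision raised to alpha and a brevity penalty raised to beta (defaults alpha=0.25, beta=0.10). The official scores come from NTT's RIBES script; NLTK ships a re-implementation that can serve as a rough cross-check. A minimal sketch with placeholder file names:

# Sketch: corpus-level RIBES via NLTK's re-implementation; results may
# differ slightly from the official NTT RIBES.py used by WAT.
from nltk.translate.ribes_score import corpus_ribes

with open("hypothesis.te.tok", encoding="utf-8") as f:
    hyps = [line.split() for line in f]    # pre-tokenized hypotheses
with open("reference.te.tok", encoding="utf-8") as f:
    refs = [[line.split()] for line in f]  # one reference list per sentence

score = corpus_ribes(refs, hyps)           # default alpha=0.25, beta=0.10
print(f"RIBES = {score:.6f}")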


AMFM


All entries are for the INDIC20 en-te task. The segmenter columns were unused for AMFM and are omitted. All submissions received an AMFM score of 0.000000.

#  | Team                         | Date/Time           | DataID | AMFM     | Method | Other Resources | System Description
1  | ORGANIZER                    | 2020/09/02 16:42:39 | 3630   | 0.000000 | NMT | No  | Baseline MLNMT En-to-XX model using PIB and filtered PMI data. Transformer big model, default settings.
2  | ODIANLP                      | 2020/09/17 01:51:37 | 3781   | 0.000000 | NMT | No  | Transformer base with relative position representations + en-xx model + PMI data.
3  | cvit                         | 2020/09/18 02:24:55 | 3859   | 0.000000 | NMT | Yes | (no description provided)
4  | Deterministic Algorithms Lab | 2020/09/18 18:04:54 | 3939   | 0.000000 | NMT | No  | XLM model with DAE, MT, MLM, TLM, and back-translation losses.
5  | NICT-5                       | 2020/09/18 21:04:54 | 3996   | 0.000000 | NMT | No  | XX-to-XX transformer model trained on the officially provided PMI and PIB data. Corpora were size-balanced.
6  | NICT-5                       | 2020/09/18 21:27:49 | 4009   | 0.000000 | NMT | No  | XX-to-XX transformer model trained on the officially provided PMI and PIB data. Corpora were size-unbalanced.
7  | HW-TSC                       | 2020/09/19 17:28:48 | 4054   | 0.000000 | NMT | No  | Transformer deep (pre-LN), en2xx; PMI data and filtered PIB data (fastText domain adaptation) + SentencePiece + 4-model ensemble + adapter.
8  | cvit                         | 2020/09/19 18:29:16 | 4056   | 0.000000 | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.
9  | cvit                         | 2020/09/19 18:45:54 | 4058   | 0.000000 | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.
10 | cvit                         | 2020/09/19 18:57:57 | 4061   | 0.000000 | NMT | No  | Transformer multilingual baseline model; encoder pre-training, then fine-tuning on English-Telugu parallel corpora.


HUMAN (WAT2022)


No entries for this task.

HUMAN (WAT2021)


No entries for this task.

HUMAN (WAT2020)


All entries are for the INDIC20 en-te task. "Underway" indicates that the human evaluation had not yet been completed.

# | Team    | Date/Time           | DataID | HUMAN    | Method | Other Resources | System Description
1 | ODIANLP | 2020/09/17 01:51:37 | 3781   | Underway | NMT | No  | Transformer base with relative position representations + en-xx model + PMI data.
2 | cvit    | 2020/09/18 02:24:55 | 3859   | Underway | NMT | Yes | (no description provided)
3 | HW-TSC  | 2020/09/19 17:28:48 | 4054   | Underway | NMT | No  | Transformer deep (pre-LN), en2xx; PMI data and filtered PIB data (fastText domain adaptation) + SentencePiece + 4-model ensemble + adapter.
4 | cvit    | 2020/09/19 18:45:54 | 4058   | Underway | NMT | No  | Transformer multilingual model; encoder pre-trained on Telugu monolingual corpus, then fine-tuned for English-Telugu translation.


HUMAN (WAT2019)


No entries for this task.

HUMAN (WAT2018)


No entries for this task.

HUMAN (WAT2017)


No entries for this task.

HUMAN (WAT2016)


No entries for this task.

HUMAN (WAT2015)


No entries for this task.

HUMAN (WAT2014)


No entries for this task.

EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores of your system (both automatic and human evaluations), and its rank among the other systems. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02