
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


BLEU scores are reported under three Japanese segmentations of the system output: juman, kytea, and mecab.

| # | Team | Task | Date/Time | DataID | juman | kytea | mecab | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|-------|-------|--------|-----------------|--------------------|
| 1 | GenAI | JPCN3ko-ja | 2024/08/01 18:00:27 | 7180 | 67.00 | 67.40 | 66.90 | NMT | No | Terminology based LLM Translator |
| 2 | GenAI | JPCN3ko-ja | 2024/08/06 20:36:49 | 7234 | 62.80 | 63.10 | 62.70 | NMT | No | Terminology based LLM Translator (mistral-nemo lora) |
| 3 | GenAI | JPCN3ko-ja | 2024/07/23 22:43:03 | 7145 | 62.20 | 62.50 | 61.90 | NMT | No | Terminology based LLM Translator |
| 4 | GenAI | JPCN3ko-ja | 2024/07/24 13:08:38 | 7148 | 61.60 | 62.50 | 61.50 | NMT | No | only for test (by ChatGPT) |
| 5 | EHR | JPCN3ko-ja | 2018/08/31 19:05:13 | 2217 | 53.83 | 55.83 | 54.23 | NMT | No | SMT reranked NMT |
| 6 | KNU_Hyundai | JPCN3ko-ja | 2019/07/27 00:12:31 | 3096 | 53.56 | 55.68 | 54.02 | NMT | No | Transformer + Relative Position + ensemble |
| 7 | sarah | JPCN3ko-ja | 2019/07/26 16:28:31 | 3017 | 53.59 | 55.68 | 53.94 | NMT | No | Transformer, ensemble of 4 models |
| 8 | TMU | JPCN3ko-ja | 2020/09/17 22:51:54 | 3834 | 52.90 | 55.06 | 53.34 | NMT | No | Transformer, domain adaptation (BERT Japanese), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 9 | TMU | JPCN3ko-ja | 2020/10/13 09:48:09 | 4140 | 52.85 | 54.92 | 53.24 | NMT | No | Transformer, domain adaptation (BERT Korean), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 10 | TMU | JPCN3ko-ja | 2020/10/13 10:05:00 | 4144 | 52.76 | 55.04 | 53.24 | NMT | No | Transformer, domain adaptation (BERT Korean), shared EMB, shared BPE, ensemble of 4 models |
| 11 | TMU | JPCN3ko-ja | 2020/09/17 22:53:20 | 3835 | 52.74 | 55.04 | 53.23 | NMT | No | Transformer, domain adaptation (BERT Japanese), shared EMB, shared BPE, ensemble of 4 models |
| 12 | Bering Lab | JPCN3ko-ja | 2021/04/25 03:01:59 | 5515 | 52.74 | 54.55 | 53.15 | NMT | Yes | Transformer ensemble with additional crawled parallel corpus |
| 13 | sakura | JPCN3ko-ja | 2024/08/09 00:57:56 | 7311 | 51.90 | 54.10 | 52.30 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 14 | ryan | JPCN3ko-ja | 2019/07/23 12:03:36 | 2840 | 51.73 | 53.89 | 52.16 | NMT | No | Base Transformer |
| 15 | ORGANIZER | JPCN3ko-ja | 2018/08/15 16:26:48 | 1948 | 52.02 | 53.93 | 51.99 | NMT | No | NMT with Attention |
| 16 | goku20 | JPCN3ko-ja | 2020/09/22 10:22:53 | 4124 | 50.83 | 53.20 | 51.27 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 17 | sakura | JPCN3ko-ja | 2024/08/09 00:56:13 | 7310 | 50.90 | 53.10 | 51.20 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
| 18 | goku20 | JPCN3ko-ja | 2020/09/21 12:27:10 | 4101 | 49.82 | 52.15 | 50.32 | NMT | No | mBART pre-training transformer, single model |
| 19 | tpt_wat | JPCN3ko-ja | 2021/04/27 02:09:34 | 5701 | 49.34 | 51.76 | 49.83 | NMT | No | Base Transformer model with joint vocab, size 8k |
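The three BLEU columns differ only in how the Japanese output is re-segmented before scoring. As a minimal sketch, a corpus BLEU over such pre-segmented text can be reproduced with the sacrebleu library; the file names below are hypothetical stand-ins for output and reference files that have already been tokenized with one of the segmenters (here, mecab):

```python
# Minimal sketch: corpus BLEU over pre-segmented text, assuming sacrebleu
# is installed. hyp.mecab.ja and ref.mecab.ja are hypothetical files whose
# lines were already tokenized with the segmenter under evaluation.
import sacrebleu

with open("hyp.mecab.ja", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("ref.mecab.ja", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# tokenize="none" leaves the external segmentation intact, so the score
# depends on the chosen segmenter -- which is why the three columns differ.
bleu = sacrebleu.corpus_bleu(hypotheses, [references], tokenize="none")
print(f"BLEU = {bleu.score:.2f}")
```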


RIBES


RIBES scores are reported under the same three Japanese segmentations: juman, kytea, and mecab.

| # | Team | Task | Date/Time | DataID | juman | kytea | mecab | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|-------|-------|--------|-----------------|--------------------|
| 1 | GenAI | JPCN3ko-ja | 2024/08/01 18:00:27 | 7180 | 0.924474 | 0.919657 | 0.923416 | NMT | No | Terminology based LLM Translator |
| 2 | GenAI | JPCN3ko-ja | 2024/07/23 22:43:03 | 7145 | 0.916385 | 0.912133 | 0.914275 | NMT | No | Terminology based LLM Translator |
| 3 | GenAI | JPCN3ko-ja | 2024/07/24 13:08:38 | 7148 | 0.912482 | 0.907932 | 0.911476 | NMT | No | only for test (by ChatGPT) |
| 4 | GenAI | JPCN3ko-ja | 2024/08/06 20:36:49 | 7234 | 0.912592 | 0.908497 | 0.910660 | NMT | No | Terminology based LLM Translator (mistral-nemo lora) |
| 5 | TMU | JPCN3ko-ja | 2020/10/13 10:05:00 | 4144 | 0.906113 | 0.903179 | 0.906320 | NMT | No | Transformer, domain adaptation (BERT Korean), shared EMB, shared BPE, ensemble of 4 models |
| 6 | TMU | JPCN3ko-ja | 2020/10/13 09:48:09 | 4140 | 0.905973 | 0.903387 | 0.906083 | NMT | No | Transformer, domain adaptation (BERT Korean), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 7 | EHR | JPCN3ko-ja | 2018/08/31 19:05:13 | 2217 | 0.907358 | 0.903857 | 0.905654 | NMT | No | SMT reranked NMT |
| 8 | TMU | JPCN3ko-ja | 2020/09/17 22:51:54 | 3834 | 0.905023 | 0.903007 | 0.905224 | NMT | No | Transformer, domain adaptation (BERT Japanese), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 9 | TMU | JPCN3ko-ja | 2020/09/17 22:53:20 | 3835 | 0.904839 | 0.902850 | 0.904985 | NMT | No | Transformer, domain adaptation (BERT Japanese), shared EMB, shared BPE, ensemble of 4 models |
| 10 | goku20 | JPCN3ko-ja | 2020/09/22 10:22:53 | 4124 | 0.904912 | 0.901899 | 0.904054 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 11 | goku20 | JPCN3ko-ja | 2020/09/21 12:27:10 | 4101 | 0.903483 | 0.900496 | 0.902749 | NMT | No | mBART pre-training transformer, single model |
| 12 | Bering Lab | JPCN3ko-ja | 2021/04/25 03:01:59 | 5515 | 0.902984 | 0.898627 | 0.902621 | NMT | Yes | Transformer ensemble with additional crawled parallel corpus |
| 13 | sarah | JPCN3ko-ja | 2019/07/26 16:28:31 | 3017 | 0.903211 | 0.900313 | 0.902430 | NMT | No | Transformer, ensemble of 4 models |
| 14 | KNU_Hyundai | JPCN3ko-ja | 2019/07/27 00:12:31 | 3096 | 0.901627 | 0.900091 | 0.901877 | NMT | No | Transformer + Relative Position + ensemble |
| 15 | ryan | JPCN3ko-ja | 2019/07/23 12:03:36 | 2840 | 0.902420 | 0.900345 | 0.901521 | NMT | No | Base Transformer |
| 16 | tpt_wat | JPCN3ko-ja | 2021/04/27 02:09:34 | 5701 | 0.898920 | 0.897229 | 0.899685 | NMT | No | Base Transformer model with joint vocab, size 8k |
| 17 | sakura | JPCN3ko-ja | 2024/08/09 00:57:56 | 7311 | 0.899781 | 0.896489 | 0.898412 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 18 | ORGANIZER | JPCN3ko-ja | 2018/08/15 16:26:48 | 1948 | 0.897348 | 0.896897 | 0.898316 | NMT | No | NMT with Attention |
| 19 | sakura | JPCN3ko-ja | 2024/08/09 00:56:13 | 7310 | 0.882893 | 0.880378 | 0.882207 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
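RIBES rewards word-order agreement: it takes Kendall's tau over the reference positions of aligned words, normalizes it to [0, 1], and discounts it by unigram precision and a brevity penalty (weights alpha = 0.25 and beta = 0.10 in the reference implementation). The official tool is NTT's RIBES.py; the sketch below is a simplified single-reference version that aligns only words occurring exactly once in both sentences, whereas the official script also disambiguates repeated words by context:

```python
import math

ALPHA, BETA = 0.25, 0.10  # default weights from the RIBES paper

def ribes_sentence(hyp, ref):
    """Simplified sentence-level RIBES over token lists: NKT * p1**ALPHA * bp**BETA."""
    # One-to-one alignment heuristic: keep hypothesis words that occur
    # exactly once in both sentences, and record their reference positions.
    pos = [ref.index(w) for w in hyp if hyp.count(w) == 1 and ref.count(w) == 1]
    if len(pos) < 2:
        return 0.0
    pairs = len(pos) * (len(pos) - 1) // 2
    ascending = sum(1 for i in range(len(pos)) for j in range(i + 1, len(pos))
                    if pos[i] < pos[j])
    nkt = ascending / pairs                  # normalized Kendall's tau in [0, 1]
    p1 = len(pos) / len(hyp)                 # unigram precision over aligned words
    bp = min(1.0, math.exp(1.0 - len(ref) / len(hyp)))  # brevity penalty
    return nkt * (p1 ** ALPHA) * (bp ** BETA)

# Example: one swapped word pair lowers the score to 5/6 of the maximum.
# ribes_sentence("a b c d".split(), "a b d c".split())  # ~0.83
```

Corpus-level RIBES, as reported in the table above, is the average of the sentence-level scores over the test set.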


AMFM


| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | Bering Lab | JPCN3ko-ja | 2021/04/25 03:01:59 | 5515 | 0.908463 | NMT | Yes | Transformer ensemble with additional crawled parallel corpus |
| 2 | tpt_wat | JPCN3ko-ja | 2021/04/27 02:09:34 | 5701 | 0.898732 | NMT | No | Base Transformer model with joint vocab, size 8k |
| 3 | ORGANIZER | JPCN3ko-ja | 2018/08/15 16:26:48 | 1948 | 0.000000 | NMT | No | NMT with Attention |
| 4 | EHR | JPCN3ko-ja | 2018/08/31 19:05:13 | 2217 | 0.000000 | NMT | No | SMT reranked NMT |
| 5 | ryan | JPCN3ko-ja | 2019/07/23 12:03:36 | 2840 | 0.000000 | NMT | No | Base Transformer |
| 6 | sarah | JPCN3ko-ja | 2019/07/26 16:28:31 | 3017 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 7 | KNU_Hyundai | JPCN3ko-ja | 2019/07/27 00:12:31 | 3096 | 0.000000 | NMT | No | Transformer + Relative Position + ensemble |
| 8 | TMU | JPCN3ko-ja | 2020/09/17 22:51:54 | 3834 | 0.000000 | NMT | No | Transformer, domain adaptation (BERT Japanese), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 9 | TMU | JPCN3ko-ja | 2020/09/17 22:53:20 | 3835 | 0.000000 | NMT | No | Transformer, domain adaptation (BERT Japanese), shared EMB, shared BPE, ensemble of 4 models |
| 10 | goku20 | JPCN3ko-ja | 2020/09/21 12:27:10 | 4101 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 11 | goku20 | JPCN3ko-ja | 2020/09/22 10:22:53 | 4124 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 12 | TMU | JPCN3ko-ja | 2020/10/13 09:48:09 | 4140 | 0.000000 | NMT | No | Transformer, domain adaptation (BERT Korean), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 13 | TMU | JPCN3ko-ja | 2020/10/13 10:05:00 | 4144 | 0.000000 | NMT | No | Transformer, domain adaptation (BERT Korean), shared EMB, shared BPE, ensemble of 4 models |
| 14 | GenAI | JPCN3ko-ja | 2024/07/23 22:43:03 | 7145 | 0.000000 | NMT | No | Terminology based LLM Translator |
| 15 | GenAI | JPCN3ko-ja | 2024/07/24 13:08:38 | 7148 | 0.000000 | NMT | No | only for test (by ChatGPT) |
| 16 | GenAI | JPCN3ko-ja | 2024/08/01 18:00:27 | 7180 | 0.000000 | NMT | No | Terminology based LLM Translator |
| 17 | GenAI | JPCN3ko-ja | 2024/08/06 20:36:49 | 7234 | 0.000000 | NMT | No | Terminology based LLM Translator (mistral-nemo lora) |
| 18 | sakura | JPCN3ko-ja | 2024/08/09 00:56:13 | 7310 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
| 19 | sakura | JPCN3ko-ja | 2024/08/09 00:57:56 | 7311 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
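AM-FM combines an adequacy measure (AM), the cosine similarity between hypothesis and reference in a reduced vector space, with a fluency measure (FM) from an n-gram language model. Below is a rough monolingual sketch of that idea, assuming scikit-learn for the latent space; the combination weight `lam`, the SVD dimensionality, and the caller-supplied fluency score are illustrative assumptions, not the configuration used by the WAT evaluation:

```python
# Rough sketch of the AM-FM idea, not the WAT implementation.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

def adequacy(hyp: str, ref: str, corpus: list[str], dim: int = 50) -> float:
    """AM: cosine similarity of hyp and ref in an SVD-reduced TF-IDF space."""
    vec = TfidfVectorizer().fit(corpus)
    # dim must stay below the vocabulary size of the training corpus.
    svd = TruncatedSVD(n_components=dim).fit(vec.transform(corpus))
    h, r = svd.transform(vec.transform([hyp, ref]))
    return float(np.dot(h, r) / (np.linalg.norm(h) * np.linalg.norm(r) + 1e-9))

def am_fm(hyp: str, ref: str, corpus: list[str],
          fluency: float, lam: float = 0.5) -> float:
    """Weighted combination; `fluency` is a [0, 1] language-model score for hyp."""
    return lam * adequacy(hyp, ref, corpus) + (1.0 - lam) * fluency
```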


HUMAN (WAT2022)


No entries.

HUMAN (WAT2021)


No entries.

HUMAN (WAT2020)


No entries.

HUMAN (WAT2019)


| # | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description |
|---|------|------|-----------|--------|-------|--------|-----------------|--------------------|
| 1 | sarah | JPCN3ko-ja | 2019/07/26 16:28:31 | 3017 | Underway | NMT | No | Transformer, ensemble of 4 models |
| 2 | KNU_Hyundai | JPCN3ko-ja | 2019/07/27 00:12:31 | 3096 | Underway | NMT | No | Transformer + Relative Position + ensemble |

HUMAN (WAT2018)


No entries.

HUMAN (WAT2017)


No entries.

HUMAN (WAT2016)


No entries.

HUMAN (WAT2015)


No entries.

HUMAN (WAT2014)


No entries.

EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02