
WAT

The Workshop on Asian Translation
Evaluation Results

[EVALUATION RESULTS TOP] | [BLEU] | [RIBES] | [AMFM] | [HUMAN (WAT2022)] | [HUMAN (WAT2021)] | [HUMAN (WAT2020)] | [HUMAN (WAT2019)] | [HUMAN (WAT2018)] | [HUMAN (WAT2017)] | [HUMAN (WAT2016)] | [HUMAN (WAT2015)] | [HUMAN (WAT2014)] | [EVALUATION RESULTS USAGE POLICY]

BLEU


The three BLEU figures per entry correspond to the Japanese output segmented with juman, kytea, and mecab, respectively. The original page's remaining tokenizer columns (moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) were empty or zero for every entry and are omitted here.

# | Team | Task | Date/Time | DataID | BLEU (juman / kytea / mecab) | Method | Other Resources | System Description
1 | JAPIO | JPCzh-ja | 2016/08/17 11:48:56 | 1161 | 58.66 / 59.19 / 58.63 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset
2 | KNU_Hyundai | JPCzh-ja | 2019/07/27 08:29:23 | 3153 | 53.59 / 53.91 / 53.47 | NMT | Yes | Transformer(base) + *Used ASPEC corpus* with relative position, bt, multi source, r2l rerank, 5-model ensemble
3 | Bering Lab | JPCzh-ja | 2021/05/04 06:53:17 | 6171 | 52.99 / 53.57 / 53.03 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
4 | sakura | JPCzh-ja | 2024/08/08 19:35:56 | 7264 | 52.60 / 53.40 / 52.80 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best
5 | sakura | JPCzh-ja | 2024/08/09 00:24:08 | 7296 | 52.40 / 53.20 / 52.50 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh)
6 | sarah | JPCzh-ja | 2019/07/26 11:32:28 | 2976 | 50.90 / 52.04 / 51.09 | NMT | No | Transformer, ensemble of 4 models
7 | JAPIO | JPCzh-ja | 2017/07/25 12:22:07 | 1447 | 50.52 / 51.25 / 50.57 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
8 | JAPIO | JPCzh-ja | 2017/07/26 14:21:18 | 1484 | 50.06 / 50.51 / 50.00 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
9 | ryan | JPCzh-ja | 2019/07/25 22:12:26 | 2954 | 49.98 / 50.67 / 50.12 | NMT | No | Base Transformer
10 | JAPIO | JPCzh-ja | 2017/07/26 14:09:22 | 1482 | 49.51 / 50.00 / 49.48 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
11 | tpt_wat | JPCzh-ja | 2021/04/27 01:42:09 | 5690 | 49.42 / 50.46 / 49.55 | NMT | No | Base Transformer model with shared vocab 8k size
12 | goku20 | JPCzh-ja | 2020/09/21 11:54:22 | 4087 | 49.07 / 50.04 / 49.34 | NMT | No | mBART pre-training transformer, single model
13 | goku20 | JPCzh-ja | 2020/09/22 00:04:26 | 4105 | 48.98 / 49.92 / 49.21 | NMT | No | mBART pre-training transformer, ensemble of 3 models
14 | USTC | JPCzh-ja | 2018/08/31 17:24:35 | 2206 | 48.37 / 49.78 / 48.57 | NMT | No | tensor2tensor, 4 model average, r2l rerank
15 | EHR | JPCzh-ja | 2018/08/31 18:51:15 | 2210 | 48.10 / 48.51 / 47.96 | NMT | No | SMT reranked NMT
16 | EHR | JPCzh-ja | 2017/07/19 19:28:31 | 1408 | 47.08 / 47.44 / 46.83 | NMT | No | SMT reranked NMT (word based, by Moses and OpenNMT)
17 | EHR | JPCzh-ja | 2017/07/19 20:41:27 | 1414 | 46.52 / 47.17 / 46.35 | NMT | No | SMT reranked NMT (character based, by Moses and OpenNMT)
18 | ORGANIZER | JPCzh-ja | 2018/08/15 18:29:31 | 1963 | 46.32 / 46.73 / 46.11 | NMT | No | NMT with Attention
19 | EHR | JPCzh-ja | 2017/07/19 20:45:00 | 1415 | 46.03 / 46.42 / 45.95 | NMT | No | Simple NMT (word based, by OpenNMT)
20 | EHR | JPCzh-ja | 2017/07/19 19:35:03 | 1409 | 45.27 / 45.87 / 45.24 | NMT | No | Simple NMT (character based, by OpenNMT)
21 | JAPIO | JPCzh-ja | 2017/07/25 18:26:52 | 1458 | 45.07 / 45.79 / 45.10 | NMT | No | OpenNMT(dbrnn)
22 | NTT | JPCzh-ja | 2016/08/19 08:53:34 | 1199 | 44.99 / 45.84 / 45.02 | NMT | No | Baseline NMT with attention over bidirectional LSTMs (by Harvard NMT)
23 | JAPIO | JPCzh-ja | 2016/08/19 08:26:57 | 1192 | 44.32 / 45.12 / 44.09 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
24 | JAPIO | JPCzh-ja | 2016/08/18 14:15:46 | 1180 | 43.87 / 44.47 / 43.66 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
25 | NTT | JPCzh-ja | 2016/08/19 08:55:20 | 1200 | 43.47 / 44.27 / 43.53 | NMT | No | NMT with pre-ordering and attention over bidirectional LSTMs (pre-ordering module is the same as the PBMT submission)
26 | ORGANIZER | JPCzh-ja | 2016/11/16 11:19:58 | 1341 | 42.66 / 43.76 / 42.95 | NMT | Yes | Online A (2016/11/14)
27 | NICT-2 | JPCzh-ja | 2016/08/05 18:06:47 | 1100 | 41.87 / 42.39 / 42.13 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
28 | TOSHIBA | JPCzh-ja | 2015/07/23 14:43:30 | 504 | 41.82 / 41.90 / 41.60 | SMT and RBMT | Yes | Combination of phrase-based SMT and SPE systems.
29 | Kyoto-U | JPCzh-ja | 2015/09/02 09:25:04 | 864 | 41.35 / 41.92 / 41.16 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking (character-based model only)
30 | TOSHIBA | JPCzh-ja | 2015/07/28 16:30:41 | 526 | 41.12 / 40.87 / 40.59 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system
31 | NICT-2 | JPCzh-ja | 2016/08/04 17:34:38 | 1079 | 41.09 / 41.27 / 41.24 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
32 | EHR | JPCzh-ja | 2015/08/17 14:05:20 | 671 | 41.06 / 42.24 / 41.15 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering. Candidate selection by language model score.
33 | EHR | JPCzh-ja | 2016/07/18 15:33:03 | 1009 | 41.05 / 41.05 / 40.52 | SMT and RBMT | Yes | Combination of word-based PBSMT, character-based PBSMT and RBMT+PBSPE with DL=6.
34 | EHR | JPCzh-ja | 2016/07/18 15:25:53 | 1007 | 40.95 / 41.20 / 40.51 | SMT | Yes | Combination of word-based PBSMT and character-based PBSMT with DL=6.
35 | NTT | JPCzh-ja | 2016/08/19 08:28:00 | 1193 | 40.75 / 41.05 / 40.68 | SMT | No | PBMT with pre-ordering on dependency structures
36 | EHR | JPCzh-ja | 2015/08/30 15:22:25 | 830 | 40.70 / 41.49 / 40.79 | SMT | Yes | Phrase-based SMT with preordering
37 | NTT | JPCzh-ja | 2015/08/21 08:07:18 | 736 | 40.60 / 41.10 / 40.63 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing and phrase table smoothing.
38 | EHR | JPCzh-ja | 2015/08/30 12:42:52 | 828 | 40.35 / 40.16 / 39.92 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE
39 | NTT | JPCzh-ja | 2015/08/28 09:53:24 | 811 | 39.77 / 40.08 / 39.88 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing, learning-based pre-ordering, and phrase table smoothing.
40 | ORGANIZER | JPCzh-ja | 2015/05/14 18:00:16 | 432 | 39.39 / 39.90 / 39.39 | SMT | No | Tree-to-String SMT (2015)
41 | bjtu_nlp | JPCzh-ja | 2016/08/09 18:44:56 | 1128 | 39.34 / 39.72 / 39.30 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
42 | JAPIO | JPCzh-ja | 2016/10/27 13:01:42 | 1329 | 39.29 / 40.67 / 39.51 | SMT | No | Phrase-based SMT with Preordering
43 | ORGANIZER | JPCzh-ja | 2015/05/14 17:55:51 | 430 | 39.22 / 39.52 / 39.14 | SMT | No | Hierarchical Phrase-based SMT
44 | ORGANIZER | JPCzh-ja | 2016/07/15 11:22:35 | 998 | 39.07 / 39.45 / 38.95 | SMT | No | Tree-to-String SMT (2016)
45 | NTT | JPCzh-ja | 2016/08/19 08:26:18 | 1191 | 39.03 / 39.17 / 38.99 | SMT | No | Baseline PBMT (Moses)
46 | Sense | JPCzh-ja | 2016/08/29 01:06:19 | 1281 | 38.90 / 38.58 / 38.65 | SMT | No | Clustercat-C10-PBMT
47 | u-tkb | JPCzh-ja | 2017/07/26 12:44:18 | 1468 | 38.79 / 40.47 / 38.99 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
48 | Sense | JPCzh-ja | 2016/08/29 09:55:33 | 1284 | 38.75 / 38.32 / 38.38 | SMT | No | Baseline-C10-PBMT
49 | Sense | JPCzh-ja | 2016/08/29 23:08:29 | 1292 | 38.71 / 38.35 / 38.38 | SMT | No | Baseline-C50-PBMT
50 | Sense | JPCzh-ja | 2016/08/30 07:37:39 | 1294 | 38.71 / 38.51 / 38.54 | SMT | No | Clustercat-C50-PBMT
51 | ORGANIZER | JPCzh-ja | 2015/05/14 17:58:14 | 431 | 38.34 / 38.51 / 38.22 | SMT | No | Phrase-based SMT
52 | Kyoto-U | JPCzh-ja | 2015/08/26 13:10:44 | 781 | 37.87 / 38.62 / 37.71 | EBMT | No | Baseline w/o reranking
53 | WASUIPS | JPCzh-ja | 2015/09/01 14:16:16 | 853 | 33.48 / 34.55 / 33.55 | SMT | No | Combining sampling-based alignment and bilingual hierarchical sub-sentential alignment methods.
54 | WASUIPS | JPCzh-ja | 2016/10/12 21:06:36 | 1326 | 31.00 / 31.63 / 30.86 | SMT | No | Our improved system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 33.61. Using bilingual term extraction and re-tokenization for Chinese–Japanese.
55 | WASUIPS | JPCzh-ja | 2016/10/12 21:04:52 | 1325 | 29.38 / 30.82 / 29.66 | SMT | No | Our baseline system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 32.29.
56 | TOSHIBA | JPCzh-ja | 2015/08/17 11:53:34 | 667 | 28.06 / 27.44 / 27.56 | RBMT | Yes | RBMT
57 | ORGANIZER | JPCzh-ja | 2016/07/26 11:18:45 | 1040 | 26.99 / 27.91 / 27.02 | Other | Yes | Online A (2016)
58 | ORGANIZER | JPCzh-ja | 2015/08/14 16:52:02 | 647 | 26.80 / 27.81 / 26.89 | Other | Yes | Online A (2015)
59 | EHR | JPCzh-ja | 2018/05/04 14:17:39 | 1803 | 15.77 / 16.12 / 15.53 | RBMT | Yes | RBMT system from the WAT2015 submission
60 | ORGANIZER | JPCzh-ja | 2015/08/14 16:55:19 | 648 | 12.33 / 12.72 / 12.44 | Other | Yes | Online B (2015)
61 | ORGANIZER | JPCzh-ja | 2015/08/25 11:42:02 | 759 | 10.49 / 10.72 / 10.35 | RBMT | No | RBMT A (2015)
62 | ORGANIZER | JPCzh-ja | 2015/08/25 11:53:50 | 760 | 7.94 / 8.07 / 7.73 | RBMT | No | RBMT B
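
The three BLEU figures per entry come from scoring the same output after three different Japanese segmentations. As a rough illustration of that recipe (not the official WAT scoring pipeline), the sketch below uses the sacrebleu package; the segment() helper is a hypothetical stand-in that splits into characters, where a real run would invoke juman, kytea, or mecab.

    import sacrebleu  # pip install sacrebleu

    def segment(text, tool):
        # Placeholder segmenter: splits into characters. A real run would
        # call the external tool named by `tool` (juman, kytea, or mecab)
        # and return its tokens joined by spaces.
        return " ".join(text)

    hypotheses = ["特許文献1に記載された装置"]  # system outputs
    references = ["特許文献1に記載の装置"]      # reference translations

    for tool in ("juman", "kytea", "mecab"):
        hyp = [segment(h, tool) for h in hypotheses]
        refs = [[segment(r, tool) for r in references]]
        # tokenize="none" because the text is already segmented above
        bleu = sacrebleu.corpus_bleu(hyp, refs, tokenize="none")
        print(f"{tool}: {bleu.score:.2f}")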


RIBES


The three RIBES figures per entry correspond to the Japanese output segmented with juman, kytea, and mecab, respectively; the placeholder tokenizer columns of the original page (empty or zero for every entry) are omitted here.

# | Team | Task | Date/Time | DataID | RIBES (juman / kytea / mecab) | Method | Other Resources | System Description
1 | Bering Lab | JPCzh-ja | 2021/05/04 06:53:17 | 6171 | 0.883744 / 0.881710 / 0.883204 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
2 | KNU_Hyundai | JPCzh-ja | 2019/07/27 08:29:23 | 3153 | 0.880251 / 0.879452 / 0.879471 | NMT | Yes | Transformer(base) + *Used ASPEC corpus* with relative position, bt, multi source, r2l rerank, 5-model ensemble
3 | sakura | JPCzh-ja | 2024/08/09 00:24:08 | 7296 | 0.878851 / 0.876452 / 0.878847 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh)
4 | sakura | JPCzh-ja | 2024/08/08 19:35:56 | 7264 | 0.877180 / 0.875374 / 0.876600 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best
5 | JAPIO | JPCzh-ja | 2017/07/26 14:21:18 | 1484 | 0.875398 / 0.873390 / 0.874822 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
6 | JAPIO | JPCzh-ja | 2017/07/26 14:09:22 | 1482 | 0.872625 / 0.870537 / 0.872038 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
7 | ryan | JPCzh-ja | 2019/07/25 22:12:26 | 2954 | 0.869890 / 0.867919 / 0.868936 | NMT | No | Base Transformer
8 | sarah | JPCzh-ja | 2019/07/26 11:32:28 | 2976 | 0.869159 / 0.867133 / 0.868496 | NMT | No | Transformer, ensemble of 4 models
9 | tpt_wat | JPCzh-ja | 2021/04/27 01:42:09 | 5690 | 0.869034 / 0.867678 / 0.868716 | NMT | No | Base Transformer model with shared vocab 8k size
10 | JAPIO | JPCzh-ja | 2016/08/17 11:48:56 | 1161 | 0.868027 / 0.864893 / 0.866692 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset
11 | USTC | JPCzh-ja | 2018/08/31 17:24:35 | 2206 | 0.866232 / 0.864284 / 0.865423 | NMT | No | tensor2tensor, 4 model average, r2l rerank
12 | goku20 | JPCzh-ja | 2020/09/21 11:54:22 | 4087 | 0.865852 / 0.863774 / 0.865703 | NMT | No | mBART pre-training transformer, single model
13 | goku20 | JPCzh-ja | 2020/09/22 00:04:26 | 4105 | 0.865111 / 0.863206 / 0.864862 | NMT | No | mBART pre-training transformer, ensemble of 3 models
14 | JAPIO | JPCzh-ja | 2017/07/25 18:26:52 | 1458 | 0.859883 / 0.857056 / 0.859411 | NMT | No | OpenNMT(dbrnn)
15 | EHR | JPCzh-ja | 2017/07/19 20:41:27 | 1414 | 0.859619 / 0.856784 / 0.858353 | NMT | No | SMT reranked NMT (character based, by Moses and OpenNMT)
16 | EHR | JPCzh-ja | 2017/07/19 19:28:31 | 1408 | 0.859070 / 0.856376 / 0.858888 | NMT | No | SMT reranked NMT (word based, by Moses and OpenNMT)
17 | EHR | JPCzh-ja | 2017/07/19 20:45:00 | 1415 | 0.858591 / 0.855917 / 0.858511 | NMT | No | Simple NMT (word based, by OpenNMT)
18 | EHR | JPCzh-ja | 2018/08/31 18:51:15 | 2210 | 0.858259 / 0.855649 / 0.858142 | NMT | No | SMT reranked NMT
19 | ORGANIZER | JPCzh-ja | 2018/08/15 18:29:31 | 1963 | 0.857318 / 0.855085 / 0.856442 | NMT | No | NMT with Attention
20 | EHR | JPCzh-ja | 2017/07/19 19:35:03 | 1409 | 0.854447 / 0.852615 / 0.853226 | NMT | No | Simple NMT (character based, by OpenNMT)
21 | NTT | JPCzh-ja | 2016/08/19 08:53:34 | 1199 | 0.853004 / 0.851859 / 0.852430 | NMT | No | Baseline NMT with attention over bidirectional LSTMs (by Harvard NMT)
22 | JAPIO | JPCzh-ja | 2017/07/25 12:22:07 | 1447 | 0.847793 / 0.843774 / 0.846081 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
23 | ORGANIZER | JPCzh-ja | 2016/11/16 11:19:58 | 1341 | 0.845858 / 0.844918 / 0.845794 | NMT | Yes | Online A (2016/11/14)
24 | NTT | JPCzh-ja | 2016/08/19 08:55:20 | 1200 | 0.845271 / 0.843105 / 0.844968 | NMT | No | NMT with pre-ordering and attention over bidirectional LSTMs (pre-ordering module is the same as the PBMT submission)
25 | bjtu_nlp | JPCzh-ja | 2016/08/09 18:44:56 | 1128 | 0.835314 / 0.830505 / 0.833216 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
26 | JAPIO | JPCzh-ja | 2016/08/19 08:26:57 | 1192 | 0.834959 / 0.830164 / 0.832955 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
27 | JAPIO | JPCzh-ja | 2016/08/18 14:15:46 | 1180 | 0.833586 / 0.829360 / 0.831534 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
28 | u-tkb | JPCzh-ja | 2017/07/26 12:44:18 | 1468 | 0.832144 / 0.833610 / 0.831209 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
29 | NICT-2 | JPCzh-ja | 2016/08/05 18:06:47 | 1100 | 0.829640 / 0.826744 / 0.828107 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
30 | Kyoto-U | JPCzh-ja | 2015/09/02 09:25:04 | 864 | 0.828543 / 0.824199 / 0.827230 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking (character-based model only)
31 | EHR | JPCzh-ja | 2016/07/18 15:25:53 | 1007 | 0.828040 / 0.824502 / 0.826864 | SMT | Yes | Combination of word-based PBSMT and character-based PBSMT with DL=6.
32 | EHR | JPCzh-ja | 2016/07/18 15:33:03 | 1009 | 0.827048 / 0.821940 / 0.824852 | SMT and RBMT | Yes | Combination of word-based PBSMT, character-based PBSMT and RBMT+PBSPE with DL=6.
33 | NICT-2 | JPCzh-ja | 2016/08/04 17:34:38 | 1079 | 0.827009 / 0.822664 / 0.825323 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
34 | EHR | JPCzh-ja | 2015/08/17 14:05:20 | 671 | 0.826987 / 0.821983 / 0.825056 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering. Candidate selection by language model score.
35 | NTT | JPCzh-ja | 2016/08/19 08:28:00 | 1193 | 0.825985 / 0.822125 / 0.824840 | SMT | No | PBMT with pre-ordering on dependency structures
36 | EHR | JPCzh-ja | 2015/08/30 15:22:25 | 830 | 0.824264 / 0.821055 / 0.823192 | SMT | Yes | Phrase-based SMT with preordering
37 | NTT | JPCzh-ja | 2015/08/21 08:07:18 | 736 | 0.823436 / 0.820252 / 0.822026 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing and phrase table smoothing.
38 | TOSHIBA | JPCzh-ja | 2015/07/28 16:30:41 | 526 | 0.822268 / 0.814249 / 0.818981 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system
39 | TOSHIBA | JPCzh-ja | 2015/07/23 14:43:30 | 504 | 0.820568 / 0.813536 / 0.817614 | SMT and RBMT | Yes | Combination of phrase-based SMT and SPE systems.
40 | JAPIO | JPCzh-ja | 2016/10/27 13:01:42 | 1329 | 0.820339 / 0.817352 / 0.819850 | SMT | No | Phrase-based SMT with Preordering
41 | EHR | JPCzh-ja | 2015/08/30 12:42:52 | 828 | 0.819516 / 0.812982 / 0.816743 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE
42 | NTT | JPCzh-ja | 2015/08/28 09:53:24 | 811 | 0.816288 / 0.811911 / 0.815543 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing, learning-based pre-ordering, and phrase table smoothing.
43 | ORGANIZER | JPCzh-ja | 2015/05/14 18:00:16 | 432 | 0.814919 / 0.811350 / 0.813595 | SMT | No | Tree-to-String SMT (2015)
44 | ORGANIZER | JPCzh-ja | 2016/07/15 11:22:35 | 998 | 0.813135 / 0.809893 / 0.811644 | SMT | No | Tree-to-String SMT (2016)
45 | ORGANIZER | JPCzh-ja | 2015/05/14 17:55:51 | 430 | 0.806058 / 0.802059 / 0.804523 | SMT | No | Hierarchical Phrase-based SMT
46 | NTT | JPCzh-ja | 2016/08/19 08:26:18 | 1191 | 0.805702 / 0.797991 / 0.802998 | SMT | No | Baseline PBMT (Moses)
47 | Sense | JPCzh-ja | 2016/08/29 09:55:33 | 1284 | 0.804673 / 0.795449 / 0.801496 | SMT | No | Baseline-C10-PBMT
48 | Sense | JPCzh-ja | 2016/08/30 07:37:39 | 1294 | 0.804301 / 0.796349 / 0.801596 | SMT | No | Clustercat-C50-PBMT
49 | Sense | JPCzh-ja | 2016/08/29 01:06:19 | 1281 | 0.803155 / 0.794679 / 0.800689 | SMT | No | Clustercat-C10-PBMT
50 | Sense | JPCzh-ja | 2016/08/29 23:08:29 | 1292 | 0.802673 / 0.794127 / 0.799531 | SMT | No | Baseline-C50-PBMT
51 | Kyoto-U | JPCzh-ja | 2015/08/26 13:10:44 | 781 | 0.799730 / 0.797700 / 0.798979 | EBMT | No | Baseline w/o reranking
52 | ORGANIZER | JPCzh-ja | 2015/05/14 17:58:14 | 431 | 0.782019 / 0.778921 / 0.781456 | SMT | No | Phrase-based SMT
53 | WASUIPS | JPCzh-ja | 2015/09/01 14:16:16 | 853 | 0.773985 / 0.771099 / 0.772202 | SMT | No | Combining sampling-based alignment and bilingual hierarchical sub-sentential alignment methods.
54 | TOSHIBA | JPCzh-ja | 2015/08/17 11:53:34 | 667 | 0.772054 / 0.758756 / 0.767076 | RBMT | Yes | RBMT
55 | WASUIPS | JPCzh-ja | 2016/10/12 21:06:36 | 1326 | 0.754626 / 0.751804 / 0.753376 | SMT | No | Our improved system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 33.61. Using bilingual term extraction and re-tokenization for Chinese–Japanese.
56 | WASUIPS | JPCzh-ja | 2016/10/12 21:04:52 | 1325 | 0.751847 / 0.748474 / 0.750678 | SMT | No | Our baseline system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 32.29.
57 | EHR | JPCzh-ja | 2018/05/04 14:17:39 | 1803 | 0.721715 / 0.710085 / 0.716303 | RBMT | Yes | RBMT system from the WAT2015 submission
58 | ORGANIZER | JPCzh-ja | 2015/08/14 16:52:02 | 647 | 0.712242 / 0.707264 / 0.711273 | Other | Yes | Online A (2015)
59 | ORGANIZER | JPCzh-ja | 2016/07/26 11:18:45 | 1040 | 0.707739 / 0.702718 / 0.706707 | Other | Yes | Online A (2016)
60 | ORGANIZER | JPCzh-ja | 2015/08/25 11:42:02 | 759 | 0.674060 / 0.664098 / 0.667349 | RBMT | No | RBMT A (2015)
61 | ORGANIZER | JPCzh-ja | 2015/08/14 16:55:19 | 648 | 0.648996 / 0.641255 / 0.648742 | Other | Yes | Online B (2015)
62 | ORGANIZER | JPCzh-ja | 2015/08/25 11:53:50 | 760 | 0.596200 / 0.581837 / 0.586941 | RBMT | No | RBMT B
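
RIBES (Rank-based Intuitive Bilingual Evaluation Score) rewards getting the global word order right: it takes a normalized Kendall's tau over the positions of aligned words and discounts it by unigram precision and a brevity penalty. The sketch below is a simplified single-sentence version, assuming the commonly cited defaults alpha=0.25 and beta=0.10 and aligning only words that occur exactly once on both sides; the official implementation also disambiguates repeated words by context, so treat this as illustrative.

    import math
    from itertools import combinations

    def ribes(hyp, ref, alpha=0.25, beta=0.10):
        # Align hypothesis words to reference positions. Simplification:
        # only words occurring exactly once on both sides are aligned.
        pos = [ref.index(w) for w in hyp
               if hyp.count(w) == 1 and ref.count(w) == 1]
        n = len(pos)
        if n < 2:
            return 0.0
        # Normalized Kendall's tau: share of aligned word pairs whose
        # relative order matches the reference.
        pairs = n * (n - 1) // 2
        concordant = sum(p < q for p, q in combinations(pos, 2))
        nkt = concordant / pairs
        precision = n / len(hyp)                          # unigram precision
        bp = min(1.0, math.exp(1 - len(ref) / len(hyp)))  # brevity penalty
        return nkt * precision ** alpha * bp ** beta

    # Toy usage on whitespace-tokenized English stand-ins:
    print(ribes("the patent document is cited".split(),
                "the cited patent document".split()))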


AMFM


The original page repeated each AMFM value across three score columns (plus empty placeholder columns); the value is shown once here.

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | Bering Lab | JPCzh-ja | 2021/05/04 06:53:17 | 6171 | 0.896405 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
2 | tpt_wat | JPCzh-ja | 2021/04/27 01:42:09 | 5690 | 0.886918 | NMT | No | Base Transformer model with shared vocab 8k size
3 | JAPIO | JPCzh-ja | 2016/08/17 11:48:56 | 1161 | 0.808090 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset
4 | JAPIO | JPCzh-ja | 2017/07/26 14:21:18 | 1484 | 0.779420 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
5 | JAPIO | JPCzh-ja | 2017/07/26 14:09:22 | 1482 | 0.777460 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
6 | JAPIO | JPCzh-ja | 2017/07/25 12:22:07 | 1447 | 0.774660 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
7 | USTC | JPCzh-ja | 2018/08/31 17:24:35 | 2206 | 0.771310 | NMT | No | tensor2tensor, 4 model average, r2l rerank
8 | EHR | JPCzh-ja | 2018/08/31 18:51:15 | 2210 | 0.764670 | NMT | No | SMT reranked NMT
9 | ORGANIZER | JPCzh-ja | 2018/08/15 18:29:31 | 1963 | 0.761820 | NMT | No | NMT with Attention
10 | EHR | JPCzh-ja | 2017/07/19 20:41:27 | 1414 | 0.761370 | NMT | No | SMT reranked NMT (character based, by Moses and OpenNMT)
11 | EHR | JPCzh-ja | 2017/07/19 19:35:03 | 1409 | 0.757130 | NMT | No | Simple NMT (character based, by OpenNMT)
12 | EHR | JPCzh-ja | 2017/07/19 19:28:31 | 1408 | 0.756350 | NMT | No | SMT reranked NMT (word based, by Moses and OpenNMT)
13 | EHR | JPCzh-ja | 2017/07/19 20:45:00 | 1415 | 0.755900 | NMT | No | Simple NMT (word based, by OpenNMT)
14 | JAPIO | JPCzh-ja | 2017/07/25 18:26:52 | 1458 | 0.754970 | NMT | No | OpenNMT(dbrnn)
15 | NTT | JPCzh-ja | 2016/08/19 08:53:34 | 1199 | 0.752200 | NMT | No | Baseline NMT with attention over bidirectional LSTMs (by Harvard NMT)
16 | JAPIO | JPCzh-ja | 2016/08/19 08:26:57 | 1192 | 0.751200 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
17 | NTT | JPCzh-ja | 2016/08/19 08:55:20 | 1200 | 0.749270 | NMT | No | NMT with pre-ordering and attention over bidirectional LSTMs (pre-ordering module is the same as the PBMT submission)
18 | JAPIO | JPCzh-ja | 2016/08/18 14:15:46 | 1180 | 0.748330 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
19 | ORGANIZER | JPCzh-ja | 2016/11/16 11:19:58 | 1341 | 0.747240 | NMT | Yes | Online A (2016/11/14)
20 | EHR | JPCzh-ja | 2016/07/18 15:25:53 | 1007 | 0.745080 | SMT | Yes | Combination of word-based PBSMT and character-based PBSMT with DL=6.
21 | Kyoto-U | JPCzh-ja | 2015/09/02 09:25:04 | 864 | 0.744190 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking (character-based model only)
22 | TOSHIBA | JPCzh-ja | 2015/07/28 16:30:41 | 526 | 0.741990 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system
23 | TOSHIBA | JPCzh-ja | 2015/07/23 14:43:30 | 504 | 0.740180 | SMT and RBMT | Yes | Combination of phrase-based SMT and SPE systems.
24 | NICT-2 | JPCzh-ja | 2016/08/05 18:06:47 | 1100 | 0.739890 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
25 | EHR | JPCzh-ja | 2016/07/18 15:33:03 | 1009 | 0.735010 | SMT and RBMT | Yes | Combination of word-based PBSMT, character-based PBSMT and RBMT+PBSPE with DL=6.
26 | JAPIO | JPCzh-ja | 2016/10/27 13:01:42 | 1329 | 0.733300 | SMT | No | Phrase-based SMT with Preordering
27 | NICT-2 | JPCzh-ja | 2016/08/04 17:34:38 | 1079 | 0.733020 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
28 | NTT | JPCzh-ja | 2015/08/21 08:07:18 | 736 | 0.732450 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing and phrase table smoothing.
29 | Kyoto-U | JPCzh-ja | 2015/08/26 13:10:44 | 781 | 0.731420 | EBMT | No | Baseline w/o reranking
30 | NTT | JPCzh-ja | 2016/08/19 08:28:00 | 1193 | 0.730190 | SMT | No | PBMT with pre-ordering on dependency structures
31 | u-tkb | JPCzh-ja | 2017/07/26 12:44:18 | 1468 | 0.729580 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
32 | ORGANIZER | JPCzh-ja | 2015/05/14 17:55:51 | 430 | 0.729370 | SMT | No | Hierarchical Phrase-based SMT
33 | ORGANIZER | JPCzh-ja | 2016/07/15 11:22:35 | 998 | 0.728520 | SMT | No | Tree-to-String SMT (2016)
34 | ORGANIZER | JPCzh-ja | 2015/05/14 18:00:16 | 432 | 0.725920 | SMT | No | Tree-to-String SMT (2015)
35 | NTT | JPCzh-ja | 2015/08/28 09:53:24 | 811 | 0.723200 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing, learning-based pre-ordering, and phrase table smoothing.
36 | ORGANIZER | JPCzh-ja | 2015/05/14 17:58:14 | 431 | 0.723110 | SMT | No | Phrase-based SMT
37 | bjtu_nlp | JPCzh-ja | 2016/08/09 18:44:56 | 1128 | 0.721460 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
38 | EHR | JPCzh-ja | 2015/08/17 14:05:20 | 671 | 0.721400 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering. Candidate selection by language model score.
39 | NTT | JPCzh-ja | 2016/08/19 08:26:18 | 1191 | 0.720260 | SMT | No | Baseline PBMT (Moses)
40 | Sense | JPCzh-ja | 2016/08/29 23:08:29 | 1292 | 0.719330 | SMT | No | Baseline-C50-PBMT
41 | Sense | JPCzh-ja | 2016/08/29 01:06:19 | 1281 | 0.718590 | SMT | No | Clustercat-C10-PBMT
42 | Sense | JPCzh-ja | 2016/08/29 09:55:33 | 1284 | 0.715450 | SMT | No | Baseline-C10-PBMT
43 | Sense | JPCzh-ja | 2016/08/30 07:37:39 | 1294 | 0.715290 | SMT | No | Clustercat-C50-PBMT
44 | WASUIPS | JPCzh-ja | 2015/09/01 14:16:16 | 853 | 0.709700 | SMT | No | Combining sampling-based alignment and bilingual hierarchical sub-sentential alignment methods.
45 | EHR | JPCzh-ja | 2015/08/30 15:22:25 | 830 | 0.706550 | SMT | Yes | Phrase-based SMT with preordering
46 | EHR | JPCzh-ja | 2015/08/30 12:42:52 | 828 | 0.701880 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE
47 | ORGANIZER | JPCzh-ja | 2015/08/14 16:52:02 | 647 | 0.693840 | Other | Yes | Online A (2015)
48 | ORGANIZER | JPCzh-ja | 2016/07/26 11:18:45 | 1040 | 0.693720 | Other | Yes | Online A (2016)
49 | WASUIPS | JPCzh-ja | 2016/10/12 21:06:36 | 1326 | 0.686030 | SMT | No | Our improved system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 33.61. Using bilingual term extraction and re-tokenization for Chinese–Japanese.
50 | WASUIPS | JPCzh-ja | 2016/10/12 21:04:52 | 1325 | 0.679110 | SMT | No | Our baseline system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 32.29.
51 | TOSHIBA | JPCzh-ja | 2015/08/17 11:53:34 | 667 | 0.668780 | RBMT | Yes | RBMT
52 | EHR | JPCzh-ja | 2018/05/04 14:17:39 | 1803 | 0.623300 | RBMT | Yes | RBMT system from the WAT2015 submission
53 | ORGANIZER | JPCzh-ja | 2015/08/14 16:55:19 | 648 | 0.588380 | Other | Yes | Online B (2015)
54 | ORGANIZER | JPCzh-ja | 2015/08/25 11:42:02 | 759 | 0.557130 | RBMT | No | RBMT A (2015)
55 | ORGANIZER | JPCzh-ja | 2015/08/25 11:53:50 | 760 | 0.502100 | RBMT | No | RBMT B
56 | ryan | JPCzh-ja | 2019/07/25 22:12:26 | 2954 | 0.000000 | NMT | No | Base Transformer
57 | sarah | JPCzh-ja | 2019/07/26 11:32:28 | 2976 | 0.000000 | NMT | No | Transformer, ensemble of 4 models
58 | KNU_Hyundai | JPCzh-ja | 2019/07/27 08:29:23 | 3153 | 0.000000 | NMT | Yes | Transformer(base) + *Used ASPEC corpus* with relative position, bt, multi source, r2l rerank, 5-model ensemble
59 | goku20 | JPCzh-ja | 2020/09/21 11:54:22 | 4087 | 0.000000 | NMT | No | mBART pre-training transformer, single model
60 | goku20 | JPCzh-ja | 2020/09/22 00:04:26 | 4105 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models
61 | sakura | JPCzh-ja | 2024/08/08 19:35:56 | 7264 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best
62 | sakura | JPCzh-ja | 2024/08/09 00:24:08 | 7296 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh)
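
AM-FM scores adequacy and fluency separately: an adequacy component (AM) compares the source and the translation in a shared cross-lingual semantic space, and a fluency component (FM) scores the translation with a target-side language model, after which the two are combined. The sketch below shows only the shape of that combination; the vectors, the FM value, and the 0.5 weight are illustrative placeholders, not WAT's actual models or settings.

    import math

    def cosine(u, v):
        # Cosine similarity between two equal-length vectors.
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0

    def am_fm(src_vec, hyp_vec, fm_score, weight=0.5):
        # AM (adequacy): similarity of the source and translation embeddings
        # in a shared cross-lingual space (assumed precomputed).
        am = cosine(src_vec, hyp_vec)
        # FM (fluency): a language-model score assumed normalized to [0, 1].
        return weight * am + (1 - weight) * fm_score

    # Toy usage with made-up vectors and an invented FM score:
    print(am_fm([0.2, 0.7, 0.1], [0.3, 0.6, 0.2], fm_score=0.8))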


HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | sarah | JPCzh-ja | 2019/07/26 11:32:28 | 2976 | Underway | NMT | No | Transformer, ensemble of 4 models
2 | KNU_Hyundai | JPCzh-ja | 2019/07/27 08:29:23 | 3153 | Underway | NMT | Yes | Transformer(base) + *Used ASPEC corpus* with relative position, bt, multi source, r2l rerank, 5-model ensemble


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | JAPIO | JPCzh-ja | 2017/07/26 14:21:18 | 1484 | 80.250 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
2 | EHR | JPCzh-ja | 2017/07/19 20:41:27 | 1414 | 69.750 | NMT | No | SMT reranked NMT (character based, by Moses and OpenNMT)
3 | EHR | JPCzh-ja | 2017/07/19 19:28:31 | 1408 | 68.250 | NMT | No | SMT reranked NMT (word based, by Moses and OpenNMT)
4 | JAPIO | JPCzh-ja | 2017/07/25 12:22:07 | 1447 | 60.500 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
5 | u-tkb | JPCzh-ja | 2017/07/26 12:44:18 | 1468 | 55.500 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
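
The HUMAN column in these tables comes from WAT's pairwise crowdsourced evaluation, in which each system's output is compared sentence by sentence against a baseline translation and judged win, lose, or tie; negative scores mean the system lost to the baseline more often than it won. A minimal sketch of the aggregation, assuming the 100 × (W − L) / (W + L + T) formula described in the WAT overview papers, is below; the counts in the usage example are invented.

    def pairwise_score(wins, losses, ties):
        # HUMAN pairwise score on a -100..100 scale, assuming the formula
        # 100 * (W - L) / (W + L + T) from the WAT overview papers.
        total = wins + losses + ties
        return 100.0 * (wins - losses) / total if total else 0.0

    # Toy usage: 120 wins, 40 losses, 40 ties over 200 judged sentences
    print(pairwise_score(120, 40, 40))  # -> 40.0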


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | ORGANIZER | JPCzh-ja | 2016/11/16 11:19:58 | 1341 | 54.250 | NMT | Yes | Online A (2016/11/14)
2 | NTT | JPCzh-ja | 2016/08/19 08:55:20 | 1200 | 46.500 | NMT | No | NMT with pre-ordering and attention over bidirectional LSTMs (pre-ordering module is the same as the PBMT submission)
3 | JAPIO | JPCzh-ja | 2016/08/19 08:26:57 | 1192 | 46.250 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
4 | JAPIO | JPCzh-ja | 2016/08/18 14:15:46 | 1180 | 43.500 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
5 | NICT-2 | JPCzh-ja | 2016/08/05 18:06:47 | 1100 | 43.250 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
6 | NTT | JPCzh-ja | 2016/08/19 08:28:00 | 1193 | 39.250 | SMT | No | PBMT with pre-ordering on dependency structures
7 | EHR | JPCzh-ja | 2016/07/18 15:25:53 | 1007 | 39.000 | SMT | Yes | Combination of word-based PBSMT and character-based PBSMT with DL=6.
8 | NICT-2 | JPCzh-ja | 2016/08/04 17:34:38 | 1079 | 36.750 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
9 | EHR | JPCzh-ja | 2016/07/18 15:33:03 | 1009 | 35.500 | SMT and RBMT | Yes | Combination of word-based PBSMT, character-based PBSMT and RBMT+PBSPE with DL=6.
10 | bjtu_nlp | JPCzh-ja | 2016/08/09 18:44:56 | 1128 | 32.250 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
11 | ORGANIZER | JPCzh-ja | 2016/07/26 11:18:45 | 1040 | -19.750 | Other | Yes | Online A (2016)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | JPCzh-ja | 2015/09/02 09:25:04 | 864 | 27.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking (character-based model only)
2 | TOSHIBA | JPCzh-ja | 2015/07/28 16:30:41 | 526 | 24.250 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system
3 | EHR | JPCzh-ja | 2015/08/17 14:05:20 | 671 | 22.000 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering. Candidate selection by language model score.
4 | ORGANIZER | JPCzh-ja | 2015/05/14 18:00:16 | 432 | 20.750 | SMT | No | Tree-to-String SMT (2015)
5 | NTT | JPCzh-ja | 2015/08/21 08:07:18 | 736 | 16.250 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing and phrase table smoothing.
6 | TOSHIBA | JPCzh-ja | 2015/07/23 14:43:30 | 504 | 14.500 | SMT and RBMT | Yes | Combination of phrase-based SMT and SPE systems.
7 | Kyoto-U | JPCzh-ja | 2015/08/26 13:10:44 | 781 | 14.500 | EBMT | No | Baseline w/o reranking
8 | EHR | JPCzh-ja | 2015/08/30 12:42:52 | 828 | 8.250 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE
9 | NTT | JPCzh-ja | 2015/08/28 09:53:24 | 811 | 8.000 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing, learning-based pre-ordering, and phrase table smoothing.
10 | ORGANIZER | JPCzh-ja | 2015/08/14 16:52:02 | 647 | -7.000 | Other | Yes | Online A (2015)
11 | WASUIPS | JPCzh-ja | 2015/09/01 14:16:16 | 853 | -12.000 | SMT | No | Combining sampling-based alignment and bilingual hierarchical sub-sentential alignment methods.
12 | ORGANIZER | JPCzh-ja | 2015/08/25 11:42:02 | 759 | -39.250 | RBMT | No | RBMT A (2015)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,

you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02