
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU
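BLEU measures the overlap between a system translation and a reference translation as the geometric mean of modified (clipped) n-gram precisions for n = 1..4, scaled by a brevity penalty:

    BLEU = BP * exp( (1/4) * sum_{n=1..4} log p_n ),    BP = min(1, exp(1 - r/c))

where p_n is the clipped n-gram precision, c is the total system-output length, and r is the total reference length. Chinese text has no spaces, so each submission is word-segmented before n-grams are counted; the three score columns below correspond to three segmentation schemes (kytea, stanford-segmenter-ctb, stanford-segmenter-pku).

The following is a minimal corpus-level sketch in Python, for illustration only; it is not the official WAT evaluation script, and it assumes the inputs are already segmented into token lists:

    from collections import Counter
    import math

    def ngrams(tokens, n):
        # All n-grams of a token list, as tuples.
        return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

    def corpus_bleu(hyps, refs, max_n=4):
        # Corpus-level BLEU with one reference per segment.
        match = [0] * max_n   # clipped n-gram matches, per order
        total = [0] * max_n   # candidate n-gram counts, per order
        hyp_len = ref_len = 0
        for hyp, ref in zip(hyps, refs):
            hyp_len += len(hyp)
            ref_len += len(ref)
            for n in range(1, max_n + 1):
                h, r = Counter(ngrams(hyp, n)), Counter(ngrams(ref, n))
                match[n - 1] += sum(min(c, r[g]) for g, c in h.items())
                total[n - 1] += max(len(hyp) - n + 1, 0)
        if min(match) == 0:   # any zero precision zeroes the geometric mean
            return 0.0
        log_p = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
        bp = min(1.0, math.exp(1 - ref_len / hyp_len))   # brevity penalty
        return bp * math.exp(log_p)

    # Toy example on pre-segmented tokens; prints 0.8091.
    hyp = [["the", "cat", "sat", "on", "the", "mat", "today"]]
    ref = [["the", "cat", "sat", "on", "the", "mat"]]
    print(round(corpus_bleu(hyp, ref), 4))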


(Scores are shown under three Chinese word segmentations of the system output: kytea, stanford-segmenter-ctb (ctb), and stanford-segmenter-pku (pku). The other tokenizer columns of the original table (juman, mecab, moses-tokenizer, indic-tokenizer, unuse, myseg, kmseg) are unused for this task and omitted. Rows are ranked by the ctb score.)

# | Team | Task | Date/Time | DataID | kytea | ctb | pku | Method | Other Resources | System Description
1 | Kyoto-U+ECNU | ja-zh | 2020/09/17 18:43:01 | 3814 | 38.66 | 38.56 | 38.43 | NMT | Yes | ensemble 8 models: structures(LSTM, Transformer, ConvS2S, Lightconv), training data(BT, out-of-domain parallel), S2S settings(deeper transformer, deep encoder shallow decoder)
2 | Kyoto-U+ECNU | ja-zh | 2020/09/19 16:56:52 | 4053 | 38.52 | 38.43 | 38.30 | NMT | No | without out-of-domain parallel data; others same as DataID:3814
3 | srcb | ja-zh | 2019/07/27 15:34:31 | 3208 | 38.63 | 38.34 | 38.29 | NMT | No | Transformer with num_units=768, relative position, sentence-wise smooth, encoding side word drop, norm-based batch filtering, residual connection norm, ensemble of 8 models.
4 | srcb | ja-zh | 2019/07/25 11:30:58 | 2916 | 37.72 | 37.52 | 37.44 | NMT | No | Transformer with relative position, sentence-wise smooth, encoder side word drop.
5 | srcb | ja-zh | 2018/09/16 14:47:09 | 2473 | 37.60 | 37.34 | 37.35 | NMT | No | Transformer with relative position, ensemble of 10 models.
6 | Kyoto-U+ECNU | ja-zh | 2020/09/10 23:54:57 | 3676 | 36.65 | 36.55 | 36.40 | NMT | No | forward-translation by using ja monolingual data from ASPEC-JE; lightconv (pay less attention) single model without ensemble
7 | KNU_Hyundai | ja-zh | 2019/07/27 08:54:03 | 3170 | 36.40 | 36.46 | 36.29 | NMT | Yes | Transformer(base) + *Used JPC corpus* with relative position, bt, r2l rerank, 4-model ensemble
8 | NICT-5 | ja-zh | 2018/09/10 14:09:18 | 2266 | 35.99 | 35.89 | 35.87 | NMT | No | MLNMT
9 | NICT-5 | ja-zh | 2018/08/27 15:00:25 | 2175 | 35.71 | 35.67 | 35.55 | NMT | No | Transformer vanilla model
10 | Kyoto-U | ja-zh | 2017/08/01 14:17:43 | 1722 | 35.31 | 35.37 | 35.06 | NMT | No | Ensemble of 5 shared BPE, averaged
11 | NICT-5 | ja-zh | 2018/08/22 18:56:02 | 2055 | 35.00 | 35.35 | 34.94 | NMT | No | Multi-layer-softmax for vanilla transformer. Train 6-layer model. Decode only using 3 layers. 2x faster than 6 layers.
12 | srcb | ja-zh | 2018/08/26 11:37:12 | 2153 | 35.55 | 35.32 | 35.28 | NMT | No | Transformer, average checkpoints.
13 | Kyoto-U | ja-zh | 2017/07/31 15:24:48 | 1642 | 35.67 | 35.30 | 35.40 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
14 | NICT-2 | ja-zh | 2017/07/26 14:11:42 | 1483 | 35.23 | 35.23 | 35.14 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
15 | ORGANIZER | ja-zh | 2017/08/02 01:06:05 | 1738 | 34.97 | 34.96 | 34.72 | NMT | No | Google's "Attention Is All You Need"
16 | NICT-2 | ja-zh | 2017/07/26 14:00:26 | 1478 | 33.72 | 33.64 | 33.60 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
17 | ORGANIZER | ja-zh | 2018/08/14 11:39:11 | 1903 | 33.26 | 33.33 | 33.14 | NMT | No | NMT with Attention
18 | Kyoto-U | ja-zh | 2016/08/02 01:25:11 | 1071 | 31.98 | 32.08 | 31.72 | NMT | No | 2 layer lstm dropout 0.5 200k source voc unk replaced
19 | NAIST | ja-zh | 2015/08/31 15:35:36 | 838 | 31.61 | 31.59 | 31.42 | SMT | No | Travatar System with NeuralMT Reranking
20 | Kyoto-U | ja-zh | 2015/08/27 13:51:08 | 793 | 31.40 | 31.26 | 31.23 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
21 | bjtu_nlp | ja-zh | 2016/08/09 14:48:19 | 1120 | 30.57 | 30.49 | 30.31 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
22 | NAIST | ja-zh | 2014/08/01 17:18:51 | 122 | 30.53 | 30.46 | 30.25 | SMT | No | Travatar-based Forest-to-String SMT System
23 | TOSHIBA | ja-zh | 2015/07/23 14:49:40 | 505 | 30.17 | 30.15 | 29.89 | SMT and RBMT | Yes | SPE(Statistical Post Editing) System
24 | TOSHIBA | ja-zh | 2015/08/17 16:29:35 | 676 | 30.07 | 30.14 | 29.83 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model + post-processing
25 | Kyoto-U | ja-zh | 2015/07/31 00:35:46 | 545 | 30.19 | 29.98 | 29.90 | EBMT | No | added one reordering feature, w/ reranking
26 | NICT-2 | ja-zh | 2016/08/05 18:09:19 | 1105 | 30.00 | 29.97 | 29.78 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC)
27 | Kyoto-U | ja-zh | 2015/07/03 11:01:45 | 457 | 30.08 | 29.94 | 29.87 | EBMT | No | Kyoto-U team WAT2015 baseline with reranking
28 | Kyoto-U | ja-zh | 2016/08/05 23:26:20 | 1109 | 30.27 | 29.94 | 29.92 | EBMT | No | KyotoEBMT 2016 w/o reranking
29 | NAIST | ja-zh | 2015/08/31 15:38:17 | 839 | 30.06 | 29.92 | 29.73 | SMT | No | Travatar System Baseline
30 | NAIST | ja-zh | 2014/08/01 17:27:20 | 123 | 29.83 | 29.77 | 29.54 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
31 | Kyoto-U | ja-zh | 2015/08/26 02:17:25 | 778 | 29.99 | 29.76 | 29.81 | EBMT | No | KyotoEBMT system without reranking
32 | Kyoto-U | ja-zh | 2015/07/03 11:09:12 | 458 | 29.18 | 29.00 | 28.94 | EBMT | No | Kyoto-U team WAT2015 baseline
33 | ORGANIZER | ja-zh | 2014/07/11 20:00:28 | 10 | 28.65 | 28.65 | 28.35 | SMT | No | String-to-Tree SMT (2014)
34 | ORGANIZER | ja-zh | 2015/09/10 14:12:41 | 881 | 28.65 | 28.65 | 28.35 | SMT | No | String-to-Tree SMT (2015)
35 | NICT | ja-zh | 2014/09/01 09:23:36 | 260 | 27.98 | 28.18 | 27.84 | SMT | No | Pre-reordering for phrase-based SMT (dependency parsing + manual rules)
36 | ORGANIZER | ja-zh | 2014/07/11 19:50:50 | 7 | 27.96 | 28.01 | 27.68 | SMT | No | Phrase-based SMT
37 | Kyoto-U | ja-zh | 2015/08/25 12:51:38 | 765 | 28.05 | 27.84 | 27.88 | EBMT | No | escaping w/ reranking
38 | ORGANIZER | ja-zh | 2014/07/11 19:45:54 | 3 | 27.71 | 27.70 | 27.35 | SMT | No | Hierarchical Phrase-based SMT (2014)
39 | Kyoto-U | ja-zh | 2014/09/01 08:21:59 | 259 | 27.67 | 27.44 | 27.34 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
40 | Kyoto-U | ja-zh | 2014/08/31 23:38:07 | 257 | 27.21 | 27.02 | 26.83 | EBMT | No | Our new baseline system after several modifications.
41 | TOSHIBA | ja-zh | 2014/08/29 18:06:20 | 238 | 27.42 | 26.82 | 26.79 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
42 | Kyoto-U | ja-zh | 2014/07/14 14:30:39 | 18 | 26.69 | 26.48 | 26.30 | EBMT | No | Our baseline system.
43 | WASUIPS | ja-zh | 2014/09/17 12:07:07 | 390 | 25.63 | 25.30 | 25.18 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
44 | WASUIPS | ja-zh | 2014/09/17 01:11:02 | 377 | 25.60 | 25.10 | 25.07 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1).
45 | WASUIPS | ja-zh | 2014/09/17 10:29:24 | 385 | 25.45 | 25.10 | 25.01 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1).
46 | WASUIPS | ja-zh | 2014/09/17 01:08:33 | 376 | 25.44 | 25.04 | 24.98 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1).
47 | WASUIPS | ja-zh | 2014/09/17 10:32:13 | 386 | 25.68 | 25.01 | 25.11 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1).
48 | WASUIPS | ja-zh | 2014/09/17 12:04:30 | 389 | 25.08 | 24.81 | 24.64 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
49 | WASUIPS | ja-zh | 2014/09/17 00:54:35 | 373 | 24.70 | 24.25 | 24.28 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0).
50 | BJTUNLP | ja-zh | 2014/08/28 20:02:56 | 224 | 24.12 | 23.76 | 23.55 | SMT | No |
51 | Sense | ja-zh | 2014/08/26 15:19:02 | 201 | 23.09 | 22.94 | 23.04 | SMT | No | Character based SMT
52 | TMU | ja-zh | 2017/08/03 01:02:47 | 1743 | 22.92 | 22.86 | 22.74 | NMT | No | JP-CN reconstructor baseline
53 | WASUIPS | ja-zh | 2014/09/17 00:47:46 | 371 | 22.71 | 22.49 | 22.39 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0).
54 | WASUIPS | ja-zh | 2014/09/17 10:17:52 | 382 | 22.20 | 22.02 | 21.91 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0).
55 | WASUIPS | ja-zh | 2014/09/17 10:15:13 | 381 | 22.01 | 21.81 | 21.61 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0).
56 | TOSHIBA | ja-zh | 2014/08/29 17:59:06 | 236 | 19.28 | 18.93 | 18.82 | RBMT | Yes | RBMT system
57 | ORGANIZER | ja-zh | 2014/08/29 18:51:05 | 243 | 17.86 | 17.75 | 17.49 | RBMT | No | RBMT B (2014)
58 | ORGANIZER | ja-zh | 2015/09/10 14:32:38 | 886 | 17.86 | 17.75 | 17.49 | Other | Yes | RBMT B (2015)
59 | ORGANIZER | ja-zh | 2016/11/16 10:58:30 | 1336 | 15.94 | 15.68 | 15.38 | NMT | Yes | Online D (2016/11/14)
60 | ORGANIZER | ja-zh | 2016/07/26 12:18:34 | 1045 | 11.16 | 10.72 | 10.54 | Other | Yes | Online D (2016)
61 | ORGANIZER | ja-zh | 2015/08/25 18:59:20 | 777 | 10.73 | 10.33 | 10.08 | Other | Yes | Online D (2015)
62 | ORGANIZER | ja-zh | 2014/08/29 18:53:46 | 244 | 9.62 | 9.96 | 9.59 | RBMT | No | RBMT C
63 | ORGANIZER | ja-zh | 2014/07/18 11:10:37 | 37 | 9.37 | 8.93 | 8.84 | Other | Yes | Online D (2014)
64 | TMU | ja-zh | 2018/09/19 10:58:57 | 2505 | 7.73 | 7.52 | 7.22 | NMT | Yes | Unsupervised NMT using Sub-character level information. JPO patent data was used as monolingual data in the training process.
65 | ORGANIZER | ja-zh | 2015/09/11 10:11:23 | 891 | 7.44 | 7.05 | 6.75 | Other | Yes | Online C (2015)
66 | ORGANIZER | ja-zh | 2014/08/28 12:11:11 | 216 | 7.26 | 7.01 | 6.72 | Other | Yes | Online C (2014)


RIBES
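RIBES (Rank-based Intuitive Bilingual Evaluation Score; Isozaki et al., 2010) measures how well the word order of the system output correlates with the reference, which is especially informative for distant language pairs. Roughly, matched words are compared by rank and the normalized Kendall's tau of that ranking is multiplied by penalty terms:

    RIBES = NKT * P^alpha * BP^beta,    NKT = (tau + 1) / 2

where P is the unigram precision and BP the brevity penalty; in the reference implementation alpha = 0.25 and beta = 0.10 by default. As with BLEU, the three score columns correspond to the kytea, ctb, and pku segmentations of the output.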


(Scores are shown under three Chinese word segmentations of the system output: kytea, stanford-segmenter-ctb (ctb), and stanford-segmenter-pku (pku); unused tokenizer columns are omitted. Rows are ranked by the ctb score.)

# | Team | Task | Date/Time | DataID | kytea | ctb | pku | Method | Other Resources | System Description
1 | srcb | ja-zh | 2019/07/25 11:30:58 | 2916 | 0.860189 | 0.859237 | 0.859522 | NMT | No | Transformer with relative position, sentence-wise smooth, encoder side word drop.
2 | srcb | ja-zh | 2018/09/16 14:47:09 | 2473 | 0.859132 | 0.858042 | 0.858162 | NMT | No | Transformer with relative position, ensemble of 10 models.
3 | Kyoto-U+ECNU | ja-zh | 2020/09/17 18:43:01 | 3814 | 0.858491 | 0.857645 | 0.858103 | NMT | Yes | ensemble 8 models: structures(LSTM, Transformer, ConvS2S, Lightconv), training data(BT, out-of-domain parallel), S2S settings(deeper transformer, deep encoder shallow decoder)
4 | Kyoto-U+ECNU | ja-zh | 2020/09/19 16:56:52 | 4053 | 0.858229 | 0.857229 | 0.857722 | NMT | No | without out-of-domain parallel data; others same as DataID:3814
5 | srcb | ja-zh | 2019/07/27 15:34:31 | 3208 | 0.858506 | 0.856864 | 0.857121 | NMT | No | Transformer with num_units=768, relative position, sentence-wise smooth, encoding side word drop, norm-based batch filtering, residual connection norm, ensemble of 8 models.
6 | KNU_Hyundai | ja-zh | 2019/07/27 08:54:03 | 3170 | 0.854030 | 0.854085 | 0.854391 | NMT | Yes | Transformer(base) + *Used JPC corpus* with relative position, bt, r2l rerank, 4-model ensemble
7 | Kyoto-U+ECNU | ja-zh | 2020/09/10 23:54:57 | 3676 | 0.853793 | 0.852731 | 0.852979 | NMT | No | forward-translation by using ja monolingual data from ASPEC-JE; lightconv (pay less attention) single model without ensemble
8 | NICT-2 | ja-zh | 2017/07/26 14:11:42 | 1483 | 0.852084 | 0.851893 | 0.851548 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
9 | NICT-5 | ja-zh | 2018/08/22 18:56:02 | 2055 | 0.851083 | 0.851670 | 0.850222 | NMT | No | Multi-layer-softmax for vanilla transformer. Train 6-layer model. Decode only using 3 layers. 2x faster than 6 layers.
10 | NICT-5 | ja-zh | 2018/09/10 14:09:18 | 2266 | 0.851382 | 0.851416 | 0.850944 | NMT | No | MLNMT
11 | srcb | ja-zh | 2018/08/26 11:37:12 | 2153 | 0.851766 | 0.850968 | 0.851032 | NMT | No | Transformer, average checkpoints.
12 | NICT-5 | ja-zh | 2018/08/27 15:00:25 | 2175 | 0.851890 | 0.850699 | 0.850580 | NMT | No | Transformer vanilla model
13 | ORGANIZER | ja-zh | 2017/08/02 01:06:05 | 1738 | 0.850199 | 0.850052 | 0.848394 | NMT | No | Google's "Attention Is All You Need"
14 | Kyoto-U | ja-zh | 2017/08/01 14:17:43 | 1722 | 0.850103 | 0.849168 | 0.847879 | NMT | No | Ensemble of 5 shared BPE, averaged
15 | Kyoto-U | ja-zh | 2017/07/31 15:24:48 | 1642 | 0.849464 | 0.848107 | 0.848318 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
16 | NICT-2 | ja-zh | 2017/07/26 14:00:26 | 1478 | 0.847223 | 0.846578 | 0.846158 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
17 | ORGANIZER | ja-zh | 2018/08/14 11:39:11 | 1903 | 0.844322 | 0.844572 | 0.844959 | NMT | No | NMT with Attention
18 | Kyoto-U | ja-zh | 2016/08/02 01:25:11 | 1071 | 0.837579 | 0.839354 | 0.835932 | NMT | No | 2 layer lstm dropout 0.5 200k source voc unk replaced
19 | NAIST | ja-zh | 2015/08/31 15:35:36 | 838 | 0.832765 | 0.834245 | 0.833721 | SMT | No | Travatar System with NeuralMT Reranking
20 | NAIST | ja-zh | 2014/08/01 17:27:20 | 123 | 0.829627 | 0.830839 | 0.830529 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
21 | bjtu_nlp | ja-zh | 2016/08/09 14:48:19 | 1120 | 0.829679 | 0.829113 | 0.827637 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
22 | Kyoto-U | ja-zh | 2015/08/27 13:51:08 | 793 | 0.826986 | 0.826919 | 0.827190 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
23 | NICT-2 | ja-zh | 2016/08/05 18:09:19 | 1105 | 0.820891 | 0.820069 | 0.821090 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC)
24 | NAIST | ja-zh | 2014/08/01 17:18:51 | 122 | 0.818040 | 0.819406 | 0.819492 | SMT | No | Travatar-based Forest-to-String SMT System
25 | TOSHIBA | ja-zh | 2015/08/17 16:29:35 | 676 | 0.817294 | 0.816984 | 0.816981 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model + post-processing
26 | NAIST | ja-zh | 2015/08/31 15:38:17 | 839 | 0.815084 | 0.816624 | 0.816462 | SMT | No | Travatar System Baseline
27 | Kyoto-U | ja-zh | 2016/08/05 23:26:20 | 1109 | 0.813114 | 0.813581 | 0.813054 | EBMT | No | KyotoEBMT 2016 w/o reranking
28 | TOSHIBA | ja-zh | 2015/07/23 14:49:40 | 505 | 0.813490 | 0.813233 | 0.813441 | SMT and RBMT | Yes | SPE(Statistical Post Editing) System
29 | Kyoto-U | ja-zh | 2015/07/31 00:35:46 | 545 | 0.810674 | 0.812372 | 0.811316 | EBMT | No | added one reordering feature, w/ reranking
30 | ORGANIZER | ja-zh | 2014/07/11 19:45:54 | 3 | 0.809128 | 0.809561 | 0.811394 | SMT | No | Hierarchical Phrase-based SMT (2014)
31 | ORGANIZER | ja-zh | 2014/07/11 20:00:28 | 10 | 0.807606 | 0.809457 | 0.808417 | SMT | No | String-to-Tree SMT (2014)
32 | ORGANIZER | ja-zh | 2015/09/10 14:12:41 | 881 | 0.807606 | 0.809457 | 0.808417 | SMT | No | String-to-Tree SMT (2015)
33 | NICT | ja-zh | 2014/09/01 09:23:36 | 260 | 0.806070 | 0.808684 | 0.807809 | SMT | No | Pre-reordering for phrase-based SMT (dependency parsing + manual rules)
34 | Kyoto-U | ja-zh | 2015/08/26 02:17:25 | 778 | 0.807083 | 0.808275 | 0.808010 | EBMT | No | KyotoEBMT system without reranking
35 | Kyoto-U | ja-zh | 2015/07/03 11:01:45 | 457 | 0.806771 | 0.807596 | 0.807432 | EBMT | No | Kyoto-U team WAT2015 baseline with reranking
36 | TOSHIBA | ja-zh | 2014/08/29 18:06:20 | 238 | 0.804444 | 0.803302 | 0.803980 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
37 | Kyoto-U | ja-zh | 2015/08/25 12:51:38 | 765 | 0.799725 | 0.800032 | 0.800988 | EBMT | No | escaping w/ reranking
38 | Kyoto-U | ja-zh | 2015/07/03 11:09:12 | 458 | 0.798663 | 0.799864 | 0.798748 | EBMT | No | Kyoto-U team WAT2015 baseline
39 | TMU | ja-zh | 2017/08/03 01:02:47 | 1743 | 0.798681 | 0.798736 | 0.797969 | NMT | No | JP-CN reconstructor baseline
40 | Kyoto-U | ja-zh | 2014/07/14 14:30:39 | 18 | 0.796402 | 0.798084 | 0.798383 | EBMT | No | Our baseline system.
41 | BJTUNLP | ja-zh | 2014/08/28 20:02:56 | 224 | 0.794834 | 0.796186 | 0.793054 | SMT | No |
42 | WASUIPS | ja-zh | 2014/09/17 01:11:02 | 377 | 0.794716 | 0.795786 | 0.795594 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1).
43 | WASUIPS | ja-zh | 2014/09/17 10:32:13 | 386 | 0.795721 | 0.795504 | 0.795129 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1).
44 | WASUIPS | ja-zh | 2014/09/17 12:07:07 | 390 | 0.794646 | 0.795307 | 0.794024 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
45 | WASUIPS | ja-zh | 2014/09/17 01:08:33 | 376 | 0.794244 | 0.793945 | 0.794823 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1).
46 | WASUIPS | ja-zh | 2014/09/17 10:29:24 | 385 | 0.793819 | 0.793308 | 0.793029 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1).
47 | Kyoto-U | ja-zh | 2014/08/31 23:38:07 | 257 | 0.791270 | 0.792166 | 0.790743 | EBMT | No | Our new baseline system after several modifications.
48 | WASUIPS | ja-zh | 2014/09/17 12:04:30 | 389 | 0.790498 | 0.791430 | 0.790142 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
49 | WASUIPS | ja-zh | 2014/09/17 00:54:35 | 373 | 0.790030 | 0.790460 | 0.790898 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0).
50 | ORGANIZER | ja-zh | 2014/07/11 19:50:50 | 7 | 0.788961 | 0.790263 | 0.790937 | SMT | No | Phrase-based SMT
51 | Kyoto-U | ja-zh | 2014/09/01 08:21:59 | 259 | 0.788321 | 0.789069 | 0.788206 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
52 | Sense | ja-zh | 2014/08/26 15:19:02 | 201 | 0.779495 | 0.779502 | 0.780262 | SMT | No | Character based SMT
53 | WASUIPS | ja-zh | 2014/09/17 00:47:46 | 371 | 0.776323 | 0.777615 | 0.777327 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0).
54 | WASUIPS | ja-zh | 2014/09/17 10:17:52 | 382 | 0.771952 | 0.773341 | 0.772107 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0).
55 | WASUIPS | ja-zh | 2014/09/17 10:15:13 | 381 | 0.767418 | 0.767414 | 0.766092 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0).
56 | TOSHIBA | ja-zh | 2014/08/29 17:59:06 | 236 | 0.764491 | 0.765346 | 0.763931 | RBMT | Yes | RBMT system
57 | ORGANIZER | ja-zh | 2014/08/29 18:51:05 | 243 | 0.744818 | 0.745885 | 0.743794 | RBMT | No | RBMT B (2014)
58 | ORGANIZER | ja-zh | 2015/09/10 14:32:38 | 886 | 0.744818 | 0.745885 | 0.743794 | Other | Yes | RBMT B (2015)
59 | ORGANIZER | ja-zh | 2016/11/16 10:58:30 | 1336 | 0.728453 | 0.728270 | 0.728284 | NMT | Yes | Online D (2016/11/14)
60 | ORGANIZER | ja-zh | 2016/07/26 12:18:34 | 1045 | 0.665185 | 0.667382 | 0.666953 | Other | Yes | Online D (2016)
61 | ORGANIZER | ja-zh | 2015/08/25 18:59:20 | 777 | 0.660484 | 0.660847 | 0.660482 | Other | Yes | Online D (2015)
62 | ORGANIZER | ja-zh | 2014/08/29 18:53:46 | 244 | 0.642278 | 0.648758 | 0.645385 | RBMT | No | RBMT C
63 | TMU | ja-zh | 2018/09/19 10:58:57 | 2505 | 0.621413 | 0.623292 | 0.622094 | NMT | Yes | Unsupervised NMT using Sub-character level information. JPO patent data was used as monolingual data in the training process.
64 | ORGANIZER | ja-zh | 2015/09/11 10:11:23 | 891 | 0.611964 | 0.615048 | 0.612158 | Other | Yes | Online C (2015)
65 | ORGANIZER | ja-zh | 2014/08/28 12:11:11 | 216 | 0.612808 | 0.613075 | 0.611563 | Other | Yes | Online C (2014)
66 | ORGANIZER | ja-zh | 2014/07/18 11:10:37 | 37 | 0.606905 | 0.606328 | 0.604149 | Other | Yes | Online D (2014)


AMFM
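AM-FM (Adequacy Metric - Fluency Metric) combines a reference-based adequacy component (AM, a vector-space similarity between candidate and reference) with a language-model-based fluency component (FM), interpolated roughly as AMFM = lambda * AM + (1 - lambda) * FM. This is a summary of the general AM-FM framework, not necessarily of WAT's exact configuration.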


(In the original table the reported AM-FM value is identical under every segmentation column, so a single score is shown per system; unused columns are omitted. Rows are ranked by AM-FM score.)

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | srcb | ja-zh | 2018/09/16 14:47:09 | 2473 | 0.791120 | NMT | No | Transformer with relative position, ensemble of 10 models.
2 | Kyoto-U+ECNU | ja-zh | 2020/09/17 18:43:01 | 3814 | 0.787730 | NMT | Yes | ensemble 8 models: structures(LSTM, Transformer, ConvS2S, Lightconv), training data(BT, out-of-domain parallel), S2S settings(deeper transformer, deep encoder shallow decoder)
3 | srcb | ja-zh | 2018/08/26 11:37:12 | 2153 | 0.787570 | NMT | No | Transformer, average checkpoints.
4 | ORGANIZER | ja-zh | 2017/08/02 01:06:05 | 1738 | 0.787250 | NMT | No | Google's "Attention Is All You Need"
5 | srcb | ja-zh | 2019/07/27 15:34:31 | 3208 | 0.787220 | NMT | No | Transformer with num_units=768, relative position, sentence-wise smooth, encoding side word drop, norm-based batch filtering, residual connection norm, ensemble of 8 models.
6 | Kyoto-U+ECNU | ja-zh | 2020/09/19 16:56:52 | 4053 | 0.786870 | NMT | No | without out-of-domain parallel data; others same as DataID:3814
7 | srcb | ja-zh | 2019/07/25 11:30:58 | 2916 | 0.786600 | NMT | No | Transformer with relative position, sentence-wise smooth, encoder side word drop.
8 | NICT-2 | ja-zh | 2017/07/26 14:11:42 | 1483 | 0.785820 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
9 | NICT-5 | ja-zh | 2018/08/27 15:00:25 | 2175 | 0.785440 | NMT | No | Transformer vanilla model
10 | Kyoto-U | ja-zh | 2017/08/01 14:17:43 | 1722 | 0.785420 | NMT | No | Ensemble of 5 shared BPE, averaged
11 | NICT-5 | ja-zh | 2018/08/22 18:56:02 | 2055 | 0.784340 | NMT | No | Multi-layer-softmax for vanilla transformer. Train 6-layer model. Decode only using 3 layers. 2x faster than 6 layers.
12 | Kyoto-U+ECNU | ja-zh | 2020/09/10 23:54:57 | 3676 | 0.783970 | NMT | No | forward-translation by using ja monolingual data from ASPEC-JE; lightconv (pay less attention) single model without ensemble
13 | NICT-5 | ja-zh | 2018/09/10 14:09:18 | 2266 | 0.781410 | NMT | No | MLNMT
14 | KNU_Hyundai | ja-zh | 2019/07/27 08:54:03 | 3170 | 0.781350 | NMT | Yes | Transformer(base) + *Used JPC corpus* with relative position, bt, r2l rerank, 4-model ensemble
15 | NICT-2 | ja-zh | 2017/07/26 14:00:26 | 1478 | 0.779870 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
16 | Kyoto-U | ja-zh | 2017/07/31 15:24:48 | 1642 | 0.779400 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
17 | ORGANIZER | ja-zh | 2018/08/14 11:39:11 | 1903 | 0.777600 | NMT | No | NMT with Attention
18 | Kyoto-U | ja-zh | 2015/08/27 13:51:08 | 793 | 0.768470 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
19 | Kyoto-U | ja-zh | 2015/08/26 02:17:25 | 778 | 0.765440 | EBMT | No | KyotoEBMT system without reranking
20 | Kyoto-U | ja-zh | 2015/07/03 11:01:45 | 457 | 0.765320 | EBMT | No | Kyoto-U team WAT2015 baseline with reranking
21 | Kyoto-U | ja-zh | 2015/07/03 11:09:12 | 458 | 0.764530 | EBMT | No | Kyoto-U team WAT2015 baseline
22 | Kyoto-U | ja-zh | 2016/08/05 23:26:20 | 1109 | 0.764230 | EBMT | No | KyotoEBMT 2016 w/o reranking
23 | NAIST | ja-zh | 2015/08/31 15:35:36 | 838 | 0.763390 | SMT | No | Travatar System with NeuralMT Reranking
24 | Kyoto-U | ja-zh | 2016/08/02 01:25:11 | 1071 | 0.763290 | NMT | No | 2 layer lstm dropout 0.5 200k source voc unk replaced
25 | Kyoto-U | ja-zh | 2015/07/31 00:35:46 | 545 | 0.763020 | EBMT | No | added one reordering feature, w/ reranking
26 | TOSHIBA | ja-zh | 2015/08/17 16:29:35 | 676 | 0.762520 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model + post-processing
27 | TOSHIBA | ja-zh | 2015/07/23 14:49:40 | 505 | 0.762060 | SMT and RBMT | Yes | SPE(Statistical Post Editing) System
28 | NAIST | ja-zh | 2014/08/01 17:18:51 | 122 | 0.759740 | SMT | No | Travatar-based Forest-to-String SMT System
29 | NICT-2 | ja-zh | 2016/08/05 18:09:19 | 1105 | 0.759670 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC)
30 | NAIST | ja-zh | 2014/08/01 17:27:20 | 123 | 0.758480 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
31 | Kyoto-U | ja-zh | 2015/08/25 12:51:38 | 765 | 0.757440 | EBMT | No | escaping w/ reranking
32 | NAIST | ja-zh | 2015/08/31 15:38:17 | 839 | 0.756990 | SMT | No | Travatar System Baseline
33 | ORGANIZER | ja-zh | 2014/07/11 20:00:28 | 10 | 0.755230 | SMT | No | String-to-Tree SMT (2014)
34 | ORGANIZER | ja-zh | 2015/09/10 14:12:41 | 881 | 0.755230 | SMT | No | String-to-Tree SMT (2015)
35 | bjtu_nlp | ja-zh | 2016/08/09 14:48:19 | 1120 | 0.754690 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
36 | Kyoto-U | ja-zh | 2014/08/31 23:38:07 | 257 | 0.754050 | EBMT | No | Our new baseline system after several modifications.
37 | Kyoto-U | ja-zh | 2014/09/01 08:21:59 | 259 | 0.751740 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
38 | WASUIPS | ja-zh | 2014/09/17 01:08:33 | 376 | 0.750240 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1).
39 | WASUIPS | ja-zh | 2014/09/17 01:11:02 | 377 | 0.750220 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1).
40 | WASUIPS | ja-zh | 2014/09/17 10:29:24 | 385 | 0.749470 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1).
41 | ORGANIZER | ja-zh | 2014/07/11 19:50:50 | 7 | 0.749450 | SMT | No | Phrase-based SMT
42 | WASUIPS | ja-zh | 2014/09/17 10:32:13 | 386 | 0.748360 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1).
43 | WASUIPS | ja-zh | 2014/09/17 12:07:07 | 390 | 0.747890 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
44 | Kyoto-U | ja-zh | 2014/07/14 14:30:39 | 18 | 0.747090 | EBMT | No | Our baseline system.
45 | Sense | ja-zh | 2014/08/26 15:19:02 | 201 | 0.746750 | SMT | No | Character based SMT
46 | TOSHIBA | ja-zh | 2014/08/29 18:06:20 | 238 | 0.746000 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
47 | NICT | ja-zh | 2014/09/01 09:23:36 | 260 | 0.745980 | SMT | No | Pre-reordering for phrase-based SMT (dependency parsing + manual rules)
48 | ORGANIZER | ja-zh | 2014/07/11 19:45:54 | 3 | 0.745100 | SMT | No | Hierarchical Phrase-based SMT (2014)
49 | WASUIPS | ja-zh | 2014/09/17 00:54:35 | 373 | 0.744150 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0).
50 | WASUIPS | ja-zh | 2014/09/17 12:04:30 | 389 | 0.741490 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
51 | WASUIPS | ja-zh | 2014/09/17 00:47:46 | 371 | 0.728650 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0).
52 | WASUIPS | ja-zh | 2014/09/17 10:15:13 | 381 | 0.727920 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0).
53 | BJTUNLP | ja-zh | 2014/08/28 20:02:56 | 224 | 0.727700 | SMT | No |
54 | WASUIPS | ja-zh | 2014/09/17 10:17:52 | 382 | 0.725500 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0).
55 | TMU | ja-zh | 2017/08/03 01:02:47 | 1743 | 0.700030 | NMT | No | JP-CN reconstructor baseline
56 | TOSHIBA | ja-zh | 2014/08/29 17:59:06 | 236 | 0.685380 | RBMT | Yes | RBMT system
57 | ORGANIZER | ja-zh | 2016/11/16 10:58:30 | 1336 | 0.673730 | NMT | Yes | Online D (2016/11/14)
58 | ORGANIZER | ja-zh | 2014/08/29 18:51:05 | 243 | 0.667960 | RBMT | No | RBMT B (2014)
59 | ORGANIZER | ja-zh | 2015/09/10 14:32:38 | 886 | 0.667960 | Other | Yes | RBMT B (2015)
60 | ORGANIZER | ja-zh | 2016/07/26 12:18:34 | 1045 | 0.639440 | Other | Yes | Online D (2016)
61 | ORGANIZER | ja-zh | 2015/08/25 18:59:20 | 777 | 0.634090 | Other | Yes | Online D (2015)
62 | ORGANIZER | ja-zh | 2014/07/18 11:10:37 | 37 | 0.625430 | Other | Yes | Online D (2014)
63 | ORGANIZER | ja-zh | 2014/08/29 18:53:46 | 244 | 0.594900 | RBMT | No | RBMT C
64 | ORGANIZER | ja-zh | 2014/08/28 12:11:11 | 216 | 0.587820 | Other | Yes | Online C (2014)
65 | ORGANIZER | ja-zh | 2015/09/11 10:11:23 | 891 | 0.566060 | Other | Yes | Online C (2015)
66 | TMU | ja-zh | 2018/09/19 10:58:57 | 2505 | 0.545630 | NMT | Yes | Unsupervised NMT using Sub-character level information. JPO patent data was used as monolingual data in the training process.


HUMAN (WAT2022)
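The HUMAN tables in this and the following sections list only the submissions selected for human evaluation in each year, so they are much shorter than the automatic-metric tables above (and empty for years with no evaluated ja-zh submissions). Through WAT2019 the HUMAN column is the pairwise crowdsourced evaluation score, which compares a system's outputs sentence by sentence against a baseline system and ranges from -100 to +100; negative values mean the system lost to the baseline more often than it won. The WAT2020 figures appear to be adequacy scores on a 5-point scale.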


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U+ECNU | ja-zh | 2020/09/17 18:43:01 | 3814 | 4.180 | NMT | Yes | ensemble 8 models: structures(LSTM, Transformer, ConvS2S, Lightconv), training data(BT, out-of-domain parallel), S2S settings(deeper transformer, deep encoder shallow decoder)
2 | Kyoto-U+ECNU | ja-zh | 2020/09/19 16:56:52 | 4053 | 4.170 | NMT | No | without out-of-domain parallel data; others same as DataID:3814


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | KNU_Hyundai | ja-zh | 2019/07/27 08:54:03 | 3170 | Underway | NMT | Yes | Transformer(base) + *Used JPC corpus* with relative position, bt, r2l rerank, 4-model ensemble
2 | srcb | ja-zh | 2019/07/27 15:34:31 | 3208 | Underway | NMT | No | Transformer with num_units=768, relative position, sentence-wise smooth, encoding side word drop, norm-based batch filtering, residual connection norm, ensemble of 8 models.


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | srcb | ja-zh | 2018/09/16 14:47:09 | 2473 | 14.000 | NMT | No | Transformer with relative position, ensemble of 10 models.
2 | NICT-5 | ja-zh | 2018/09/10 14:09:18 | 2266 | 7.000 | NMT | No | MLNMT
3 | NICT-5 | ja-zh | 2018/08/27 15:00:25 | 2175 | 5.250 | NMT | No | Transformer vanilla model


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | ja-zh | 2017/08/01 14:17:43 | 1722 | 72.500 | NMT | No | Ensemble of 5 shared BPE, averaged
2 | Kyoto-U | ja-zh | 2017/07/31 15:24:48 | 1642 | 71.500 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
3 | ORGANIZER | ja-zh | 2017/08/02 01:06:05 | 1738 | 70.500 | NMT | No | Google's "Attention Is All You Need"
4 | NICT-2 | ja-zh | 2017/07/26 14:11:42 | 1483 | 69.500 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
5 | NICT-2 | ja-zh | 2017/07/26 14:00:26 | 1478 | 67.250 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
6 | TMU | ja-zh | 2017/08/03 01:02:47 | 1743 | 4.250 | NMT | No | JP-CN reconstructor baseline


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | ja-zh | 2016/08/02 01:25:11 | 1071 | 58.750 | NMT | No | 2 layer lstm dropout 0.5 200k source voc unk replaced
2 | bjtu_nlp | ja-zh | 2016/08/09 14:48:19 | 1120 | 46.250 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
3 | Kyoto-U | ja-zh | 2016/08/05 23:26:20 | 1109 | 30.750 | EBMT | No | KyotoEBMT 2016 w/o reranking
4 | NICT-2 | ja-zh | 2016/08/05 18:09:19 | 1105 | 24.000 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC)
5 | ORGANIZER | ja-zh | 2016/11/16 10:58:30 | 1336 | 17.750 | NMT | Yes | Online D (2016/11/14)
6 | ORGANIZER | ja-zh | 2016/07/26 12:18:34 | 1045 | -26.000 | Other | Yes | Online D (2016)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | TOSHIBA | ja-zh | 2015/08/17 16:29:35 | 676 | 17.000 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model + post-processing
2 | Kyoto-U | ja-zh | 2015/08/26 02:17:25 | 778 | 16.000 | EBMT | No | KyotoEBMT system without reranking
3 | Kyoto-U | ja-zh | 2015/08/27 13:51:08 | 793 | 12.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
4 | ORGANIZER | ja-zh | 2015/09/10 14:12:41 | 881 | 7.750 | SMT | No | String-to-Tree SMT (2015)
5 | NAIST | ja-zh | 2015/08/31 15:35:36 | 838 | 7.000 | SMT | No | Travatar System with NeuralMT Reranking
6 | NAIST | ja-zh | 2015/08/31 15:38:17 | 839 | 2.750 | SMT | No | Travatar System Baseline
7 | TOSHIBA | ja-zh | 2015/07/23 14:49:40 | 505 | 2.500 | SMT and RBMT | Yes | SPE(Statistical Post Editing) System
8 | ORGANIZER | ja-zh | 2015/09/10 14:32:38 | 886 | -11.000 | Other | Yes | RBMT B (2015)
9 | ORGANIZER | ja-zh | 2015/08/25 18:59:20 | 777 | -14.750 | Other | Yes | Online D (2015)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-zh | 2014/08/01 17:18:51 | 122 | 17.750 | SMT | No | Travatar-based Forest-to-String SMT System
2 | ORGANIZER | ja-zh | 2014/07/11 20:00:28 | 10 | 14.000 | SMT | No | String-to-Tree SMT (2014)
3 | Sense | ja-zh | 2014/08/26 15:19:02 | 201 | 10.000 | SMT | No | Character based SMT
4 | NICT | ja-zh | 2014/09/01 09:23:36 | 260 | 6.500 | SMT | No | Pre-reordering for phrase-based SMT (dependency parsing + manual rules)
5 | ORGANIZER | ja-zh | 2014/07/11 19:45:54 | 3 | 3.750 | SMT | No | Hierarchical Phrase-based SMT (2014)
6 | NAIST | ja-zh | 2014/08/01 17:27:20 | 123 | 1.250 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
7 | TOSHIBA | ja-zh | 2014/08/29 18:06:20 | 238 | 0.750 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
8 | Kyoto-U | ja-zh | 2014/08/31 23:38:07 | 257 | -0.750 | EBMT | No | Our new baseline system after several modifications.
9 | BJTUNLP | ja-zh | 2014/08/28 20:02:56 | 224 | -3.750 | SMT | No |
10 | TOSHIBA | ja-zh | 2014/08/29 17:59:06 | 236 | -5.250 | RBMT | Yes | RBMT system
11 | Kyoto-U | ja-zh | 2014/09/01 08:21:59 | 259 | -8.750 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
12 | ORGANIZER | ja-zh | 2014/07/18 11:10:37 | 37 | -14.500 | Other | Yes | Online D (2014)
13 | ORGANIZER | ja-zh | 2014/08/29 18:51:05 | 243 | -20.000 | RBMT | No | RBMT B (2014)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,
you may use the information about the translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02