
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


All entries are for the ja-en task. Only the moses-tokenizer BLEU score is shown; the juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, and indic-tokenizer columns contained no scores for this task.

# | Team | Date/Time | DataID | BLEU | Method | Other Resources | System Description
--|------|-----------|--------|------|--------|-----------------|-------------------
1 | NTT | 2017/08/01 04:29:02 | 1681 | 28.36 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
2 | NTT | 2017/08/01 15:53:15 | 1732 | 28.15 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
3 | ORGANIZER | 2017/08/02 01:03:08 | 1736 | 28.06 | NMT | No | Google's "Attention Is All You Need"
4 | Kyoto-U | 2017/08/01 16:38:21 | 1733 | 27.66 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
5 | NTT | 2017/08/01 14:50:51 | 1724 | 27.62 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
6 | NAIST | 2016/08/27 00:05:38 | 1275 | 27.55 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
7 | Kyoto-U | 2017/08/01 13:42:25 | 1717 | 27.53 | NMT | No | Ensemble of 4 BPE averaged parameters
8 | NTT | 2017/07/30 20:43:07 | 1616 | 27.43 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
9 | NICT-2 | 2017/07/26 14:04:38 | 1480 | 26.76 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
10 | NAIST | 2016/08/09 16:14:05 | 1122 | 26.39 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
11 | Kyoto-U | 2016/08/18 15:17:05 | 1182 | 26.22 | NMT | No | Ensemble of 4 single-layer model (30k voc)
12 | NAIST | 2016/08/20 15:33:12 | 1247 | 26.12 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
13 | NAIST | 2015/08/14 17:46:43 | 655 | 25.41 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
14 | NICT-2 | 2017/07/26 13:54:28 | 1476 | 24.79 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
15 | NAIST | 2015/08/24 23:53:53 | 757 | 24.77 | SMT | No | Travatar System with NeuralMT Reranking
16 | Kyoto-U | 2016/08/20 15:07:47 | 1246 | 24.71 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
17 | TMU | 2017/08/04 11:16:36 | 1750 | 24.55 | NMT | No | beam_size: 10, ensemble of different dropout rates.
18 | NAIST | 2014/07/19 01:04:48 | 46 | 23.82 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
19 | NAIST | 2014/08/01 17:35:16 | 125 | 23.47 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
20 | CUNI | 2017/07/31 22:30:52 | 1665 | 23.43 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
21 | NAIST | 2014/07/31 11:40:53 | 119 | 23.29 | SMT | No | Travatar-based Forest-to-String SMT System
22 | TMU | 2017/08/01 11:35:08 | 1703 | 23.03 | NMT | No | baseline system with beam20
23 | TOSHIBA | 2015/07/23 15:00:12 | 506 | 23.00 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
24 | TOSHIBA | 2015/07/28 16:44:27 | 529 | 22.89 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
25 | Kyoto-U | 2015/08/30 13:02:18 | 829 | 22.89 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
26 | TMU | 2017/08/01 12:14:41 | 1712 | 22.87 | NMT | No | the ensemble system of different dropout rate.
27 | NAIST | 2015/08/25 13:02:45 | 766 | 22.62 | SMT | No | Travatar System with Parser Self Training
28 | NAIST | 2015/08/25 13:03:48 | 767 | 22.16 | SMT | No | Travatar System Baseline
29 | ORGANIZER | 2016/11/16 10:29:51 | 1333 | 22.04 | NMT | Yes | Online D (2016/11/14)
30 | NICT-2 | 2016/08/05 17:50:25 | 1104 | 21.54 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
31 | Kyoto-U | 2015/08/27 14:40:32 | 796 | 21.31 | EBMT | No | KyotoEBMT system without reranking
32 | Kyoto-U | 2016/08/19 01:31:01 | 1189 | 21.22 | EBMT | No | KyotoEBMT 2016 w/o reranking
33 | Kyoto-U | 2014/09/01 10:27:54 | 262 | 21.07 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
34 | TMU | 2017/08/01 11:10:49 | 1695 | 21.00 | NMT | No | our baseline system in 2017
35 | TOSHIBA | 2014/08/29 18:48:24 | 241 | 20.61 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
36 | Kyoto-U | 2014/08/31 23:36:50 | 256 | 20.60 | EBMT | No | Our new baseline system after several modifications.
37 | ORGANIZER | 2014/07/11 19:59:55 | 9 | 20.36 | SMT | No | String-to-Tree SMT (2014)
38 | ORGANIZER | 2015/09/10 13:41:03 | 877 | 20.36 | SMT | No | String-to-Tree SMT (2015)
39 | Kyoto-U | 2014/08/19 10:24:06 | 136 | 20.02 | EBMT | No | Our baseline system using 3M parallel sentences.
40 | EIWA | 2014/07/30 16:07:14 | 116 | 19.86 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
41 | NICT | 2015/07/16 13:27:58 | 488 | 18.98 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
42 | NICT | 2015/07/17 11:02:10 | 492 | 18.96 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
43 | Sense | 2014/08/23 05:34:03 | 164 | 18.82 | SMT | No | Paraphrase max10
44 | ORGANIZER | 2014/07/11 19:45:32 | 2 | 18.72 | SMT | No | Hierarchical Phrase-based SMT (2014)
45 | ORGANIZER | 2014/07/11 19:49:57 | 6 | 18.45 | SMT | No | Phrase-based SMT
46 | TMU | 2016/08/20 14:31:48 | 1234 | 18.45 | NMT | No | 6 ensemble
47 | bjtu_nlp | 2016/08/17 19:51:24 | 1168 | 18.34 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
48 | TMU | 2015/08/04 16:32:20 | 578 | 18.32 | SMT | No | Our PBSMT baseline (2015)
49 | TMU | 2016/08/20 07:39:02 | 1222 | 18.29 | NMT | No | 2016 our proposed method to control output voice
50 | Sense | 2015/08/28 19:25:25 | 822 | 18.20 | SMT | No | Baseline-2015 (train1 only)
51 | NICT | 2015/07/17 08:51:45 | 489 | 18.09 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
52 | Sense | 2015/08/29 04:32:36 | 824 | 18.09 | SMT | No | Baseline-2015 (train123)
53 | NII | 2014/09/02 11:42:01 | 271 | 17.47 | SMT | No | Our Baseline
54 | Sense | 2015/07/28 22:23:43 | 530 | 17.04 | SMT | No | Baseline-2015
55 | NII | 2014/09/02 11:42:53 | 272 | 17.01 | SMT | No | Our Baseline with Preordering
56 | Sense | 2015/09/01 17:42:28 | 860 | 16.96 | SMT | Yes | Passive JSTx1
57 | ORGANIZER | 2016/07/26 11:37:38 | 1042 | 16.91 | Other | Yes | Online D (2016)
58 | ORGANIZER | 2015/08/25 18:57:25 | 775 | 16.85 | Other | Yes | Online D (2015)
59 | Sense | 2015/08/18 21:54:39 | 702 | 16.72 | SMT | Yes | Passive JSTx1
60 | Sense | 2015/09/01 17:42:58 | 861 | 16.61 | SMT | Yes | Pervasive JSTx1
61 | Sense | 2015/08/18 21:58:08 | 708 | 16.49 | SMT | Yes | Pervasive JSTx1
62 | TMU | 2014/09/07 23:32:49 | 301 | 15.95 | SMT | No | Our baseline system with another preordering method
63 | TMU | 2015/09/01 05:46:50 | 847 | 15.85 | SMT | No | PBSMT with dependency based phrase segmentation
64 | TOSHIBA | 2014/08/29 18:47:44 | 240 | 15.69 | RBMT | Yes | RBMT system
65 | TMU | 2014/09/07 23:28:04 | 300 | 15.55 | SMT | No | Our baseline system with preordering method
66 | TMU | 2014/09/09 19:14:42 | 307 | 15.40 | SMT | No | Our baseline system
67 | ORGANIZER | 2014/07/23 14:52:31 | 96 | 15.29 | Other | Yes | RBMT D (2014)
68 | ORGANIZER | 2015/09/10 14:38:13 | 887 | 15.29 | Other | Yes | RBMT D (2015)
69 | ORGANIZER | 2014/07/18 11:08:13 | 35 | 15.08 | Other | Yes | Online D (2014)
70 | ORGANIZER | 2014/07/21 11:53:48 | 76 | 14.82 | Other | Yes | RBMT E
71 | ORGANIZER | 2014/07/21 11:57:08 | 79 | 13.86 | Other | Yes | RBMT F
72 | ORGANIZER | 2014/07/22 11:22:40 | 87 | 10.64 | Other | Yes | Online C (2014)
73 | ORGANIZER | 2015/09/11 10:54:33 | 892 | 10.29 | Other | Yes | Online C (2015)
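The BLEU scores above were computed by the organizers after tokenizing system output with the Moses tokenizer. As a reminder of what the metric measures, here is a minimal corpus-level BLEU (clipped n-gram precisions, geometric mean, brevity penalty) in pure Python. This is an illustrative sketch only, not the organizers' scoring pipeline, and the example sentences are invented.

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU: clipped n-gram precisions for n=1..max_n,
    combined by geometric mean and scaled by a brevity penalty."""
    match = [0] * max_n   # clipped n-gram matches, per order
    total = [0] * max_n   # hypothesis n-gram counts, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        hyp_len += len(hyp)
        ref_len += len(ref)
        for n in range(1, max_n + 1):
            h, r = ngrams(hyp, n), ngrams(ref, n)
            match[n - 1] += sum(min(c, r[g]) for g, c in h.items())
            total[n - 1] += max(len(hyp) - n + 1, 0)
    if min(match) == 0:           # some order had no matches at all
        return 0.0
    log_prec = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return bp * math.exp(log_prec)

hyp = "the patent application was filed in japan".split()
ref = "the patent application was filed in japan".split()
print(round(100 * corpus_bleu([hyp], [ref]), 2))  # identical strings score 100.0
```

Reported BLEU values such as 28.36 are this quantity multiplied by 100 over the whole test set.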


RIBES


All entries are for the ja-en task. Only the moses-tokenizer RIBES score is shown; the juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, and indic-tokenizer columns contained no scores for this task.

# | Team | Date/Time | DataID | RIBES | Method | Other Resources | System Description
--|------|-----------|--------|-------|--------|-----------------|-------------------
1 | NTT | 2017/08/01 15:53:15 | 1732 | 0.769430 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
2 | NTT | 2017/08/01 04:29:02 | 1681 | 0.768880 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
3 | NAIST | 2016/08/27 00:05:38 | 1275 | 0.767661 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
4 | ORGANIZER | 2017/08/02 01:03:08 | 1736 | 0.767577 | NMT | No | Google's "Attention Is All You Need"
5 | Kyoto-U | 2017/08/01 16:38:21 | 1733 | 0.765464 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
6 | NTT | 2017/07/30 20:43:07 | 1616 | 0.764831 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
7 | NTT | 2017/08/01 14:50:51 | 1724 | 0.763248 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
8 | NAIST | 2016/08/09 16:14:05 | 1122 | 0.762712 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
9 | Kyoto-U | 2017/08/01 13:42:25 | 1717 | 0.761403 | NMT | No | Ensemble of 4 BPE averaged parameters
10 | NAIST | 2016/08/20 15:33:12 | 1247 | 0.756956 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
11 | Kyoto-U | 2016/08/18 15:17:05 | 1182 | 0.756601 | NMT | No | Ensemble of 4 single-layer model (30k voc)
12 | Kyoto-U | 2016/08/20 15:07:47 | 1246 | 0.750802 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
13 | NAIST | 2015/08/14 17:46:43 | 655 | 0.749573 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
14 | NICT-2 | 2017/07/26 13:54:28 | 1476 | 0.747335 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
15 | TMU | 2017/08/04 11:16:36 | 1750 | 0.744928 | NMT | No | beam_size: 10, ensemble of different dropout rates.
16 | NAIST | 2015/08/24 23:53:53 | 757 | 0.743771 | SMT | No | Travatar System with NeuralMT Reranking
17 | CUNI | 2017/07/31 22:30:52 | 1665 | 0.741699 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
18 | NICT-2 | 2017/07/26 14:04:38 | 1480 | 0.741329 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
19 | TMU | 2017/08/01 11:35:08 | 1703 | 0.741175 | NMT | No | baseline system with beam20
20 | TMU | 2017/08/01 12:14:41 | 1712 | 0.735908 | NMT | No | the ensemble system of different dropout rate.
21 | ORGANIZER | 2016/11/16 10:29:51 | 1333 | 0.733483 | NMT | Yes | Online D (2016/11/14)
22 | TMU | 2017/08/01 11:10:49 | 1695 | 0.725284 | NMT | No | our baseline system in 2017
23 | Kyoto-U | 2015/08/30 13:02:18 | 829 | 0.724555 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
24 | NAIST | 2014/08/01 17:35:16 | 125 | 0.723670 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
25 | NAIST | 2014/07/31 11:40:53 | 119 | 0.723541 | SMT | No | Travatar-based Forest-to-String SMT System
26 | NAIST | 2015/08/25 13:02:45 | 766 | 0.722798 | SMT | No | Travatar System with Parser Self Training
27 | NAIST | 2014/07/19 01:04:48 | 46 | 0.722599 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
28 | TOSHIBA | 2015/07/28 16:44:27 | 529 | 0.718540 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
29 | TOSHIBA | 2015/07/23 15:00:12 | 506 | 0.715795 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
30 | NAIST | 2015/08/25 13:03:48 | 767 | 0.713083 | SMT | No | Travatar System Baseline
31 | TMU | 2016/08/20 14:31:48 | 1234 | 0.711542 | NMT | No | 6 ensemble
32 | TMU | 2016/08/20 07:39:02 | 1222 | 0.710613 | NMT | No | 2016 our proposed method to control output voice
33 | NICT-2 | 2016/08/05 17:50:25 | 1104 | 0.708808 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
34 | TOSHIBA | 2014/08/29 18:48:24 | 241 | 0.707936 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
35 | EIWA | 2014/07/30 16:07:14 | 116 | 0.706686 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
36 | Kyoto-U | 2015/08/27 14:40:32 | 796 | 0.706480 | EBMT | No | KyotoEBMT system without reranking
37 | Kyoto-U | 2016/08/19 01:31:01 | 1189 | 0.705700 | EBMT | No | KyotoEBMT 2016 w/o reranking
38 | Kyoto-U | 2014/08/31 23:36:50 | 256 | 0.701154 | EBMT | No | Our new baseline system after several modifications.
39 | Kyoto-U | 2014/09/01 10:27:54 | 262 | 0.698953 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
40 | bjtu_nlp | 2016/08/17 19:51:24 | 1168 | 0.690455 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
41 | Kyoto-U | 2014/08/19 10:24:06 | 136 | 0.689829 | EBMT | No | Our baseline system using 3M parallel sentences.
42 | TOSHIBA | 2014/08/29 18:47:44 | 240 | 0.687122 | RBMT | Yes | RBMT system
43 | NICT | 2015/07/17 11:02:10 | 492 | 0.684485 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
44 | ORGANIZER | 2014/07/23 14:52:31 | 96 | 0.683378 | Other | Yes | RBMT D (2014)
45 | ORGANIZER | 2015/09/10 14:38:13 | 887 | 0.683378 | Other | Yes | RBMT D (2015)
46 | ORGANIZER | 2014/07/11 19:59:55 | 9 | 0.678253 | SMT | No | String-to-Tree SMT (2014)
47 | ORGANIZER | 2015/09/10 13:41:03 | 877 | 0.678253 | SMT | No | String-to-Tree SMT (2015)
48 | ORGANIZER | 2016/07/26 11:37:38 | 1042 | 0.677412 | Other | Yes | Online D (2016)
49 | ORGANIZER | 2015/08/25 18:57:25 | 775 | 0.676609 | Other | Yes | Online D (2015)
50 | ORGANIZER | 2014/07/21 11:53:48 | 76 | 0.663851 | Other | Yes | RBMT E
51 | ORGANIZER | 2014/07/21 11:57:08 | 79 | 0.661387 | Other | Yes | RBMT F
52 | NICT | 2015/07/16 13:27:58 | 488 | 0.659883 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
53 | ORGANIZER | 2014/07/11 19:45:32 | 2 | 0.651066 | SMT | No | Hierarchical Phrase-based SMT (2014)
54 | TMU | 2014/09/07 23:32:49 | 301 | 0.648879 | SMT | No | Our baseline system with another preordering method
55 | Sense | 2014/08/23 05:34:03 | 164 | 0.646204 | SMT | No | Paraphrase max10
56 | ORGANIZER | 2014/07/11 19:49:57 | 6 | 0.645137 | SMT | No | Phrase-based SMT
57 | TMU | 2014/09/07 23:28:04 | 300 | 0.644698 | SMT | No | Our baseline system with preordering method
58 | ORGANIZER | 2014/07/18 11:08:13 | 35 | 0.643588 | Other | Yes | Online D (2014)
59 | TMU | 2015/08/04 16:32:20 | 578 | 0.641456 | SMT | No | Our PBSMT baseline (2015)
60 | NICT | 2015/07/17 08:51:45 | 489 | 0.639711 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
61 | Sense | 2015/08/29 04:32:36 | 824 | 0.633073 | SMT | No | Baseline-2015 (train123)
62 | NII | 2014/09/02 11:42:01 | 271 | 0.630825 | SMT | No | Our Baseline
63 | Sense | 2015/08/28 19:25:25 | 822 | 0.629066 | SMT | No | Baseline-2015 (train1 only)
64 | TMU | 2015/09/01 05:46:50 | 847 | 0.628897 | SMT | No | PBSMT with dependency based phrase segmentation
65 | Sense | 2015/07/28 22:23:43 | 530 | 0.627006 | SMT | No | Baseline-2015
66 | ORGANIZER | 2014/07/22 11:22:40 | 87 | 0.624827 | Other | Yes | Online C (2014)
67 | ORGANIZER | 2015/09/11 10:54:33 | 892 | 0.622564 | Other | Yes | Online C (2015)
68 | TMU | 2014/09/09 19:14:42 | 307 | 0.613119 | SMT | No | Our baseline system
69 | NII | 2014/09/02 11:42:53 | 272 | 0.610833 | SMT | No | Our Baseline with Preordering
70 | Sense | 2015/09/01 17:42:28 | 860 | 0.610775 | SMT | Yes | Passive JSTx1
71 | Sense | 2015/08/18 21:54:39 | 702 | 0.609632 | SMT | Yes | Passive JSTx1
72 | Sense | 2015/09/01 17:42:58 | 861 | 0.609008 | SMT | Yes | Pervasive JSTx1
73 | Sense | 2015/08/18 21:58:08 | 708 | 0.600806 | SMT | Yes | Pervasive JSTx1
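RIBES (Isozaki et al., 2010) was designed for language pairs with very different word order, such as Japanese-English: it scores how well the hypothesis preserves the reference's global word order via a rank correlation, scaled by a unigram-precision term and a brevity penalty. The sketch below is a simplification for illustration: it aligns only words occurring exactly once in both sentences (the official implementation uses a richer alignment), with the commonly used exponents alpha=0.25 and beta=0.10 assumed.

```python
import math
from collections import Counter

def ribes_sketch(hyp, ref, alpha=0.25, beta=0.10):
    """Simplified RIBES: normalized Kendall's tau over reference-order
    ranks of aligned words, times unigram precision**alpha and a
    brevity penalty**beta. Alignment is restricted to words occurring
    exactly once in both sentences (a simplification)."""
    hc, rc = Counter(hyp), Counter(ref)
    # Reference positions of alignable words, in hypothesis order.
    ranks = [ref.index(w) for w in hyp if hc[w] == 1 and rc[w] == 1]
    n = len(ranks)
    if n < 2:
        return 0.0
    # Fraction of concordant pairs = (Kendall's tau + 1) / 2.
    concordant = sum(1 for i in range(n) for j in range(i + 1, n)
                     if ranks[i] < ranks[j])
    nkt = concordant / (n * (n - 1) / 2)
    precision = n / len(hyp)
    bp = min(1.0, math.exp(1 - len(ref) / len(hyp)))
    return nkt * precision**alpha * bp**beta
```

Because the rank-correlation term dominates, a hypothesis with every content word present but in reversed order scores near zero, which is exactly the failure mode BLEU's local n-grams underpenalize.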


AMFM


All entries are for the ja-en task. Only the moses-tokenizer AMFM score is shown; the juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, and indic-tokenizer columns held no scores (reported as 0.000000 or empty) for this task.

# | Team | Date/Time | DataID | AMFM | Method | Other Resources | System Description
--|------|-----------|--------|------|--------|-----------------|-------------------
1 | NAIST | 2015/08/14 17:46:43 | 655 | 0.609430 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
2 | NAIST | 2015/08/24 23:53:53 | 757 | 0.606600 | SMT | No | Travatar System with NeuralMT Reranking
3 | TOSHIBA | 2015/07/23 15:00:12 | 506 | 0.604760 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
4 | NAIST | 2014/07/19 01:04:48 | 46 | 0.604180 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
5 | NAIST | 2014/07/31 11:40:53 | 119 | 0.603490 | SMT | No | Travatar-based Forest-to-String SMT System
6 | Kyoto-U | 2015/08/30 13:02:18 | 829 | 0.603210 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
7 | NAIST | 2014/08/01 17:35:16 | 125 | 0.602780 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
8 | NAIST | 2015/08/25 13:03:48 | 767 | 0.600000 | SMT | No | Travatar System Baseline
9 | NTT | 2017/08/01 15:53:15 | 1732 | 0.599920 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
10 | NAIST | 2015/08/25 13:02:45 | 766 | 0.599800 | SMT | No | Travatar System with Parser Self Training
11 | NTT | 2017/08/01 04:29:02 | 1681 | 0.597860 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
12 | TOSHIBA | 2015/07/28 16:44:27 | 529 | 0.597830 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
13 | NTT | 2017/07/30 20:43:07 | 1616 | 0.597620 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
14 | NTT | 2017/08/01 14:50:51 | 1724 | 0.597470 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
15 | Kyoto-U | 2015/08/27 14:40:32 | 796 | 0.596430 | EBMT | No | KyotoEBMT system without reranking
16 | TMU | 2017/08/04 11:16:36 | 1750 | 0.596360 | NMT | No | beam_size: 10, ensemble of different dropout rates.
17 | NICT-2 | 2016/08/05 17:50:25 | 1104 | 0.595930 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
18 | ORGANIZER | 2017/08/02 01:03:08 | 1736 | 0.595580 | NMT | No | Google's "Attention Is All You Need"
19 | TMU | 2017/08/01 11:35:08 | 1703 | 0.595260 | NMT | No | baseline system with beam20
20 | Kyoto-U | 2016/08/19 01:31:01 | 1189 | 0.595240 | EBMT | No | KyotoEBMT 2016 w/o reranking
21 | NAIST | 2016/08/27 00:05:38 | 1275 | 0.594150 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
22 | Kyoto-U | 2014/08/19 10:24:06 | 136 | 0.593970 | EBMT | No | Our baseline system using 3M parallel sentences.
23 | Kyoto-U | 2014/08/31 23:36:50 | 256 | 0.593660 | EBMT | No | Our new baseline system after several modifications.
24 | ORGANIZER | 2014/07/11 19:59:55 | 9 | 0.593410 | SMT | No | String-to-Tree SMT (2014)
25 | ORGANIZER | 2015/09/10 13:41:03 | 877 | 0.593410 | SMT | No | String-to-Tree SMT (2015)
26 | Sense | 2015/08/28 19:25:25 | 822 | 0.592040 | SMT | No | Baseline-2015 (train1 only)
27 | Kyoto-U | 2017/08/01 16:38:21 | 1733 | 0.591160 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
28 | TMU | 2015/08/04 16:32:20 | 578 | 0.590980 | SMT | No | Our PBSMT baseline (2015)
29 | ORGANIZER | 2014/07/11 19:49:57 | 6 | 0.590950 | SMT | No | Phrase-based SMT
30 | ORGANIZER | 2014/07/11 19:45:32 | 2 | 0.588880 | SMT | No | Hierarchical Phrase-based SMT (2014)
31 | Kyoto-U | 2014/09/01 10:27:54 | 262 | 0.588480 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
32 | TMU | 2017/08/01 12:14:41 | 1712 | 0.588360 | NMT | No | the ensemble system of different dropout rate.
33 | NICT | 2015/07/16 13:27:58 | 488 | 0.587530 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
34 | Sense | 2014/08/23 05:34:03 | 164 | 0.587520 | SMT | No | Paraphrase max10
35 | NAIST | 2016/08/09 16:14:05 | 1122 | 0.587450 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
36 | TMU | 2017/08/01 11:10:49 | 1695 | 0.585710 | NMT | No | our baseline system in 2017
37 | Kyoto-U | 2017/08/01 13:42:25 | 1717 | 0.585540 | NMT | No | Ensemble of 4 BPE averaged parameters
38 | ORGANIZER | 2016/11/16 10:29:51 | 1333 | 0.584390 | NMT | Yes | Online D (2016/11/14)
39 | CUNI | 2017/07/31 22:30:52 | 1665 | 0.583780 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
40 | NII | 2014/09/02 11:42:01 | 271 | 0.582800 | SMT | No | Our Baseline
41 | Sense | 2015/09/01 17:42:58 | 861 | 0.582370 | SMT | Yes | Pervasive JSTx1
42 | Sense | 2015/08/18 21:58:08 | 708 | 0.581540 | SMT | Yes | Pervasive JSTx1
43 | TMU | 2014/09/09 19:14:42 | 307 | 0.580500 | SMT | No | Our baseline system
44 | Sense | 2015/09/01 17:42:28 | 860 | 0.579790 | SMT | Yes | Passive JSTx1
45 | Sense | 2015/08/29 04:32:36 | 824 | 0.579280 | SMT | No | Baseline-2015 (train123)
46 | Sense | 2015/08/18 21:54:39 | 702 | 0.579210 | SMT | Yes | Passive JSTx1
47 | TMU | 2014/09/07 23:32:49 | 301 | 0.578360 | SMT | No | Our baseline system with another preordering method
48 | NICT-2 | 2017/07/26 14:04:38 | 1480 | 0.578150 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
49 | EIWA | 2014/07/30 16:07:14 | 116 | 0.576540 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
50 | NICT | 2015/07/17 11:02:10 | 492 | 0.576510 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
51 | NICT-2 | 2017/07/26 13:54:28 | 1476 | 0.574810 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
52 | NII | 2014/09/02 11:42:53 | 272 | 0.574030 | SMT | No | Our Baseline with Preordering
53 | NAIST | 2016/08/20 15:33:12 | 1247 | 0.571360 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
54 | TMU | 2015/09/01 05:46:50 | 847 | 0.570430 | SMT | No | PBSMT with dependency based phrase segmentation
55 | TMU | 2016/08/20 07:39:02 | 1222 | 0.565270 | NMT | No | 2016 our proposed method to control output voice
56 | Sense | 2015/07/28 22:23:43 | 530 | 0.564610 | SMT | No | Baseline-2015
57 | ORGANIZER | 2016/07/26 11:37:38 | 1042 | 0.564270 | Other | Yes | Online D (2016)
58 | ORGANIZER | 2014/07/18 11:08:13 | 35 | 0.564170 | Other | Yes | Online D (2014)
59 | NICT | 2015/07/17 08:51:45 | 489 | 0.562920 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
60 | Kyoto-U | 2016/08/20 15:07:47 | 1246 | 0.562650 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
61 | ORGANIZER | 2015/08/25 18:57:25 | 775 | 0.562270 | Other | Yes | Online D (2015)
62 | ORGANIZER | 2014/07/21 11:53:48 | 76 | 0.561620 | Other | Yes | RBMT E
63 | TMU | 2014/09/07 23:28:04 | 300 | 0.561450 | SMT | No | Our baseline system with preordering method
64 | Kyoto-U | 2016/08/18 15:17:05 | 1182 | 0.558540 | NMT | No | Ensemble of 4 single-layer model (30k voc)
65 | ORGANIZER | 2014/07/21 11:57:08 | 79 | 0.556840 | Other | Yes | RBMT F
66 | TOSHIBA | 2014/08/29 18:47:44 | 240 | 0.552980 | RBMT | Yes | RBMT system
67 | TOSHIBA | 2014/08/29 18:48:24 | 241 | 0.551740 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
68 | ORGANIZER | 2014/07/23 14:52:31 | 96 | 0.551690 | Other | Yes | RBMT D (2014)
69 | ORGANIZER | 2015/09/10 14:38:13 | 887 | 0.551690 | Other | Yes | RBMT D (2015)
70 | TMU | 2016/08/20 14:31:48 | 1234 | 0.546880 | NMT | No | 6 ensemble
71 | bjtu_nlp | 2016/08/17 19:51:24 | 1168 | 0.505730 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
72 | ORGANIZER | 2014/07/22 11:22:40 | 87 | 0.466480 | Other | Yes | Online C (2014)
73 | ORGANIZER | 2015/09/11 10:54:33 | 892 | 0.453370 | Other | Yes | Online C (2015)


HUMAN (WAT2017)


All entries are for the ja-en task.

# | Team | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
--|------|-----------|--------|-------|--------|-----------------|-------------------
1 | Kyoto-U | 2017/08/01 13:42:25 | 1717 | 77.750 | NMT | No | Ensemble of 4 BPE averaged parameters
2 | NTT | 2017/08/01 04:29:02 | 1681 | 77.250 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
3 | ORGANIZER | 2017/08/02 01:03:08 | 1736 | 75.250 | NMT | No | Google's "Attention Is All You Need"
4 | NTT | 2017/07/30 20:43:07 | 1616 | 75.000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
5 | Kyoto-U | 2017/08/01 16:38:21 | 1733 | 74.500 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
6 | NICT-2 | 2017/07/26 14:04:38 | 1480 | 69.750 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
7 | NICT-2 | 2017/07/26 13:54:28 | 1476 | 68.750 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
8 | CUNI | 2017/07/31 22:30:52 | 1665 | 66.000 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
9 | TMU | 2017/08/01 11:35:08 | 1703 | 61.000 | NMT | No | baseline system with beam20
10 | TMU | 2017/08/01 11:10:49 | 1695 | 56.750 | NMT | No | our baseline system in 2017
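The HUMAN column comes from WAT's pairwise crowdsourced evaluation: each system's translations are judged sentence-by-sentence against a baseline system's output, and the judgments are aggregated into a score between -100 and +100. A minimal sketch of that aggregation, assuming the commonly described formula 100 * (wins - losses) / (wins + losses + ties) (the exact judge-aggregation details vary by WAT year, so treat this as illustrative):

```python
def pairwise_score(wins, losses, ties):
    """WAT-style pairwise HUMAN score: 100 * (W - L) / (W + L + T).
    -100 means always judged worse than the baseline, +100 always better,
    and the baseline system itself sits near 0."""
    total = wins + losses + ties
    if total == 0:
        raise ValueError("no judgments")
    return 100.0 * (wins - losses) / total

print(pairwise_score(120, 40, 40))  # -> 40.0
```

This is why negative HUMAN scores appear in the WAT2015 and WAT2014 tables below: those systems lost to the baseline more often than they won.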


HUMAN (WAT2016)


All entries are for the ja-en task.

# | Team | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
--|------|-----------|--------|-------|--------|-----------------|-------------------
1 | ORGANIZER | 2016/11/16 10:29:51 | 1333 | 63.000 | NMT | Yes | Online D (2016/11/14)
2 | NAIST | 2016/08/09 16:14:05 | 1122 | 48.250 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
3 | NAIST | 2016/08/20 15:33:12 | 1247 | 47.500 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
4 | Kyoto-U | 2016/08/20 15:07:47 | 1246 | 47.000 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
5 | Kyoto-U | 2016/08/18 15:17:05 | 1182 | 44.250 | NMT | No | Ensemble of 4 single-layer model (30k voc)
6 | ORGANIZER | 2016/07/26 11:37:38 | 1042 | 28.000 | Other | Yes | Online D (2016)
7 | TMU | 2016/08/20 14:31:48 | 1234 | 25.000 | NMT | No | 6 ensemble
8 | bjtu_nlp | 2016/08/17 19:51:24 | 1168 | 19.250 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
9 | TMU | 2016/08/20 07:39:02 | 1222 | 16.000 | NMT | No | 2016 our proposed method to control output voice
10 | NICT-2 | 2016/08/05 17:50:25 | 1104 | Underway | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM


HUMAN (WAT2015)


All entries are for the ja-en task.

# | Team | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
--|------|-----------|--------|-------|--------|-----------------|-------------------
1 | NAIST | 2015/08/14 17:46:43 | 655 | 35.500 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
2 | Kyoto-U | 2015/08/30 13:02:18 | 829 | 32.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
3 | TOSHIBA | 2015/07/28 16:44:27 | 529 | 25.000 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
4 | TOSHIBA | 2015/07/23 15:00:12 | 506 | 21.250 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
5 | ORGANIZER | 2015/09/10 14:38:13 | 887 | 16.750 | Other | Yes | RBMT D (2015)
6 | Kyoto-U | 2015/08/27 14:40:32 | 796 | 16.500 | EBMT | No | KyotoEBMT system without reranking
7 | NICT | 2015/07/16 13:27:58 | 488 | 16.000 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
8 | NAIST | 2015/08/25 13:02:45 | 766 | 11.750 | SMT | No | Travatar System with Parser Self Training
9 | ORGANIZER | 2015/09/10 13:41:03 | 877 | 7.000 | SMT | No | String-to-Tree SMT (2015)
10 | NICT | 2015/07/17 11:02:10 | 492 | 6.500 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
11 | ORGANIZER | 2015/08/25 18:57:25 | 775 | 0.250 | Other | Yes | Online D (2015)
12 | Sense | 2015/09/01 17:42:28 | 860 | -7.750 | SMT | Yes | Passive JSTx1
13 | Sense | 2015/09/01 17:42:58 | 861 | -12.750 | SMT | Yes | Pervasive JSTx1
14 | TMU | 2015/09/01 05:46:50 | 847 | -25.500 | SMT | No | PBSMT with dependency based phrase segmentation


HUMAN (WAT2014)


All entries are for the ja-en task.

# | Team | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
--|------|-----------|--------|-------|--------|-----------------|-------------------
1 | NAIST | 2014/07/19 01:04:48 | 46 | 40.500 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
2 | NAIST | 2014/07/31 11:40:53 | 119 | 37.500 | SMT | No | Travatar-based Forest-to-String SMT System
3 | ORGANIZER | 2014/07/11 19:59:55 | 9 | 25.500 | SMT | No | String-to-Tree SMT (2014)
4 | Kyoto-U | 2014/09/01 10:27:54 | 262 | 25.000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
5 | TOSHIBA | 2014/08/29 18:48:24 | 241 | 23.250 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
6 | ORGANIZER | 2014/07/23 14:52:31 | 96 | 23.000 | Other | Yes | RBMT D (2014)
7 | EIWA | 2014/07/30 16:07:14 | 116 | 22.500 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
8 | Kyoto-U | 2014/08/31 23:36:50 | 256 | 21.250 | EBMT | No | Our new baseline system after several modifications.
9 | TOSHIBA | 2014/08/29 18:47:44 | 240 | 20.250 | RBMT | Yes | RBMT system
10 | ORGANIZER | 2014/07/18 11:08:13 | 35 | 13.750 | Other | Yes | Online D (2014)
11 | ORGANIZER | 2014/07/11 19:45:32 | 2 | 7.750 | SMT | No | Hierarchical Phrase-based SMT (2014)
12 | Sense | 2014/08/23 05:34:03 | 164 | 1.250 | SMT | No | Paraphrase max10
13 | NII | 2014/09/02 11:42:01 | 271 | -5.750 | SMT | No | Our Baseline
14 | NII | 2014/09/02 11:42:53 | 272 | -14.250 | SMT | No | Our Baseline with Preordering
15 | TMU | 2014/09/07 23:32:49 | 301 | -17.000 | SMT | No | Our baseline system with another preordering method
16 | TMU | 2014/09/07 23:28:04 | 300 | -17.250 | SMT | No | Our baseline system with preordering method


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores of your system (both automatic and human evaluations), and its rank among the other systems. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may link (via URL) to the WAT evaluation result pages.

JST (Japan Science and Technology Agency)
NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2017-07-05