
WAT

The Workshop on Asian Translation
Evaluation Results

[EVALUATION RESULTS TOP] | [BLEU] | [RIBES] | [AMFM] | [HUMAN (WAT2016)] | [HUMAN (WAT2015)] | [HUMAN (WAT2014)] | [EVALUATION RESULTS USAGE POLICY]

BLEU


All entries below are for the ja-en task; the BLEU scores shown are those reported under the moses-tokenizer column. The remaining tokenizer columns (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer) contained no scores for this task (blank or 0.00) and are omitted.

# | Team | Task | Date/Time | BLEU | Method | Other Resources | System Description
1 | NAIST | ja-en | 2016/08/27 00:05:38 | 27.55 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
2 | NAIST | ja-en | 2016/08/09 16:14:05 | 26.39 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
3 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 26.22 | NMT | No | Ensemble of 4 single-layer model (30k voc)
4 | NAIST | ja-en | 2016/08/20 15:33:12 | 26.12 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
5 | NAIST | ja-en | 2015/08/14 17:46:43 | 25.41 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
6 | NAIST | ja-en | 2015/08/24 23:53:53 | 24.77 | SMT | No | Travatar System with NeuralMT Reranking
7 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 24.71 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
8 | NAIST | ja-en | 2014/07/19 01:04:48 | 23.82 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
9 | NAIST | ja-en | 2014/08/01 17:35:16 | 23.47 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
10 | NAIST | ja-en | 2014/07/31 11:40:53 | 23.29 | SMT | No | Travatar-based Forest-to-String SMT System
11 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 23.00 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
12 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 22.89 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
13 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 22.89 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
14 | NAIST | ja-en | 2015/08/25 13:02:45 | 22.62 | SMT | No | Travatar System with Parser Self Training
15 | NAIST | ja-en | 2015/08/25 13:03:48 | 22.16 | SMT | No | Travatar System Baseline
16 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 22.04 | NMT | Yes | Online D (2016/11/14)
17 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 21.54 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
18 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 21.31 | EBMT | No | KyotoEBMT system without reranking
19 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 21.22 | EBMT | No | KyotoEBMT 2016 w/o reranking
20 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 21.07 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
21 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 20.61 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
22 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 20.60 | EBMT | No | Our new baseline system after several modifications.
23 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 20.36 | SMT | No | String-to-Tree SMT (2014)
24 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 20.36 | SMT | No | String-to-Tree SMT (2015)
25 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 20.02 | EBMT | No | Our baseline system using 3M parallel sentences.
26 | EIWA | ja-en | 2014/07/30 16:07:14 | 19.86 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
27 | NICT | ja-en | 2015/07/16 13:27:58 | 18.98 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
28 | NICT | ja-en | 2015/07/17 11:02:10 | 18.96 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
29 | Sense | ja-en | 2014/08/23 05:34:03 | 18.82 | SMT | No | Paraphrase max10
30 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 18.72 | SMT | No | Hierarchical Phrase-based SMT (2014)
31 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 18.45 | SMT | No | Phrase-based SMT
32 | TMU | ja-en | 2016/08/20 14:31:48 | 18.45 | NMT | No | 6 ensemble
33 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 18.34 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
34 | TMU | ja-en | 2015/08/04 16:32:20 | 18.32 | SMT | No | Our PBSMT baseline (2015)
35 | TMU | ja-en | 2016/08/20 07:39:02 | 18.29 | NMT | No | 2016 our proposed method to control output voice
36 | Sense | ja-en | 2015/08/28 19:25:25 | 18.20 | SMT | No | Baseline-2015 (train1 only)
37 | NICT | ja-en | 2015/07/17 08:51:45 | 18.09 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
38 | Sense | ja-en | 2015/08/29 04:32:36 | 18.09 | SMT | No | Baseline-2015 (train123)
39 | NII | ja-en | 2014/09/02 11:42:01 | 17.47 | SMT | No | Our Baseline
40 | Sense | ja-en | 2015/07/28 22:23:43 | 17.04 | SMT | No | Baseline-2015
41 | NII | ja-en | 2014/09/02 11:42:53 | 17.01 | SMT | No | Our Baseline with Preordering
42 | Sense | ja-en | 2015/09/01 17:42:28 | 16.96 | SMT | Yes | Passive JSTx1
43 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 16.91 | Other | Yes | Online D (2016)
44 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 16.85 | Other | Yes | Online D (2015)
45 | Sense | ja-en | 2015/08/18 21:54:39 | 16.72 | SMT | Yes | Passive JSTx1
46 | Sense | ja-en | 2015/09/01 17:42:58 | 16.61 | SMT | Yes | Pervasive JSTx1
47 | Sense | ja-en | 2015/08/18 21:58:08 | 16.49 | SMT | Yes | Pervasive JSTx1
48 | TMU | ja-en | 2014/09/07 23:32:49 | 15.95 | SMT | No | Our baseline system with another preordering method
49 | TMU | ja-en | 2015/09/01 05:46:50 | 15.85 | SMT | No | PBSMT with dependency based phrase segmentation
50 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 15.69 | RBMT | Yes | RBMT system
51 | TMU | ja-en | 2014/09/07 23:28:04 | 15.55 | SMT | No | Our baseline system with preordering method
52 | TMU | ja-en | 2014/09/09 19:14:42 | 15.40 | SMT | No | Our baseline system
53 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 15.29 | Other | Yes | RBMT D (2014)
54 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 15.29 | Other | Yes | RBMT D (2015)
55 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 15.08 | Other | Yes | Online D (2014)
56 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 14.82 | Other | Yes | RBMT E
57 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 13.86 | Other | Yes | RBMT F
58 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 10.64 | Other | Yes | Online C (2014)
59 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 10.29 | Other | Yes | Online C (2015)
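For readers unfamiliar with the metric, corpus-level BLEU combines clipped n-gram precisions with a brevity penalty. The sketch below is an illustrative single-reference reimplementation, not the official WAT scoring pipeline (which first tokenizes with the tools named above and uses its own scripts); function names are my own.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Counter of all n-grams (as tuples) in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hyps, refs, max_n=4):
    """Single-reference corpus BLEU (0-100), Papineni et al. (2002)."""
    matches = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # hypothesis n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hyps, refs):
        hyp_len += len(hyp)
        ref_len += len(ref)
        for n in range(1, max_n + 1):
            h, r = ngrams(hyp, n), ngrams(ref, n)
            matches[n - 1] += sum(min(c, r[g]) for g, c in h.items())
            totals[n - 1] += max(len(hyp) - n + 1, 0)
    if min(matches) == 0:
        return 0.0  # any zero precision drives the geometric mean to zero
    log_prec = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

A hypothesis identical to its reference scores 100; shorter or divergent output is penalized by the precision terms and the brevity penalty.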


RIBES


All entries below are for the ja-en task; the RIBES scores shown are those reported under the moses-tokenizer column. The remaining tokenizer columns (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer) contained no scores for this task (blank or 0.000000) and are omitted.

# | Team | Task | Date/Time | RIBES | Method | Other Resources | System Description
1 | NAIST | ja-en | 2016/08/27 00:05:38 | 0.767661 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
2 | NAIST | ja-en | 2016/08/09 16:14:05 | 0.762712 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
3 | NAIST | ja-en | 2016/08/20 15:33:12 | 0.756956 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
4 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 0.756601 | NMT | No | Ensemble of 4 single-layer model (30k voc)
5 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 0.750802 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
6 | NAIST | ja-en | 2015/08/14 17:46:43 | 0.749573 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
7 | NAIST | ja-en | 2015/08/24 23:53:53 | 0.743771 | SMT | No | Travatar System with NeuralMT Reranking
8 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 0.733483 | NMT | Yes | Online D (2016/11/14)
9 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 0.724555 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
10 | NAIST | ja-en | 2014/08/01 17:35:16 | 0.723670 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
11 | NAIST | ja-en | 2014/07/31 11:40:53 | 0.723541 | SMT | No | Travatar-based Forest-to-String SMT System
12 | NAIST | ja-en | 2015/08/25 13:02:45 | 0.722798 | SMT | No | Travatar System with Parser Self Training
13 | NAIST | ja-en | 2014/07/19 01:04:48 | 0.722599 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
14 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 0.718540 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
15 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 0.715795 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
16 | NAIST | ja-en | 2015/08/25 13:03:48 | 0.713083 | SMT | No | Travatar System Baseline
17 | TMU | ja-en | 2016/08/20 14:31:48 | 0.711542 | NMT | No | 6 ensemble
18 | TMU | ja-en | 2016/08/20 07:39:02 | 0.710613 | NMT | No | 2016 our proposed method to control output voice
19 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 0.708808 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
20 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 0.707936 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
21 | EIWA | ja-en | 2014/07/30 16:07:14 | 0.706686 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
22 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 0.706480 | EBMT | No | KyotoEBMT system without reranking
23 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 0.705700 | EBMT | No | KyotoEBMT 2016 w/o reranking
24 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 0.701154 | EBMT | No | Our new baseline system after several modifications.
25 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 0.698953 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
26 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 0.690455 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
27 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 0.689829 | EBMT | No | Our baseline system using 3M parallel sentences.
28 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 0.687122 | RBMT | Yes | RBMT system
29 | NICT | ja-en | 2015/07/17 11:02:10 | 0.684485 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
30 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 0.683378 | Other | Yes | RBMT D (2014)
31 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 0.683378 | Other | Yes | RBMT D (2015)
32 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 0.678253 | SMT | No | String-to-Tree SMT (2014)
33 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 0.678253 | SMT | No | String-to-Tree SMT (2015)
34 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 0.677412 | Other | Yes | Online D (2016)
35 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 0.676609 | Other | Yes | Online D (2015)
36 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 0.663851 | Other | Yes | RBMT E
37 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 0.661387 | Other | Yes | RBMT F
38 | NICT | ja-en | 2015/07/16 13:27:58 | 0.659883 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
39 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 0.651066 | SMT | No | Hierarchical Phrase-based SMT (2014)
40 | TMU | ja-en | 2014/09/07 23:32:49 | 0.648879 | SMT | No | Our baseline system with another preordering method
41 | Sense | ja-en | 2014/08/23 05:34:03 | 0.646204 | SMT | No | Paraphrase max10
42 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 0.645137 | SMT | No | Phrase-based SMT
43 | TMU | ja-en | 2014/09/07 23:28:04 | 0.644698 | SMT | No | Our baseline system with preordering method
44 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 0.643588 | Other | Yes | Online D (2014)
45 | TMU | ja-en | 2015/08/04 16:32:20 | 0.641456 | SMT | No | Our PBSMT baseline (2015)
46 | NICT | ja-en | 2015/07/17 08:51:45 | 0.639711 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
47 | Sense | ja-en | 2015/08/29 04:32:36 | 0.633073 | SMT | No | Baseline-2015 (train123)
48 | NII | ja-en | 2014/09/02 11:42:01 | 0.630825 | SMT | No | Our Baseline
49 | Sense | ja-en | 2015/08/28 19:25:25 | 0.629066 | SMT | No | Baseline-2015 (train1 only)
50 | TMU | ja-en | 2015/09/01 05:46:50 | 0.628897 | SMT | No | PBSMT with dependency based phrase segmentation
51 | Sense | ja-en | 2015/07/28 22:23:43 | 0.627006 | SMT | No | Baseline-2015
52 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 0.624827 | Other | Yes | Online C (2014)
53 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 0.622564 | Other | Yes | Online C (2015)
54 | TMU | ja-en | 2014/09/09 19:14:42 | 0.613119 | SMT | No | Our baseline system
55 | NII | ja-en | 2014/09/02 11:42:53 | 0.610833 | SMT | No | Our Baseline with Preordering
56 | Sense | ja-en | 2015/09/01 17:42:28 | 0.610775 | SMT | Yes | Passive JSTx1
57 | Sense | ja-en | 2015/08/18 21:54:39 | 0.609632 | SMT | Yes | Passive JSTx1
58 | Sense | ja-en | 2015/09/01 17:42:58 | 0.609008 | SMT | Yes | Pervasive JSTx1
59 | Sense | ja-en | 2015/08/18 21:58:08 | 0.600806 | SMT | Yes | Pervasive JSTx1
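RIBES (Isozaki et al., 2010) rewards correct word order: it scales a normalized Kendall's tau over the reference positions of matched hypothesis words by a unigram precision term and a brevity penalty. The sketch below handles only the simple case where every reference word is unique; the official RIBES tool disambiguates repeated words using surrounding context, so this is an illustration of the core idea, not a reimplementation.

```python
import math
from itertools import combinations

def ribes_sketch(hyp, ref, alpha=0.25, beta=0.10):
    """Simplified sentence-level RIBES; assumes each reference word is unique."""
    ref_pos = {w: i for i, w in enumerate(ref)}
    # Reference positions of hypothesis words found in the reference,
    # in hypothesis order.
    ranks = [ref_pos[w] for w in hyp if w in ref_pos]
    if len(ranks) < 2:
        return 0.0
    pairs = list(combinations(range(len(ranks)), 2))
    concordant = sum(1 for i, j in pairs if ranks[i] < ranks[j])
    nkt = concordant / len(pairs)        # normalized Kendall's tau in [0, 1]
    precision = len(ranks) / len(hyp)    # unigram precision
    bp = min(1.0, math.exp(1 - len(ref) / len(hyp)))  # brevity penalty
    return nkt * (precision ** alpha) * (bp ** beta)
```

A hypothesis in the same order as the reference scores 1.0; a fully reversed one scores 0.0, which is why RIBES correlates well with human judgments on distant language pairs such as Japanese-English.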


AMFM


All entries below are for the ja-en task; the AMFM scores shown are those reported under the moses-tokenizer column. The remaining tokenizer columns (juman, kytea, mecab, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer) contained no scores for this task (blank or 0.000000) and are omitted.

# | Team | Task | Date/Time | AMFM | Method | Other Resources | System Description
1 | NAIST | ja-en | 2015/08/14 17:46:43 | 0.609430 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
2 | NAIST | ja-en | 2015/08/24 23:53:53 | 0.606600 | SMT | No | Travatar System with NeuralMT Reranking
3 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 0.604760 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
4 | NAIST | ja-en | 2014/07/19 01:04:48 | 0.604180 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
5 | NAIST | ja-en | 2014/07/31 11:40:53 | 0.603490 | SMT | No | Travatar-based Forest-to-String SMT System
6 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 0.603210 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
7 | NAIST | ja-en | 2014/08/01 17:35:16 | 0.602780 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
8 | NAIST | ja-en | 2015/08/25 13:03:48 | 0.600000 | SMT | No | Travatar System Baseline
9 | NAIST | ja-en | 2015/08/25 13:02:45 | 0.599800 | SMT | No | Travatar System with Parser Self Training
10 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 0.597830 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
11 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 0.596430 | EBMT | No | KyotoEBMT system without reranking
12 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 0.595930 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
13 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 0.595240 | EBMT | No | KyotoEBMT 2016 w/o reranking
14 | NAIST | ja-en | 2016/08/27 00:05:38 | 0.594150 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
15 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 0.593970 | EBMT | No | Our baseline system using 3M parallel sentences.
16 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 0.593660 | EBMT | No | Our new baseline system after several modifications.
17 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 0.593410 | SMT | No | String-to-Tree SMT (2014)
18 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 0.593410 | SMT | No | String-to-Tree SMT (2015)
19 | Sense | ja-en | 2015/08/28 19:25:25 | 0.592040 | SMT | No | Baseline-2015 (train1 only)
20 | TMU | ja-en | 2015/08/04 16:32:20 | 0.590980 | SMT | No | Our PBSMT baseline (2015)
21 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 0.590950 | SMT | No | Phrase-based SMT
22 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 0.588880 | SMT | No | Hierarchical Phrase-based SMT (2014)
23 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 0.588480 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
24 | NICT | ja-en | 2015/07/16 13:27:58 | 0.587530 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
25 | Sense | ja-en | 2014/08/23 05:34:03 | 0.587520 | SMT | No | Paraphrase max10
26 | NAIST | ja-en | 2016/08/09 16:14:05 | 0.587450 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
27 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 0.584390 | NMT | Yes | Online D (2016/11/14)
28 | NII | ja-en | 2014/09/02 11:42:01 | 0.582800 | SMT | No | Our Baseline
29 | Sense | ja-en | 2015/09/01 17:42:58 | 0.582370 | SMT | Yes | Pervasive JSTx1
30 | Sense | ja-en | 2015/08/18 21:58:08 | 0.581540 | SMT | Yes | Pervasive JSTx1
31 | TMU | ja-en | 2014/09/09 19:14:42 | 0.580500 | SMT | No | Our baseline system
32 | Sense | ja-en | 2015/09/01 17:42:28 | 0.579790 | SMT | Yes | Passive JSTx1
33 | Sense | ja-en | 2015/08/29 04:32:36 | 0.579280 | SMT | No | Baseline-2015 (train123)
34 | Sense | ja-en | 2015/08/18 21:54:39 | 0.579210 | SMT | Yes | Passive JSTx1
35 | TMU | ja-en | 2014/09/07 23:32:49 | 0.578360 | SMT | No | Our baseline system with another preordering method
36 | EIWA | ja-en | 2014/07/30 16:07:14 | 0.576540 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
37 | NICT | ja-en | 2015/07/17 11:02:10 | 0.576510 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
38 | NII | ja-en | 2014/09/02 11:42:53 | 0.574030 | SMT | No | Our Baseline with Preordering
39 | NAIST | ja-en | 2016/08/20 15:33:12 | 0.571360 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
40 | TMU | ja-en | 2015/09/01 05:46:50 | 0.570430 | SMT | No | PBSMT with dependency based phrase segmentation
41 | TMU | ja-en | 2016/08/20 07:39:02 | 0.565270 | NMT | No | 2016 our proposed method to control output voice
42 | Sense | ja-en | 2015/07/28 22:23:43 | 0.564610 | SMT | No | Baseline-2015
43 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 0.564270 | Other | Yes | Online D (2016)
44 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 0.564170 | Other | Yes | Online D (2014)
45 | NICT | ja-en | 2015/07/17 08:51:45 | 0.562920 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
46 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 0.562650 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
47 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 0.562270 | Other | Yes | Online D (2015)
48 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 0.561620 | Other | Yes | RBMT E
49 | TMU | ja-en | 2014/09/07 23:28:04 | 0.561450 | SMT | No | Our baseline system with preordering method
50 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 0.558540 | NMT | No | Ensemble of 4 single-layer model (30k voc)
51 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 0.556840 | Other | Yes | RBMT F
52 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 0.552980 | RBMT | Yes | RBMT system
53 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 0.551740 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
54 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 0.551690 | Other | Yes | RBMT D (2014)
55 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 0.551690 | Other | Yes | RBMT D (2015)
56 | TMU | ja-en | 2016/08/20 14:31:48 | 0.546880 | NMT | No | 6 ensemble
57 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 0.505730 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
58 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 0.466480 | Other | Yes | Online C (2014)
59 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 0.453370 | Other | Yes | Online C (2015)
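AM-FM combines an adequacy component (AM), which compares hypothesis and reference in a latent semantic space, with a fluency component (FM) based on an n-gram language model. As a rough intuition for the adequacy side only, the toy sketch below scores surface overlap as cosine similarity between bag-of-words count vectors; the actual metric projects the vectors into a learned latent space, so this is only a crude proxy, with names of my own choosing.

```python
import math
from collections import Counter

def cosine_bow(hyp, ref):
    """Cosine similarity between bag-of-words count vectors in [0, 1].

    A surface-level stand-in for the adequacy (AM) idea: 1.0 means
    identical word distributions, 0.0 means no words in common.
    """
    h, r = Counter(hyp), Counter(ref)
    dot = sum(h[w] * r[w] for w in h)
    norm = (math.sqrt(sum(c * c for c in h.values()))
            * math.sqrt(sum(c * c for c in r.values())))
    return dot / norm if norm else 0.0
```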


HUMAN (WAT2016)


# | Team | Task | Date/Time | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2016/08/09 16:14:05 | 48.250 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
2 | NAIST | ja-en | 2016/08/20 15:33:12 | 47.500 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
3 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 47.000 | NMT | No | voc src:200k voc tgt: 52k + BPE 2-layer self-ensembling
4 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 44.250 | NMT | No | Ensemble of 4 single-layer model (30k voc)
5 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 28.000 | Other | Yes | Online D (2016)
6 | TMU | ja-en | 2016/08/20 14:31:48 | 25.000 | NMT | No | 6 ensemble
7 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 19.250 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
8 | TMU | ja-en | 2016/08/20 07:39:02 | 16.000 | NMT | No | 2016 our proposed method to control output voice
9 | NICT-2 | ja-en | 2016/08/05 17:50:25 | Underway | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM


HUMAN (WAT2015)


# | Team | Task | Date/Time | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2015/08/14 17:46:43 | 35.500 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
2 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 32.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
3 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 25.000 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
4 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 21.250 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model
5 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 16.750 | Other | Yes | RBMT D (2015)
6 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 16.500 | EBMT | No | KyotoEBMT system without reranking
7 | NICT | ja-en | 2015/07/16 13:27:58 | 16.000 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
8 | NAIST | ja-en | 2015/08/25 13:02:45 | 11.750 | SMT | No | Travatar System with Parser Self Training
9 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 7.000 | SMT | No | String-to-Tree SMT (2015)
10 | NICT | ja-en | 2015/07/17 11:02:10 | 6.500 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
11 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 0.250 | Other | Yes | Online D (2015)
12 | Sense | ja-en | 2015/09/01 17:42:28 | -7.750 | SMT | Yes | Passive JSTx1
13 | Sense | ja-en | 2015/09/01 17:42:58 | -12.750 | SMT | Yes | Pervasive JSTx1
14 | TMU | ja-en | 2015/09/01 05:46:50 | -25.500 | SMT | No | PBSMT with dependency based phrase segmentation


HUMAN (WAT2014)


# | Team | Task | Date/Time | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2014/07/19 01:04:48 | 40.500 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
2 | NAIST | ja-en | 2014/07/31 11:40:53 | 37.500 | SMT | No | Travatar-based Forest-to-String SMT System
3 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 25.500 | SMT | No | String-to-Tree SMT (2014)
4 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 25.000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
5 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 23.250 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system
6 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 23.000 | Other | Yes | RBMT D (2014)
7 | EIWA | ja-en | 2014/07/30 16:07:14 | 22.500 | SMT and RBMT | Yes | Combination of RBMT and SPE(statistical post editing)
8 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 21.250 | EBMT | No | Our new baseline system after several modifications.
9 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 20.250 | RBMT | Yes | RBMT system
10 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 13.750 | Other | Yes | Online D (2014)
11 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 7.750 | SMT | No | Hierarchical Phrase-based SMT (2014)
12 | Sense | ja-en | 2014/08/23 05:34:03 | 1.250 | SMT | No | Paraphrase max10
13 | NII | ja-en | 2014/09/02 11:42:01 | -5.750 | SMT | No | Our Baseline
14 | NII | ja-en | 2014/09/02 11:42:53 | -14.250 | SMT | No | Our Baseline with Preordering
15 | TMU | ja-en | 2014/09/07 23:32:49 | -17.000 | SMT | No | Our baseline system with another preordering method
16 | TMU | ja-en | 2014/09/07 23:28:04 | -17.250 | SMT | No | Our baseline system with preordering method


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,
you may use the translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

JST (Japan Science and Technology Agency)
NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2016-07-07