
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


# | Team | Task | Date/Time | DataID | BLEU | Method | Other Resources | System Description

(The original page reports BLEU under several tokenizers/segmenters: juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, and the placeholder columns unuse, myseg, and kmseg. For this ja-en task only the moses-tokenizer column carries a score, so only that score is shown below.)

1 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 18.72 | SMT | No | Hierarchical Phrase-based SMT (2014)
2 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 18.45 | SMT | No | Phrase-based SMT
3 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 20.36 | SMT | No | String-to-Tree SMT (2014)
4 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 15.08 | Other | Yes | Online D (2014)
5 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 23.82 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
6 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 14.82 | Other | Yes | RBMT E
7 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 13.86 | Other | Yes | RBMT F
8 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 10.64 | Other | Yes | Online C (2014)
9 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 15.29 | Other | Yes | RBMT D (2014)
10 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 19.86 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
11 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 23.29 | SMT | No | Travatar-based Forest-to-String SMT System
12 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 23.47 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
13 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 20.02 | EBMT | No | Our baseline system using 3M parallel sentences.
14 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 18.82 | SMT | No | Paraphrase max10
15 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 15.69 | RBMT | Yes | RBMT system
16 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 20.61 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
17 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 20.60 | EBMT | No | Our new baseline system after several modifications.
18 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 21.07 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
19 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 17.47 | SMT | No | Our baseline
20 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 17.01 | SMT | No | Our baseline with preordering
21 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 15.55 | SMT | No | Our baseline system with preordering method
22 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 15.95 | SMT | No | Our baseline system with another preordering method
23 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 15.40 | SMT | No | Our baseline system
24 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 18.98 | SMT | No | Our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
25 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 18.09 | SMT | No | Our baseline: PB SMT in Moses (DL=20) / SRILM / MeCab (IPA)
26 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 18.96 | SMT | No | Our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
27 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 23.00 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
28 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 22.89 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
29 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 17.04 | SMT | No | Baseline-2015
30 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 18.32 | SMT | No | Our PBSMT baseline (2015)
31 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 25.41 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
32 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 16.72 | SMT | Yes | Passive JSTx1
33 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 16.49 | SMT | Yes | Pervasive JSTx1
34 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 24.77 | SMT | No | Travatar System with NeuralMT Reranking
35 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 22.62 | SMT | No | Travatar System with Parser Self Training
36 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 22.16 | SMT | No | Travatar System Baseline
37 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 16.85 | Other | Yes | Online D (2015)
38 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 21.31 | EBMT | No | KyotoEBMT system without reranking
39 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 18.20 | SMT | No | Baseline-2015 (train1 only)
40 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 18.09 | SMT | No | Baseline-2015 (train123)
41 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 22.89 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
42 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 15.85 | SMT | No | PBSMT with dependency-based phrase segmentation
43 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 16.96 | SMT | Yes | Passive JSTx1
44 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 16.61 | SMT | Yes | Pervasive JSTx1
45 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 20.36 | SMT | No | String-to-Tree SMT (2015)
46 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 15.29 | Other | Yes | RBMT D (2015)
47 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 10.29 | Other | Yes | Online C (2015)
48 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 16.91 | Other | Yes | Online D (2016)
49 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | 21.54 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
50 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 26.39 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 4 Ensemble
51 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 18.34 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
52 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 26.22 | NMT | No | Ensemble of 4 single-layer models (30k voc)
53 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | 21.22 | EBMT | No | KyotoEBMT 2016 w/o reranking
54 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 18.29 | NMT | No | Our proposed method to control output voice (2016)
55 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 18.45 | NMT | No | 6 ensemble
56 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 24.71 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer, self-ensembling
57 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 26.12 | SMT | No | Neural MT w/ Lexicon, 6 Ensemble
58 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | 27.55 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 6 Ensemble
59 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 22.04 | NMT | Yes | Online D (2016/11/14)
60 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 24.79 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
61 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 26.76 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
62 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 27.43 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
63 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 23.43 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
64 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 28.36 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
65 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 21.00 | NMT | No | Our baseline system in 2017
66 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 23.03 | NMT | No | Baseline system with beam 20
67 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | 22.87 | NMT | No | Ensemble of systems with different dropout rates
68 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 27.53 | NMT | No | Ensemble of 4 BPE, averaged parameters
69 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | 27.62 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
70 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | 28.15 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
71 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 27.66 | NMT | No | Ensemble of 4 BPE, averaged, coverage penalty
72 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 28.06 | NMT | No | Google's "Attention Is All You Need"
73 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | 24.55 | NMT | No | beam_size: 10, ensemble of different dropout rates
74 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | 26.91 | NMT | No | NMT with Attention
75 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | 28.46 | NMT | No | Transformer, average checkpoints
76 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 28.63 | NMT | No | Transformer vanilla model using 3M sentences
77 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 29.65 | NMT | No | MLNMT
78 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | 26.19 | NMT | Yes | Rewarding model
79 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | 24.94 | NMT | No | Baseline-NMT (Single)
80 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | 25.85 | NMT | No | Ensemble of 6 Baseline-NMT
81 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | 25.17 | NMT | No | GAN-NMT (Single)
82 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | 24.98 | NMT | No | Reconstructor-NMT (Single)
83 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | 13.97 | SMT | No | Preordering with neural network
84 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 30.59 | NMT | No | Transformer with relative position, ensemble of 3 models
85 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | 25.45 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN)
86 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | 29.12 | NMT | No | RSNMT 6 layer with distillation
87 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | 30.62 | NMT | No | Transformer (Big) with relative position, average checkpoints
88 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | 27.63 | NMT | No | Fully character-level; 6 bi-LSTM (512*2) for encoder, 6 LSTM for decoder; middle dense layer; beam width 4; length norm set to 0.2
89 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | 26.99 | NMT | No | RSNMT 6 layer
90 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | 28.61 | NMT | No | Transformer, single model w/ long warm-up and self-training
91 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 29.40 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
92 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 30.88 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
93 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 30.92 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking
94 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 30.56 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
95 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | 30.28 | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, SINGLE MODEL
96 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | 29.01 | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, averaged the last 10 ckpts, by Tensor2Tensor
97 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | 29.71 | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py
98 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | 28.35 | NMT | No | My NMT implementation. Beam size 8. LP 0.6

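The BLEU column is a corpus-level score on a 0-100 scale, computed on tokenized English output (the populated column on the original page corresponds to the moses-tokenizer setting). The exact scorer used by the WAT server is not stated on this page, so the following minimal sketch, using the sacrebleu package on pre-tokenized text, is only an illustration of how scores on this scale are produced:

    # Illustrative sketch only: the WAT server's exact BLEU pipeline
    # (Moses tokenization + scoring) is not specified on this page.
    import sacrebleu

    hypotheses = ["this is a test .", "another sentence ."]      # system output
    references = [["this is the test .", "another sentence ."]]  # one reference set

    # tokenize="none" because the inputs above are already tokenized.
    bleu = sacrebleu.corpus_bleu(hypotheses, references, tokenize="none")
    print(round(bleu.score, 2))  # score on the 0-100 scale used in the table above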

RIBES


# | Team | Task | Date/Time | DataID | RIBES | Method | Other Resources | System Description

(As with BLEU, the original page reports RIBES under several tokenizers/segmenters; for this ja-en task only the moses-tokenizer column carries a score, so only that score is shown below.)

1 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 0.651066 | SMT | No | Hierarchical Phrase-based SMT (2014)
2 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 0.645137 | SMT | No | Phrase-based SMT
3 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 0.678253 | SMT | No | String-to-Tree SMT (2014)
4 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 0.643588 | Other | Yes | Online D (2014)
5 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 0.722599 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
6 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 0.663851 | Other | Yes | RBMT E
7 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 0.661387 | Other | Yes | RBMT F
8 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 0.624827 | Other | Yes | Online C (2014)
9 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 0.683378 | Other | Yes | RBMT D (2014)
10 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 0.706686 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
11 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 0.723541 | SMT | No | Travatar-based Forest-to-String SMT System
12 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 0.723670 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
13 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 0.689829 | EBMT | No | Our baseline system using 3M parallel sentences.
14 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 0.646204 | SMT | No | Paraphrase max10
15 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 0.687122 | RBMT | Yes | RBMT system
16 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 0.707936 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
17 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 0.701154 | EBMT | No | Our new baseline system after several modifications.
18 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 0.698953 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
19 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 0.630825 | SMT | No | Our baseline
20 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 0.610833 | SMT | No | Our baseline with preordering
21 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 0.644698 | SMT | No | Our baseline system with preordering method
22 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 0.648879 | SMT | No | Our baseline system with another preordering method
23 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 0.613119 | SMT | No | Our baseline system
24 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 0.659883 | SMT | No | Our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
25 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 0.639711 | SMT | No | Our baseline: PB SMT in Moses (DL=20) / SRILM / MeCab (IPA)
26 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 0.684485 | SMT | No | Our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
27 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 0.715795 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
28 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 0.718540 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
29 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 0.627006 | SMT | No | Baseline-2015
30 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 0.641456 | SMT | No | Our PBSMT baseline (2015)
31 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 0.749573 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
32 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 0.609632 | SMT | Yes | Passive JSTx1
33 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 0.600806 | SMT | Yes | Pervasive JSTx1
34 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 0.743771 | SMT | No | Travatar System with NeuralMT Reranking
35 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 0.722798 | SMT | No | Travatar System with Parser Self Training
36 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 0.713083 | SMT | No | Travatar System Baseline
37 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.676609 | Other | Yes | Online D (2015)
38 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 0.706480 | EBMT | No | KyotoEBMT system without reranking
39 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 0.629066 | SMT | No | Baseline-2015 (train1 only)
40 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 0.633073 | SMT | No | Baseline-2015 (train123)
41 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 0.724555 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
42 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 0.628897 | SMT | No | PBSMT with dependency-based phrase segmentation
43 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 0.610775 | SMT | Yes | Passive JSTx1
44 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 0.609008 | SMT | Yes | Pervasive JSTx1
45 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 0.678253 | SMT | No | String-to-Tree SMT (2015)
46 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 0.683378 | Other | Yes | RBMT D (2015)
47 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 0.622564 | Other | Yes | Online C (2015)
48 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 0.677412 | Other | Yes | Online D (2016)
49 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | 0.708808 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
50 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 0.762712 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 4 Ensemble
51 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 0.690455 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
52 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 0.756601 | NMT | No | Ensemble of 4 single-layer models (30k voc)
53 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | 0.705700 | EBMT | No | KyotoEBMT 2016 w/o reranking
54 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 0.710613 | NMT | No | Our proposed method to control output voice (2016)
55 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 0.711542 | NMT | No | 6 ensemble
56 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 0.750802 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer, self-ensembling
57 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 0.756956 | SMT | No | Neural MT w/ Lexicon, 6 Ensemble
58 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | 0.767661 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 6 Ensemble
59 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 0.733483 | NMT | Yes | Online D (2016/11/14)
60 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 0.747335 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
61 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 0.741329 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
62 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 0.764831 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
63 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 0.741699 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
64 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 0.768880 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
65 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 0.725284 | NMT | No | Our baseline system in 2017
66 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 0.741175 | NMT | No | Baseline system with beam 20
67 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | 0.735908 | NMT | No | Ensemble of systems with different dropout rates
68 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 0.761403 | NMT | No | Ensemble of 4 BPE, averaged parameters
69 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | 0.763248 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
70 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | 0.769430 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
71 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 0.765464 | NMT | No | Ensemble of 4 BPE, averaged, coverage penalty
72 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 0.767577 | NMT | No | Google's "Attention Is All You Need"
73 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | 0.744928 | NMT | No | beam_size: 10, ensemble of different dropout rates
74 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | 0.764968 | NMT | No | NMT with Attention
75 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | 0.767194 | NMT | No | Transformer, average checkpoints
76 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 0.765933 | NMT | No | Transformer vanilla model using 3M sentences
77 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 0.774788 | NMT | No | MLNMT
78 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | 0.749825 | NMT | Yes | Rewarding model
79 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | 0.757955 | NMT | No | Baseline-NMT (Single)
80 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | 0.761450 | NMT | No | Ensemble of 6 Baseline-NMT
81 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | 0.757413 | NMT | No | GAN-NMT (Single)
82 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | 0.759238 | NMT | No | Reconstructor-NMT (Single)
83 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | 0.665391 | SMT | No | Preordering with neural network
84 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 0.777896 | NMT | No | Transformer with relative position, ensemble of 3 models
85 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | 0.759790 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN)
86 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | 0.772422 | NMT | No | RSNMT 6 layer with distillation
87 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | 0.777801 | NMT | No | Transformer (Big) with relative position, average checkpoints
88 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | 0.769061 | NMT | No | Fully character-level; 6 bi-LSTM (512*2) for encoder, 6 LSTM for decoder; middle dense layer; beam width 4; length norm set to 0.2
89 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | 0.764672 | NMT | No | RSNMT 6 layer
90 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | 0.756346 | NMT | No | Transformer, single model w/ long warm-up and self-training
91 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 0.760796 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
92 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 0.774653 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
93 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 0.778832 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking
94 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 0.773281 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
95 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | 0.770096 | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, SINGLE MODEL
96 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | 0.760656 | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, averaged the last 10 ckpts, by Tensor2Tensor
97 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | 0.769105 | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py
98 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | 0.768617 | NMT | No | My NMT implementation. Beam size 8. LP 0.6

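RIBES (Isozaki et al., 2010) rewards correct global word order, which matters for distant language pairs such as Japanese-English: it is a normalized Kendall's tau over aligned word positions, damped by unigram precision and a brevity penalty (defaults alpha=0.25, beta=0.10). A simplified single-reference sketch follows; the official RIBES.py also aligns repeated words using context, so this will not exactly reproduce the table:

    # Simplified RIBES sketch: aligns only words that are unique in both
    # sentences, unlike the official scorer's context-based heuristics.
    import math
    from itertools import combinations

    def ribes(hyp: str, ref: str, alpha: float = 0.25, beta: float = 0.10) -> float:
        h, r = hyp.split(), ref.split()
        # Reference positions of hypothesis words, in hypothesis order.
        ranks = [r.index(w) for w in h if h.count(w) == 1 and r.count(w) == 1]
        if len(ranks) < 2:
            return 0.0
        pairs = list(combinations(range(len(ranks)), 2))
        concordant = sum(ranks[i] < ranks[j] for i, j in pairs)
        nkt = concordant / len(pairs)           # normalized Kendall's tau, (tau + 1) / 2
        precision = len(ranks) / len(h)         # unigram precision over aligned words
        bp = min(1.0, math.exp(1.0 - len(r) / len(h)))  # brevity penalty
        return nkt * precision ** alpha * bp ** beta

    print(ribes("the cat sat on the mat", "on the mat the cat sat"))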

AMFM


# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description

(The original page shows ten sub-columns, all labeled "unuse"; only one carries a score per entry, and that value is shown below.)

1 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 0.564270 | Other | Yes | Online D (2016)
2 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | 0.595930 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
3 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 0.587450 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 4 Ensemble
4 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 0.505730 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
5 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 0.558540 | NMT | No | Ensemble of 4 single-layer models (30k voc)
6 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | 0.595240 | EBMT | No | KyotoEBMT 2016 w/o reranking
7 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 0.565270 | NMT | No | Our proposed method to control output voice (2016)
8 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 0.546880 | NMT | No | 6 ensemble
9 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 0.562650 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer, self-ensembling
10 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 0.571360 | SMT | No | Neural MT w/ Lexicon, 6 Ensemble
11 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | 0.594150 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 6 Ensemble
12 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 0.584390 | NMT | Yes | Online D (2016/11/14)
13 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 0.574810 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
14 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 0.578150 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
15 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 0.597620 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
16 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 0.583780 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
17 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 0.597860 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
18 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 0.585710 | NMT | No | Our baseline system in 2017
19 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 0.595260 | NMT | No | Baseline system with beam 20
20 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | 0.588360 | NMT | No | Ensemble of systems with different dropout rates
21 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 0.585540 | NMT | No | Ensemble of 4 BPE, averaged parameters
22 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | 0.597470 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
23 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | 0.599920 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
24 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 0.591160 | NMT | No | Ensemble of 4 BPE, averaged, coverage penalty
25 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 0.595580 | NMT | No | Google's "Attention Is All You Need"
26 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | 0.596360 | NMT | No | beam_size: 10, ensemble of different dropout rates
27 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | 0.595370 | NMT | No | NMT with Attention
28 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | 0.605130 | NMT | No | Transformer, average checkpoints
29 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 0.608070 | NMT | No | Transformer vanilla model using 3M sentences
30 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 0.612060 | NMT | No | MLNMT
31 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | 0.588290 | NMT | Yes | Rewarding model
32 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | 0.596590 | NMT | No | Baseline-NMT (Single)
33 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | 0.600730 | NMT | No | Ensemble of 6 Baseline-NMT
34 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | 0.595850 | NMT | No | GAN-NMT (Single)
35 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | 0.599110 | NMT | No | Reconstructor-NMT (Single)
36 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | 0.571400 | SMT | No | Preordering with neural network
37 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 0.619390 | NMT | No | Transformer with relative position, ensemble of 3 models
38 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | 0.598770 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN)
39 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | 0.619040 | NMT | No | RSNMT 6 layer with distillation
40 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | 0.628070 | NMT | No | Transformer (Big) with relative position, average checkpoints
41 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | 0.619640 | NMT | No | Fully character-level; 6 bi-LSTM (512*2) for encoder, 6 LSTM for decoder; middle dense layer; beam width 4; length norm set to 0.2
42 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | 0.608450 | NMT | No | RSNMT 6 layer
43 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | 0.602770 | NMT | No | Transformer, single model w/ long warm-up and self-training
44 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 0.606190 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
45 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 0.622070 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
46 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 0.630150 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking
47 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 0.626880 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
48 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | 0.626260 | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, SINGLE MODEL
49 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | 0.620640 | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, averaged the last 10 ckpts, by Tensor2Tensor
50 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | 0.626300 | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py
51 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | 0.601460 | NMT | No | My NMT implementation. Beam size 8. LP 0.6
52 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 0.588880 | SMT | No | Hierarchical Phrase-based SMT (2014)
53 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 0.590950 | SMT | No | Phrase-based SMT
54 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 0.593410 | SMT | No | String-to-Tree SMT (2014)
55 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 0.564170 | Other | Yes | Online D (2014)
56 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 0.604180 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
57 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 0.561620 | Other | Yes | RBMT E
58 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 0.556840 | Other | Yes | RBMT F
59 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 0.466480 | Other | Yes | Online C (2014)
60 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 0.551690 | Other | Yes | RBMT D (2014)
61 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 0.576540 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
62 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 0.603490 | SMT | No | Travatar-based Forest-to-String SMT System
63 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 0.602780 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
64 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 0.593970 | EBMT | No | Our baseline system using 3M parallel sentences.
65 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 0.587520 | SMT | No | Paraphrase max10
66 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 0.552980 | RBMT | Yes | RBMT system
67 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 0.551740 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
68 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 0.593660 | EBMT | No | Our new baseline system after several modifications.
69 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 0.588480 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
70 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 0.582800 | SMT | No | Our baseline
71 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 0.574030 | SMT | No | Our baseline with preordering
72 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 0.561450 | SMT | No | Our baseline system with preordering method
73 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 0.578360 | SMT | No | Our baseline system with another preordering method
74 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 0.580500 | SMT | No | Our baseline system
75 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 0.587530 | SMT | No | Our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
76 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 0.562920 | SMT | No | Our baseline: PB SMT in Moses (DL=20) / SRILM / MeCab (IPA)
77 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 0.576510 | SMT | No | Our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
78 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 0.604760 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
79 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 0.597830 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
80 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 0.564610 | SMT | No | Baseline-2015
81 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 0.590980 | SMT | No | Our PBSMT baseline (2015)
82 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 0.609430 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
83 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 0.579210 | SMT | Yes | Passive JSTx1
84 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 0.581540 | SMT | Yes | Pervasive JSTx1
85 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 0.606600 | SMT | No | Travatar System with NeuralMT Reranking
86 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 0.599800 | SMT | No | Travatar System with Parser Self Training
87 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 0.600000 | SMT | No | Travatar System Baseline
88 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.562270 | Other | Yes | Online D (2015)
89 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 0.596430 | EBMT | No | KyotoEBMT system without reranking
90 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 0.592040 | SMT | No | Baseline-2015 (train1 only)
91 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 0.579280 | SMT | No | Baseline-2015 (train123)
92 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 0.603210 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
93 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 0.570430 | SMT | No | PBSMT with dependency-based phrase segmentation
94 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 0.579790 | SMT | Yes | Passive JSTx1
95 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 0.582370 | SMT | Yes | Pervasive JSTx1
96 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 0.593410 | SMT | No | String-to-Tree SMT (2015)
97 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 0.551690 | Other | Yes | RBMT D (2015)
98 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 0.453370 | Other | Yes | Online C (2015)

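AM-FM (Banchs et al.) combines an adequacy component (AM: the cosine similarity between the source sentence and the translation in a cross-lingual latent semantic space) with a fluency component (FM: a normalized target-side n-gram language-model score). Both lie in [0, 1] and are linearly interpolated; the interpolation weight used for these WAT scores is not stated on this page, so the lambda below is only a placeholder:

    # Sketch of the final AM-FM combination; computing AM and FM themselves
    # requires the latent-space projection and the language model, omitted here.
    def am_fm(am: float, fm: float, lam: float = 0.5) -> float:
        # lam is an assumed weight; WAT's actual setting is not given on this page.
        return lam * am + (1.0 - lam) * fm

    print(am_fm(am=0.64, fm=0.56))  # -> 0.60, the scale seen in this table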

HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions are listed for this section.)


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions are listed for this section.)


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions are listed for this section.)


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 14.000 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
2 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 11.750 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
3 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 9.500 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
4 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 6.500 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking

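The HUMAN column is WAT's pairwise crowdsourcing score: each system's translations are compared sentence by sentence against a baseline system's, and the wins, ties, and losses are folded into a score in [-100, 100]. This scaling is taken from the WAT overview papers; the per-sentence voting details are omitted here, so treat the sketch as an assumption about the aggregation step only:

    # Assumed aggregation (per the WAT overview papers): pairwise score in [-100, 100].
    def pairwise_score(wins: int, ties: int, losses: int) -> float:
        return 100.0 * (wins - losses) / (wins + ties + losses)

    print(pairwise_score(wins=180, ties=120, losses=100))  # -> 20.0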

HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 15.750 | NMT | No | Transformer vanilla model using 3M sentences
2 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 11.500 | NMT | No | MLNMT
3 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 5.750 | NMT | No | Transformer with relative position, ensemble of 3 models
4 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | -20.000 | NMT | No | Ensemble of 6 Baseline-NMT
5 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | -37.000 | NMT | Yes | Rewarding model
6 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | -95.750 | SMT | No | Preordering with neural network


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 77.750 | NMT | No | Ensemble of 4 BPE, averaged parameters
2 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 77.250 | NMT | No | Ensemble of 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
3 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 75.250 | NMT | No | Google's "Attention Is All You Need"
4 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 75.000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
5 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 74.500 | NMT | No | Ensemble of 4 BPE, averaged, coverage penalty
6 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 69.750 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
7 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 68.750 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
8 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 66.000 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
9 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 61.000 | NMT | No | Baseline system with beam 20
10 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 56.750 | NMT | No | Our baseline system in 2017


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 63.000 | NMT | Yes | Online D (2016/11/14)
2 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 48.250 | SMT | No | Neural MT w/ Lexicon and MinRisk Training, 4 Ensemble
3 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 47.500 | SMT | No | Neural MT w/ Lexicon, 6 Ensemble
4 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 47.000 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer, self-ensembling
5 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 44.250 | NMT | No | Ensemble of 4 single-layer models (30k voc)
6 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 28.000 | Other | Yes | Online D (2016)
7 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 25.000 | NMT | No | 6 ensemble
8 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 19.250 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
9 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 16.000 | NMT | No | Our proposed method to control output voice (2016)
10 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | Underway | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 35.500 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
2 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 32.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
3 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 25.000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
4 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 21.250 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
5 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 16.750 | Other | Yes | RBMT D (2015)
6 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 16.500 | EBMT | No | KyotoEBMT system without reranking
7 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 16.000 | SMT | No | Our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
8 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 11.750 | SMT | No | Travatar System with Parser Self Training
9 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 7.000 | SMT | No | String-to-Tree SMT (2015)
10 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 6.500 | SMT | No | Our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
11 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.250 | Other | Yes | Online D (2015)
12 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | -7.750 | SMT | Yes | Passive JSTx1
13 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | -12.750 | SMT | Yes | Pervasive JSTx1
14 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | -25.500 | SMT | No | PBSMT with dependency-based phrase segmentation


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 40.500 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
2 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 37.500 | SMT | No | Travatar-based Forest-to-String SMT System
3 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 25.500 | SMT | No | String-to-Tree SMT (2014)
4 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 25.000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
5 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 23.250 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
6 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 23.000 | Other | Yes | RBMT D (2014)
7 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 22.500 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
8 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 21.250 | EBMT | No | Our new baseline system after several modifications.
9 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 20.250 | RBMT | Yes | RBMT system
10 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 13.750 | Other | Yes | Online D (2014)
11 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 7.750 | SMT | No | Hierarchical Phrase-based SMT (2014)
12 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 1.250 | SMT | No | Paraphrase max10
13 | NII | ja-en | 2014/09/02 11:42:01 | 271 | -5.750 | SMT | No | Our baseline
14 | NII | ja-en | 2014/09/02 11:42:53 | 272 | -14.250 | SMT | No | Our baseline with preordering
15 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | -17.000 | SMT | No | Our baseline system with another preordering method
16 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | -17.250 | SMT | No | Our baseline system with preordering method


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- giving presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize their names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02