
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


# | Team | Task | Date/Time | DataID | BLEU (moses-tokenizer) | Method | Other Resources | System Description
1 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 30.92 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking.
2 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 30.88 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
3 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | 30.62 | NMT | No | Transformer (Big) with relative position, average checkpoints.
4 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 30.59 | NMT | No | Transformer with relative position, ensemble of 3 models.
5 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 30.56 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
6 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | 30.28 | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, SINGLE MODEL
7 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | 29.71 | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py.
8 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 29.65 | NMT | No | MLNMT
9 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 29.40 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
10 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | 29.12 | NMT | No | RSNMT 6 layer with distillation
11 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | 29.01 | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, Averaged the last 10 ckpts, by Tensor2Tensor.
12 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 28.63 | NMT | No | Transformer vanilla model using 3M sentences.
13 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | 28.61 | NMT | No | Transformer, single model w/ long warm-up and self-training
14 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | 28.46 | NMT | No | Transformer, average checkpoints.
15 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 28.36 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
16 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | 28.35 | NMT | No | My NMT implementation. Beam size 8. LP 0.6
17 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | 28.15 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
18 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 28.06 | NMT | No | Google's "Attention Is All You Need"
19 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 27.66 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
20 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | 27.63 | NMT | No | Fully character-level, 6 bi-LSTM (512*2) for encoder, 6 LSTM for decoder. Middle dense layer. Beam width 4. Length norm set to 0.2
21 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | 27.62 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
22 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | 27.55 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
23 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 27.53 | NMT | No | Ensemble of 4 BPE averaged parameters
24 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 27.43 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
25 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | 26.99 | NMT | No | RSNMT 6 layer
26 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | 26.91 | NMT | No | NMT with Attention
27 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 26.76 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
28 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 26.39 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
29 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 26.22 | NMT | No | Ensemble of 4 single-layer model (30k voc)
30 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | 26.19 | NMT | Yes | rewarding model
31 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 26.12 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
32 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | 25.85 | NMT | No | Ensemble of 6 Baseline-NMT
33 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | 25.45 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN)
34 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 25.41 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
35 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | 25.17 | NMT | No | GAN-NMT (Single)
36 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | 24.98 | NMT | No | Reconstructor-NMT (Single)
37 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | 24.94 | NMT | No | Baseline-NMT (Single)
38 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 24.79 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
39 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 24.77 | SMT | No | Travatar System with NeuralMT Reranking
40 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 24.71 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer self-ensembling
41 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | 24.55 | NMT | No | beam_size: 10, ensemble of different dropout rates.
42 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 23.82 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
43 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 23.47 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
44 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 23.43 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
45 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 23.29 | SMT | No | Travatar-based Forest-to-String SMT System
46 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 23.03 | NMT | No | baseline system with beam20
47 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 23.00 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
48 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 22.89 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
49 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 22.89 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
50 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | 22.87 | NMT | No | ensemble system of different dropout rates.
51 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 22.62 | SMT | No | Travatar System with Parser Self Training
52 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 22.16 | SMT | No | Travatar System Baseline
53 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 22.04 | NMT | Yes | Online D (2016/11/14)
54 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | 21.54 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
55 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 21.31 | EBMT | No | KyotoEBMT system without reranking
56 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | 21.22 | EBMT | No | KyotoEBMT 2016 w/o reranking
57 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 21.07 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
58 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 21.00 | NMT | No | our baseline system in 2017
59 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 20.61 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
60 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 20.60 | EBMT | No | Our new baseline system after several modifications.
61 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 20.36 | SMT | No | String-to-Tree SMT (2014)
62 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 20.36 | SMT | No | String-to-Tree SMT (2015)
63 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 20.02 | EBMT | No | Our baseline system using 3M parallel sentences.
64 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 19.86 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
65 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 18.98 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
66 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 18.96 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
67 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 18.82 | SMT | No | Paraphrase max10
68 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 18.72 | SMT | No | Hierarchical Phrase-based SMT (2014)
69 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 18.45 | SMT | No | Phrase-based SMT
70 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 18.45 | NMT | No | 6 ensemble
71 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 18.34 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
72 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 18.32 | SMT | No | Our PBSMT baseline (2015)
73 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 18.29 | NMT | No | 2016 our proposed method to control output voice
74 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 18.20 | SMT | No | Baseline-2015 (train1 only)
75 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 18.09 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
76 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 18.09 | SMT | No | Baseline-2015 (train123)
77 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 17.47 | SMT | No | Our Baseline
78 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 17.04 | SMT | No | Baseline-2015
79 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 17.01 | SMT | No | Our Baseline with Preordering
80 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 16.96 | SMT | Yes | Passive JSTx1
81 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 16.91 | Other | Yes | Online D (2016)
82 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 16.85 | Other | Yes | Online D (2015)
83 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 16.72 | SMT | Yes | Passive JSTx1
84 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 16.61 | SMT | Yes | Pervasive JSTx1
85 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 16.49 | SMT | Yes | Pervasive JSTx1
86 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 15.95 | SMT | No | Our baseline system with another preordering method
87 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 15.85 | SMT | No | PBSMT with dependency based phrase segmentation
88 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 15.69 | RBMT | Yes | RBMT system
89 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 15.55 | SMT | No | Our baseline system with preordering method
90 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 15.40 | SMT | No | Our baseline system
91 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 15.29 | Other | Yes | RBMT D (2014)
92 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 15.29 | Other | Yes | RBMT D (2015)
93 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 15.08 | Other | Yes | Online D (2014)
94 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 14.82 | Other | Yes | RBMT E
95 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | 13.97 | SMT | No | preordering with neural network
96 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 13.86 | Other | Yes | RBMT F
97 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 10.64 | Other | Yes | Online C (2014)
98 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 10.29 | Other | Yes | Online C (2015)
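
BLEU depends on tokenization, which is why the original table records the tokenizer each score was computed with (juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb/pku, indic-tokenizer); for this ja-en task all scores sit in the moses-tokenizer column. As a rough illustration of how such a number is obtained (this is not the official WAT scoring pipeline, and the file names are hypothetical):

```python
# Hedged sketch: corpus BLEU over Moses-tokenized English, roughly what the
# "moses-tokenizer" column reports. Assumes the sacrebleu and sacremoses
# packages; "system_output.en" and "reference.en" are hypothetical files.
from sacremoses import MosesTokenizer
import sacrebleu

tok = MosesTokenizer(lang="en")

def moses_tokenize(lines):
    # Pre-tokenize so sacrebleu's internal tokenizer can be disabled below.
    return [tok.tokenize(line, return_str=True) for line in lines]

with open("system_output.en", encoding="utf-8") as f:
    hyps = moses_tokenize(f.read().splitlines())
with open("reference.en", encoding="utf-8") as f:
    refs = moses_tokenize(f.read().splitlines())

bleu = sacrebleu.corpus_bleu(hyps, [refs], tokenize="none")
print(f"BLEU = {bleu.score:.2f}")  # same 0-100 scale as the column above
```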


RIBES


# | Team | Task | Date/Time | DataID | RIBES (moses-tokenizer) | Method | Other Resources | System Description
1 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 0.778832 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking.
2 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 0.777896 | NMT | No | Transformer with relative position, ensemble of 3 models.
3 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | 0.777801 | NMT | No | Transformer (Big) with relative position, average checkpoints.
4 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 0.774788 | NMT | No | MLNMT
5 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 0.774653 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
6 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 0.773281 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
7 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | 0.772422 | NMT | No | RSNMT 6 layer with distillation
8 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | 0.770096 | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, SINGLE MODEL
9 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | 0.769430 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
10 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | 0.769105 | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py.
11 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | 0.769061 | NMT | No | Fully character-level, 6 bi-LSTM (512*2) for encoder, 6 LSTM for decoder. Middle dense layer. Beam width 4. Length norm set to 0.2
12 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 0.768880 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
13 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | 0.768617 | NMT | No | My NMT implementation. Beam size 8. LP 0.6
14 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | 0.767661 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
15 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 0.767577 | NMT | No | Google's "Attention Is All You Need"
16 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | 0.767194 | NMT | No | Transformer, average checkpoints.
17 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 0.765933 | NMT | No | Transformer vanilla model using 3M sentences.
18 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 0.765464 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
19 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | 0.764968 | NMT | No | NMT with Attention
20 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 0.764831 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
21 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | 0.764672 | NMT | No | RSNMT 6 layer
22 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | 0.763248 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
23 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 0.762712 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
24 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | 0.761450 | NMT | No | Ensemble of 6 Baseline-NMT
25 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 0.761403 | NMT | No | Ensemble of 4 BPE averaged parameters
26 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 0.760796 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
27 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | 0.760656 | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, Averaged the last 10 ckpts, by Tensor2Tensor.
28 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | 0.759790 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN)
29 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | 0.759238 | NMT | No | Reconstructor-NMT (Single)
30 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | 0.757955 | NMT | No | Baseline-NMT (Single)
31 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | 0.757413 | NMT | No | GAN-NMT (Single)
32 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 0.756956 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
33 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 0.756601 | NMT | No | Ensemble of 4 single-layer model (30k voc)
34 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | 0.756346 | NMT | No | Transformer, single model w/ long warm-up and self-training
35 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 0.750802 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer self-ensembling
36 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | 0.749825 | NMT | Yes | rewarding model
37 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 0.749573 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
38 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 0.747335 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
39 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | 0.744928 | NMT | No | beam_size: 10, ensemble of different dropout rates.
40 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 0.743771 | SMT | No | Travatar System with NeuralMT Reranking
41 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 0.741699 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
42 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 0.741329 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
43 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 0.741175 | NMT | No | baseline system with beam20
44 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | 0.735908 | NMT | No | ensemble system of different dropout rates.
45 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 0.733483 | NMT | Yes | Online D (2016/11/14)
46 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 0.725284 | NMT | No | our baseline system in 2017
47 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 0.724555 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
48 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 0.723670 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
49 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 0.723541 | SMT | No | Travatar-based Forest-to-String SMT System
50 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 0.722798 | SMT | No | Travatar System with Parser Self Training
51 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 0.722599 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
52 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 0.718540 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
53 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 0.715795 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
54 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 0.713083 | SMT | No | Travatar System Baseline
55 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 0.711542 | NMT | No | 6 ensemble
56 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 0.710613 | NMT | No | 2016 our proposed method to control output voice
57 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | 0.708808 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
58 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 0.707936 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
59 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 0.706686 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
60 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 0.706480 | EBMT | No | KyotoEBMT system without reranking
61 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | 0.705700 | EBMT | No | KyotoEBMT 2016 w/o reranking
62 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 0.701154 | EBMT | No | Our new baseline system after several modifications.
63 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 0.698953 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
64 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 0.690455 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
65 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 0.689829 | EBMT | No | Our baseline system using 3M parallel sentences.
66 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 0.687122 | RBMT | Yes | RBMT system
67 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 0.684485 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
68 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 0.683378 | Other | Yes | RBMT D (2014)
69 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 0.683378 | Other | Yes | RBMT D (2015)
70 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 0.678253 | SMT | No | String-to-Tree SMT (2014)
71 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 0.678253 | SMT | No | String-to-Tree SMT (2015)
72 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 0.677412 | Other | Yes | Online D (2016)
73 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.676609 | Other | Yes | Online D (2015)
74 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | 0.665391 | SMT | No | preordering with neural network
75 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 0.663851 | Other | Yes | RBMT E
76 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 0.661387 | Other | Yes | RBMT F
77 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 0.659883 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
78 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 0.651066 | SMT | No | Hierarchical Phrase-based SMT (2014)
79 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 0.648879 | SMT | No | Our baseline system with another preordering method
80 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 0.646204 | SMT | No | Paraphrase max10
81 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 0.645137 | SMT | No | Phrase-based SMT
82 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 0.644698 | SMT | No | Our baseline system with preordering method
83 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 0.643588 | Other | Yes | Online D (2014)
84 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 0.641456 | SMT | No | Our PBSMT baseline (2015)
85 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 0.639711 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
86 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 0.633073 | SMT | No | Baseline-2015 (train123)
87 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 0.630825 | SMT | No | Our Baseline
88 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 0.629066 | SMT | No | Baseline-2015 (train1 only)
89 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 0.628897 | SMT | No | PBSMT with dependency based phrase segmentation
90 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 0.627006 | SMT | No | Baseline-2015
91 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 0.624827 | Other | Yes | Online C (2014)
92 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 0.622564 | Other | Yes | Online C (2015)
93 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 0.613119 | SMT | No | Our baseline system
94 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 0.610833 | SMT | No | Our Baseline with Preordering
95 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 0.610775 | SMT | Yes | Passive JSTx1
96 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 0.609632 | SMT | Yes | Passive JSTx1
97 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 0.609008 | SMT | Yes | Pervasive JSTx1
98 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 0.600806 | SMT | Yes | Pervasive JSTx1
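
RIBES (Isozaki et al., 2010) rewards correct word order, which matters for distant pairs like Japanese-English. The official NTT script aligns repeated words with context heuristics; the toy sketch below only aligns words that occur exactly once in each sentence, so it illustrates the idea (normalized Kendall's tau scaled by unigram precision and a brevity penalty) rather than reproducing official scores:

```python
# Toy RIBES-like score: normalized Kendall's tau over aligned word positions,
# scaled by unigram precision**alpha and brevity penalty**beta. alpha=0.25
# and beta=0.10 follow the commonly cited defaults; the official
# implementation's word alignment is considerably more sophisticated.
import math

def ribes_like(hyp: str, ref: str, alpha: float = 0.25, beta: float = 0.10) -> float:
    h, r = hyp.split(), ref.split()
    # Align only words that occur exactly once in both sentences.
    pos = [r.index(w) for w in h if h.count(w) == 1 and r.count(w) == 1]
    if len(pos) < 2:
        return 0.0
    pairs = [(i, j) for i in range(len(pos)) for j in range(i + 1, len(pos))]
    # Normalized Kendall's tau = fraction of concordant position pairs.
    nkt = sum(pos[i] < pos[j] for i, j in pairs) / len(pairs)
    precision = len(pos) / len(h)
    bp = min(1.0, math.exp(1.0 - len(r) / len(h)))
    return nkt * (precision ** alpha) * (bp ** beta)

# Same words, different order: the score drops even though precision is 1.0.
print(ribes_like("he reads the paper every morning",
                 "every morning he reads the paper"))
```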


AMFM


# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 0.630150 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking.
2 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | 0.628070 | NMT | No | Transformer (Big) with relative position, average checkpoints.
3 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 0.626880 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
4 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | 0.626300 | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py.
5 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | 0.626260 | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, SINGLE MODEL
6 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 0.622070 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
7 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | 0.620640 | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, Averaged the last 10 ckpts, by Tensor2Tensor.
8 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | 0.619640 | NMT | No | Fully character-level, 6 bi-LSTM (512*2) for encoder, 6 LSTM for decoder. Middle dense layer. Beam width 4. Length norm set to 0.2
9 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 0.619390 | NMT | No | Transformer with relative position, ensemble of 3 models.
10 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | 0.619040 | NMT | No | RSNMT 6 layer with distillation
11 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 0.612060 | NMT | No | MLNMT
12 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 0.609430 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
13 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | 0.608450 | NMT | No | RSNMT 6 layer
14 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 0.608070 | NMT | No | Transformer vanilla model using 3M sentences.
15 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 0.606600 | SMT | No | Travatar System with NeuralMT Reranking
16 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 0.606190 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
17 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | 0.605130 | NMT | No | Transformer, average checkpoints.
18 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 0.604760 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
19 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 0.604180 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
20 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 0.603490 | SMT | No | Travatar-based Forest-to-String SMT System
21 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 0.603210 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
22 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 0.602780 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
23 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | 0.602770 | NMT | No | Transformer, single model w/ long warm-up and self-training
24 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | 0.601460 | NMT | No | My NMT implementation. Beam size 8. LP 0.6
25 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | 0.600730 | NMT | No | Ensemble of 6 Baseline-NMT
26 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 0.600000 | SMT | No | Travatar System Baseline
27 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | 0.599920 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data)
28 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 0.599800 | SMT | No | Travatar System with Parser Self Training
29 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | 0.599110 | NMT | No | Reconstructor-NMT (Single)
30 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | 0.598770 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN)
31 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 0.597860 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
32 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 0.597830 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
33 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 0.597620 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
34 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | 0.597470 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data)
35 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | 0.596590 | NMT | No | Baseline-NMT (Single)
36 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 0.596430 | EBMT | No | KyotoEBMT system without reranking
37 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | 0.596360 | NMT | No | beam_size: 10, ensemble of different dropout rates.
38 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | 0.595930 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
39 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | 0.595850 | NMT | No | GAN-NMT (Single)
40 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 0.595580 | NMT | No | Google's "Attention Is All You Need"
41 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | 0.595370 | NMT | No | NMT with Attention
42 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 0.595260 | NMT | No | baseline system with beam20
43 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | 0.595240 | EBMT | No | KyotoEBMT 2016 w/o reranking
44 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | 0.594150 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble
45 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 0.593970 | EBMT | No | Our baseline system using 3M parallel sentences.
46 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 0.593660 | EBMT | No | Our new baseline system after several modifications.
47 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 0.593410 | SMT | No | String-to-Tree SMT (2014)
48 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 0.593410 | SMT | No | String-to-Tree SMT (2015)
49 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 0.592040 | SMT | No | Baseline-2015 (train1 only)
50 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 0.591160 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
51 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 0.590980 | SMT | No | Our PBSMT baseline (2015)
52 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 0.590950 | SMT | No | Phrase-based SMT
53 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 0.588880 | SMT | No | Hierarchical Phrase-based SMT (2014)
54 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 0.588480 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
55 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | 0.588360 | NMT | No | ensemble system of different dropout rates.
56 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | 0.588290 | NMT | Yes | rewarding model
57 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 0.587530 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
58 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 0.587520 | SMT | No | Paraphrase max10
59 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 0.587450 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
60 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 0.585710 | NMT | No | our baseline system in 2017
61 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 0.585540 | NMT | No | Ensemble of 4 BPE averaged parameters
62 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 0.584390 | NMT | Yes | Online D (2016/11/14)
63 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 0.583780 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
64 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 0.582800 | SMT | No | Our Baseline
65 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 0.582370 | SMT | Yes | Pervasive JSTx1
66 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 0.581540 | SMT | Yes | Pervasive JSTx1
67 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 0.580500 | SMT | No | Our baseline system
68 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 0.579790 | SMT | Yes | Passive JSTx1
69 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 0.579280 | SMT | No | Baseline-2015 (train123)
70 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 0.579210 | SMT | Yes | Passive JSTx1
71 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 0.578360 | SMT | No | Our baseline system with another preordering method
72 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 0.578150 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
73 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 0.576540 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
74 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 0.576510 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
75 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 0.574810 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
76 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 0.574030 | SMT | No | Our Baseline with Preordering
77 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | 0.571400 | SMT | No | preordering with neural network
78 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 0.571360 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
79 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 0.570430 | SMT | No | PBSMT with dependency based phrase segmentation
80 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 0.565270 | NMT | No | 2016 our proposed method to control output voice
81 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 0.564610 | SMT | No | Baseline-2015
82 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 0.564270 | Other | Yes | Online D (2016)
83 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 0.564170 | Other | Yes | Online D (2014)
84 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 0.562920 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA)
85 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 0.562650 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer self-ensembling
86 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.562270 | Other | Yes | Online D (2015)
87 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 0.561620 | Other | Yes | RBMT E
88 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 0.561450 | SMT | No | Our baseline system with preordering method
89 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 0.558540 | NMT | No | Ensemble of 4 single-layer model (30k voc)
90 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 0.556840 | Other | Yes | RBMT F
91 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 0.552980 | RBMT | Yes | RBMT system
92 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 0.551740 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
93 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 0.551690 | Other | Yes | RBMT D (2014)
94 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 0.551690 | Other | Yes | RBMT D (2015)
95 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 0.546880 | NMT | No | 6 ensemble
96 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 0.505730 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
97 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 0.466480 | Other | Yes | Online C (2014)
98 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 0.453370 | Other | Yes | Online C (2015)
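
For context on the AMFM column: AM-FM (Banchs et al.) pairs an adequacy model (AM), which compares source and translation in a shared cross-lingual semantic space, with a fluency model (FM), a target-side language-model score. As an illustration of the general shape of such a metric (the interpolation form and the weight lambda are assumptions here, not taken from this page):

$$
\mathrm{AMFM}_{\lambda} \;=\; \lambda \cdot \mathrm{AM} \;+\; (1 - \lambda) \cdot \mathrm{FM},
\qquad 0 \le \lambda \le 1
% illustrative weighted combination; lambda is an assumed interpolation weight
$$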


HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | 14.000 | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble
2 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | 11.750 | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models
3 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | 9.500 | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training
4 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | 6.500 | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking.
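
The HUMAN column in these tables comes from WAT's pairwise crowdsourced evaluation, in which each system's translations are judged win/loss/tie against a baseline system's output. A hedged sketch of how such judgments collapse into the roughly -100..+100 scores shown here (the official WAT aggregation, e.g. its exact tie handling, may differ):

```python
def pairwise_score(wins: int, losses: int, ties: int) -> float:
    """Collapse win/loss/tie judgments against a baseline into one score.

    Sketch of a WAT-style pairwise score in [-100, 100]: all wins -> +100,
    all losses -> -100, and ties pull the score toward 0. The exact official
    WAT aggregation may differ.
    """
    total = wins + losses + ties
    if total == 0:
        raise ValueError("no judgments")
    return 100.0 * (wins - losses) / total

# Example: 60 wins, 25 losses, 15 ties out of 100 judgments -> 35.0
print(pairwise_score(60, 25, 15))
```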


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | 15.750 | NMT | No | Transformer vanilla model using 3M sentences.
2 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | 11.500 | NMT | No | MLNMT
3 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | 5.750 | NMT | No | Transformer with relative position, ensemble of 3 models.
4 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | -20.000 | NMT | No | Ensemble of 6 Baseline-NMT
5 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | -37.000 | NMT | Yes | rewarding model
6 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | -95.750 | SMT | No | preordering with neural network


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | 77.750 | NMT | No | Ensemble of 4 BPE averaged parameters
2 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | 77.250 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
3 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | 75.250 | NMT | No | Google's "Attention Is All You Need"
4 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | 75.000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking
5 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | 74.500 | NMT | No | Ensemble of 4 BPE, averaged Coverage penalty
6 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | 69.750 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
7 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | 68.750 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
8 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | 66.000 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences
9 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | 61.000 | NMT | No | baseline system with beam20
10 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | 56.750 | NMT | No | our baseline system in 2017


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | 63.000 | NMT | Yes | Online D (2016/11/14)
2 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | 48.250 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble
3 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | 47.500 | SMT | No | Neural MT w/ Lexicon 6 Ensemble
4 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | 47.000 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer self-ensembling
5 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | 44.250 | NMT | No | Ensemble of 4 single-layer model (30k voc)
6 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | 28.000 | Other | Yes | Online D (2016)
7 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | 25.000 | NMT | No | 6 ensemble
8 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | 19.250 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
9 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | 16.000 | NMT | No | 2016 our proposed method to control output voice
10 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | Underway | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 35.500 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training
2 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 32.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
3 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 25.000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
4 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 21.250 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model
5 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 16.750 | Other | Yes | RBMT D (2015)
6 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 16.500 | EBMT | No | KyotoEBMT system without reranking
7 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 16.000 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015]
8 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 11.750 | SMT | No | Travatar System with Parser Self Training
9 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 7.000 | SMT | No | String-to-Tree SMT (2015)
10 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 6.500 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008]
11 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.250 | Other | Yes | Online D (2015)
12 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | -7.750 | SMT | Yes | Passive JSTx1
13 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | -12.750 | SMT | Yes | Pervasive JSTx1
14 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | -25.500 | SMT | No | PBSMT with dependency based phrase segmentation


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 40.500 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries
2 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 37.500 | SMT | No | Travatar-based Forest-to-String SMT System
3 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 25.500 | SMT | No | String-to-Tree SMT (2014)
4 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 25.000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
5 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 23.250 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
6 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 23.000 | Other | Yes | RBMT D (2014)
7 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 22.500 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post editing)
8 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 21.250 | EBMT | No | Our new baseline system after several modifications.
9 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 20.250 | RBMT | Yes | RBMT system
10 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 13.750 | Other | Yes | Online D (2014)
11 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 7.750 | SMT | No | Hierarchical Phrase-based SMT (2014)
12 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 1.250 | SMT | No | Paraphrase max10
13 | NII | ja-en | 2014/09/02 11:42:01 | 271 | -5.750 | SMT | No | Our Baseline
14 | NII | ja-en | 2014/09/02 11:42:53 | 272 | -14.250 | SMT | No | Our Baseline with Preordering
15 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | -17.000 | SMT | No | Our baseline system with another preordering method
16 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | -17.250 | SMT | No | Our baseline system with preordering method


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you can use the information about the translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You can also use the scores of the other systems, but you MUST anonymize the other systems' names. In addition, you can show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02