# | Team | Task | Date/Time | DataID | unused | unused | unused | AMFM | unused | unused | unused | unused | unused | unused | Method | Other Resources | System Description |
1 | ORGANIZER | ja-en | 2018/08/14 11:07:47 | 1901 | - | - | - | 0.595370 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | NMT with Attention |
2 | srcb | ja-en | 2018/08/26 11:09:50 | 2152 | - | - | - | 0.605130 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Transformer, average checkpoints. |
3 | NICT-5 | ja-en | 2018/08/27 15:01:05 | 2174 | - | - | - | 0.608070 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Transformer vanilla model using 3M sentences. |
4 | NICT-5 | ja-en | 2018/09/10 14:55:37 | 2273 | - | - | - | 0.612060 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | MLNMT |
5 | Osaka-U | ja-en | 2018/09/15 23:05:12 | 2440 | - | - | - | 0.588290 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | rewarding model |
6 | TMU | ja-en | 2018/09/16 11:53:24 | 2461 | - | - | - | 0.596590 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Baseline-NMT (Single) |
7 | TMU | ja-en | 2018/09/16 12:04:36 | 2464 | - | - | - | 0.600730 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Ensemble of 6 Baseline-NMT |
8 | TMU | ja-en | 2018/09/16 12:05:47 | 2465 | - | - | - | 0.595850 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | GAN-NMT (Single) |
9 | TMU | ja-en | 2018/09/16 12:06:39 | 2466 | - | - | - | 0.599110 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Reconstructor-NMT (Single) |
10 | Osaka-U | ja-en | 2018/09/16 13:11:48 | 2472 | - | - | - | 0.571400 | - | - | - | - | 0.000000 | 0.000000 | SMT | No | preordering with neural network |
11 | srcb | ja-en | 2018/09/16 14:51:47 | 2474 | - | - | - | 0.619390 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Transformer with relative position, ensemble of 3 models. |
12 | TMU | ja-en | 2018/09/16 17:02:15 | 2483 | - | - | - | 0.598770 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN) |
13 | NICT-5 | ja-en | 2019/07/16 17:11:38 | 2718 | - | - | - | 0.619040 | - | - | - | - | - | - | NMT | No | RSNMT 6 layer with distillation |
14 | srcb | ja-en | 2019/07/25 11:52:48 | 2919 | - | - | - | 0.628070 | - | - | - | - | - | - | NMT | No | Transformer (Big) with relative position, average checkpoints. |
15 | ykkd | ja-en | 2019/07/26 12:26:03 | 2989 | - | - | - | 0.619640 | - | - | - | - | - | - | NMT | No | Fully character-level, 6 bi-LSTM (512*2) for Encoder, 6 LSTM for Decoder. Middle dense layer. Beam width 4. Length norm set to 0.2 |
16 | NICT-5 | ja-en | 2019/07/26 18:02:47 | 3041 | - | - | - | 0.608450 | - | - | - | - | - | - | NMT | No | RSNMT 6 layer |
17 | NICT-2 | ja-en | 2019/07/26 22:33:30 | 3085 | - | - | - | 0.602770 | - | - | - | - | - | - | NMT | No | Transformer, single model w/ long warm-up and self-training |
18 | NICT-2 | ja-en | 2019/07/26 22:37:22 | 3086 | - | - | - | 0.606190 | - | - | - | - | - | - | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training |
19 | KNU_Hyundai | ja-en | 2019/07/27 09:28:37 | 3173 | - | - | - | 0.622070 | - | - | - | - | - | - | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models |
20 | srcb | ja-en | 2019/07/27 15:27:01 | 3205 | - | - | - | 0.630150 | - | - | - | - | - | - | NMT | No | Transformer (Big) with relative position, data augmentation, average checkpoints, Bayes re-ranking. |
21 | NTT | ja-en | 2019/07/28 11:18:59 | 3225 | - | - | - | 0.626880 | - | - | - | - | - | - | SMT | No | ASPEC first 1.5M + last 1.5M (fwd), 6 ensemble |
22 | NTT | ja-en | 2019/07/28 15:11:04 | 3233 | - | - | - | 0.626260 | - | - | - | - | - | - | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, single model |
23 | AISTAI | ja-en | 2019/08/01 10:44:49 | 3260 | - | - | - | 0.620640 | - | - | - | - | - | - | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, Averaged the last 10 ckpts, by Tensor2Tensor. |
24 | AISTAI | ja-en | 2019/08/31 21:28:13 | 3361 | - | - | - | 0.626300 | - | - | - | - | - | - | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 4 models, by OpenNMT-py. |
25 | NICT-5 | ja-en | 2021/03/18 23:22:53 | 4574 | - | - | - | 0.601460 | - | - | - | - | - | - | NMT | No | My NMT implementation. Beam size 8. LP 0.6 |
26 | ORGANIZER | ja-en | 2014/07/11 19:45:32 | 2 | 0.000000 | 0.000000 | 0.000000 | 0.588880 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Hierarchical Phrase-based SMT (2014) |
27 | ORGANIZER | ja-en | 2014/07/11 19:49:57 | 6 | 0.000000 | 0.000000 | 0.000000 | 0.590950 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT |
28 | ORGANIZER | ja-en | 2014/07/11 19:59:55 | 9 | 0.000000 | 0.000000 | 0.000000 | 0.593410 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | String-to-Tree SMT (2014) |
29 | ORGANIZER | ja-en | 2014/07/18 11:08:13 | 35 | 0.000000 | 0.000000 | 0.000000 | 0.564170 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online D (2014) |
30 | NAIST | ja-en | 2014/07/19 01:04:48 | 46 | 0.000000 | 0.000000 | 0.000000 | 0.604180 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Travatar-based Forest-to-String SMT System with Extra Dictionaries |
31 | ORGANIZER | ja-en | 2014/07/21 11:53:48 | 76 | 0.000000 | 0.000000 | 0.000000 | 0.561620 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT E |
32 | ORGANIZER | ja-en | 2014/07/21 11:57:08 | 79 | 0.000000 | 0.000000 | 0.000000 | 0.556840 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT F |
33 | ORGANIZER | ja-en | 2014/07/22 11:22:40 | 87 | 0.000000 | 0.000000 | 0.000000 | 0.466480 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online C (2014) |
34 | ORGANIZER | ja-en | 2014/07/23 14:52:31 | 96 | 0.000000 | 0.000000 | 0.000000 | 0.551690 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT D (2014) |
35 | EIWA | ja-en | 2014/07/30 16:07:14 | 116 | 0.000000 | 0.000000 | 0.000000 | 0.576540 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | Combination of RBMT and SPE (statistical post-editing) |
36 | NAIST | ja-en | 2014/07/31 11:40:53 | 119 | 0.000000 | 0.000000 | 0.000000 | 0.603490 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar-based Forest-to-String SMT System |
37 | NAIST | ja-en | 2014/08/01 17:35:16 | 125 | 0.000000 | 0.000000 | 0.000000 | 0.602780 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES) |
38 | Kyoto-U | ja-en | 2014/08/19 10:24:06 | 136 | 0.000000 | 0.000000 | 0.000000 | 0.593970 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our baseline system using 3M parallel sentences. |
39 | Sense | ja-en | 2014/08/23 05:34:03 | 164 | 0.000000 | 0.000000 | 0.000000 | 0.587520 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Paraphrase max10 |
40 | TOSHIBA | ja-en | 2014/08/29 18:47:44 | 240 | 0.000000 | 0.000000 | 0.000000 | 0.552980 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | Yes | RBMT system |
41 | TOSHIBA | ja-en | 2014/08/29 18:48:24 | 241 | 0.000000 | 0.000000 | 0.000000 | 0.551740 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system |
42 | Kyoto-U | ja-en | 2014/08/31 23:36:50 | 256 | 0.000000 | 0.000000 | 0.000000 | 0.593660 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our new baseline system after several modifications. |
43 | Kyoto-U | ja-en | 2014/09/01 10:27:54 | 262 | 0.000000 | 0.000000 | 0.000000 | 0.588480 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking |
44 | NII | ja-en | 2014/09/02 11:42:01 | 271 | 0.000000 | 0.000000 | 0.000000 | 0.582800 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our Baseline |
45 | NII | ja-en | 2014/09/02 11:42:53 | 272 | 0.000000 | 0.000000 | 0.000000 | 0.574030 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our Baseline with Preordering |
46 | TMU | ja-en | 2014/09/07 23:28:04 | 300 | 0.000000 | 0.000000 | 0.000000 | 0.561450 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system with preordering method |
47 | TMU | ja-en | 2014/09/07 23:32:49 | 301 | 0.000000 | 0.000000 | 0.000000 | 0.578360 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system with another preordering method |
48 | TMU | ja-en | 2014/09/09 19:14:42 | 307 | 0.000000 | 0.000000 | 0.000000 | 0.580500 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system |
49 | NICT | ja-en | 2015/07/16 13:27:58 | 488 | 0.000000 | 0.000000 | 0.000000 | 0.587530 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | our baseline (DL=6) + dependency-based pre-reordering [Ding+ 2015] |
50 | NICT | ja-en | 2015/07/17 08:51:45 | 489 | 0.000000 | 0.000000 | 0.000000 | 0.562920 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | our baseline: PB SMT in MOSES (DL=20) / SRILM / MeCab (IPA) |
51 | NICT | ja-en | 2015/07/17 11:02:10 | 492 | 0.000000 | 0.000000 | 0.000000 | 0.576510 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | our baseline (DL=9) + reverse pre-reordering [Katz-Brown & Collins 2008] |
52 | TOSHIBA | ja-en | 2015/07/23 15:00:12 | 506 | 0.000000 | 0.000000 | 0.000000 | 0.604760 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model |
53 | TOSHIBA | ja-en | 2015/07/28 16:44:27 | 529 | 0.000000 | 0.000000 | 0.000000 | 0.597830 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system |
54 | Sense | ja-en | 2015/07/28 22:23:43 | 530 | 0.000000 | 0.000000 | 0.000000 | 0.564610 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-2015 |
55 | TMU | ja-en | 2015/08/04 16:32:20 | 578 | 0.000000 | 0.000000 | 0.000000 | 0.590980 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our PBSMT baseline (2015) |
56 | NAIST | ja-en | 2015/08/14 17:46:43 | 655 | 0.000000 | 0.000000 | 0.000000 | 0.609430 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System with NeuralMT Reranking and Parser Self Training |
57 | Sense | ja-en | 2015/08/18 21:54:39 | 702 | 0.000000 | 0.000000 | 0.000000 | 0.579210 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Passive JSTx1 |
58 | Sense | ja-en | 2015/08/18 21:58:08 | 708 | 0.000000 | 0.000000 | 0.000000 | 0.581540 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Pervasive JSTx1 |
59 | NAIST | ja-en | 2015/08/24 23:53:53 | 757 | 0.000000 | 0.000000 | 0.000000 | 0.606600 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System with NeuralMT Reranking |
60 | NAIST | ja-en | 2015/08/25 13:02:45 | 766 | 0.000000 | 0.000000 | 0.000000 | 0.599800 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System with Parser Self Training |
61 | NAIST | ja-en | 2015/08/25 13:03:48 | 767 | 0.000000 | 0.000000 | 0.000000 | 0.600000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System Baseline |
62 | ORGANIZER | ja-en | 2015/08/25 18:57:25 | 775 | 0.000000 | 0.000000 | 0.000000 | 0.562270 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online D (2015) |
63 | Kyoto-U | ja-en | 2015/08/27 14:40:32 | 796 | 0.000000 | 0.000000 | 0.000000 | 0.596430 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system without reranking |
64 | Sense | ja-en | 2015/08/28 19:25:25 | 822 | 0.000000 | 0.000000 | 0.000000 | 0.592040 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-2015 (train1 only) |
65 | Sense | ja-en | 2015/08/29 04:32:36 | 824 | 0.000000 | 0.000000 | 0.000000 | 0.579280 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-2015 (train123) |
66 | Kyoto-U | ja-en | 2015/08/30 13:02:18 | 829 | 0.000000 | 0.000000 | 0.000000 | 0.603210 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking |
67 | TMU | ja-en | 2015/09/01 05:46:50 | 847 | 0.000000 | 0.000000 | 0.000000 | 0.570430 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | PBSMT with dependency based phrase segmentation |
68 | Sense | ja-en | 2015/09/01 17:42:28 | 860 | 0.000000 | 0.000000 | 0.000000 | 0.579790 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Passive JSTx1 |
69 | Sense | ja-en | 2015/09/01 17:42:58 | 861 | 0.000000 | 0.000000 | 0.000000 | 0.582370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Pervasive JSTx1 |
70 | ORGANIZER | ja-en | 2015/09/10 13:41:03 | 877 | 0.000000 | 0.000000 | 0.000000 | 0.593410 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | String-to-Tree SMT (2015) |
71 | ORGANIZER | ja-en | 2015/09/10 14:38:13 | 887 | 0.000000 | 0.000000 | 0.000000 | 0.551690 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT D (2015) |
72 | ORGANIZER | ja-en | 2015/09/11 10:54:33 | 892 | 0.000000 | 0.000000 | 0.000000 | 0.453370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online C (2015) |
73 | ORGANIZER | ja-en | 2016/07/26 11:37:38 | 1042 | - | - | - | 0.564270 | - | - | - | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online D (2016) |
74 | NICT-2 | ja-en | 2016/08/05 17:50:25 | 1104 | - | - | - | 0.595930 | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM |
75 | NAIST | ja-en | 2016/08/09 16:14:05 | 1122 | - | - | - | 0.587450 | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 4 Ensemble |
76 | bjtu_nlp | ja-en | 2016/08/17 19:51:24 | 1168 | - | - | - | 0.505730 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model |
77 | Kyoto-U | ja-en | 2016/08/18 15:17:05 | 1182 | - | - | - | 0.558540 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of 4 single-layer model (30k voc) |
78 | Kyoto-U | ja-en | 2016/08/19 01:31:01 | 1189 | - | - | - | 0.595240 | - | - | - | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT 2016 w/o reranking |
79 | TMU | ja-en | 2016/08/20 07:39:02 | 1222 | - | - | - | 0.565270 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | 2016 our proposed method to control output voice |
80 | TMU | ja-en | 2016/08/20 14:31:48 | 1234 | - | - | - | 0.546880 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | 6 ensemble |
81 | Kyoto-U | ja-en | 2016/08/20 15:07:47 | 1246 | - | - | - | 0.562650 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | voc src: 200k, voc tgt: 52k + BPE, 2-layer, self-ensembling |
82 | NAIST | ja-en | 2016/08/20 15:33:12 | 1247 | - | - | - | 0.571360 | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Neural MT w/ Lexicon 6 Ensemble |
83 | NAIST | ja-en | 2016/08/27 00:05:38 | 1275 | - | - | - | 0.594150 | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Neural MT w/ Lexicon and MinRisk Training 6 Ensemble |
84 | ORGANIZER | ja-en | 2016/11/16 10:29:51 | 1333 | - | - | - | 0.584390 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | Yes | Online D (2016/11/14) |
85 | NICT-2 | ja-en | 2017/07/26 13:54:28 | 1476 | - | - | - | 0.574810 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder |
86 | NICT-2 | ja-en | 2017/07/26 14:04:38 | 1480 | - | - | - | 0.578150 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking |
87 | NTT | ja-en | 2017/07/30 20:43:07 | 1616 | - | - | - | 0.597620 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking |
88 | CUNI | ja-en | 2017/07/31 22:30:52 | 1665 | - | - | - | 0.583780 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding, 1M sentences |
89 | NTT | ja-en | 2017/08/01 04:29:02 | 1681 | - | - | - | 0.597860 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking |
90 | TMU | ja-en | 2017/08/01 11:10:49 | 1695 | - | - | - | 0.585710 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | our baseline system in 2017 |
91 | TMU | ja-en | 2017/08/01 11:35:08 | 1703 | - | - | - | 0.595260 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | baseline system with beam20 |
92 | TMU | ja-en | 2017/08/01 12:14:41 | 1712 | - | - | - | 0.588360 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | the ensemble system of different dropout rate. |
93 | Kyoto-U | ja-en | 2017/08/01 13:42:25 | 1717 | - | - | - | 0.585540 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of 4, BPE, averaged parameters |
94 | NTT | ja-en | 2017/08/01 14:50:51 | 1724 | - | - | - | 0.597470 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data) |
95 | NTT | ja-en | 2017/08/01 15:53:15 | 1732 | - | - | - | 0.599920 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from training data) |
96 | Kyoto-U | ja-en | 2017/08/01 16:38:21 | 1733 | - | - | - | 0.591160 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of 4, BPE, averaged, coverage penalty |
97 | ORGANIZER | ja-en | 2017/08/02 01:03:08 | 1736 | - | - | - | 0.595580 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Google's "Attention Is All You Need" |
98 | TMU | ja-en | 2017/08/04 11:16:36 | 1750 | - | - | - | 0.596360 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | beam_size: 10, ensemble of different dropout rates. |