# | Team | Task | Date/Time | DataID | AMFM | AMFM | AMFM | unused | unused | unused | unused | unused | unused | unused | Method | Other Resources | System Description |
1 | srcb | en-ja | 2018/09/16 15:26:37 | 2479 | 0.781000 | 0.781000 | 0.781000 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Transformer with relative position, ensemble of 3 models. |
2 | srcb | en-ja | 2018/09/16 15:52:17 | 2480 | 0.779820 | 0.779820 | 0.779820 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Transformer with relative position, ensemble of 4 models, rerank. |
3 | NICT-5 | en-ja | 2018/09/03 16:49:57 | 2219 | 0.779560 | 0.779560 | 0.779560 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Big Bidirectional Transformer. 1.5M sentences only. |
4 | NTT | en-ja | 2019/07/28 15:44:12 | 3236 | 0.774950 | 0.774950 | 0.774950 | - | - | - | - | - | - | - | NMT | No | ASPEC first 1.5M + Synthetic 1.5M, 6 ensemble |
5 | NTT | en-ja | 2019/07/28 15:55:18 | 3239 | 0.772590 | 0.772590 | 0.772590 | - | - | - | - | - | - | - | NMT | Yes | ParaCrawl + (ASPEC first 1.5M + Synthetic 1.5M) * 2 oversampling, fine-tune ASPEC, single model |
6 | NICT-5 | en-ja | 2018/08/22 18:43:00 | 2048 | 0.771400 | 0.771400 | 0.771400 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Tensor2Tensor's Transformer implementation with recurrently stacked layers. Layer recurrence: 1-2-3-1-2-3-1-2-3-1-2-3. Codename: Megarecurrence. |
7 | Kyoto-U | en-ja | 2017/09/04 22:31:18 | 1759 | 0.771020 | 0.771020 | 0.771020 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Syscomb of AIAYN and KNMT |
8 | NICT-5 | en-ja | 2018/08/22 18:48:51 | 2050 | 0.770560 | 0.770560 | 0.770560 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | RSNMT 6 layer |
9 | NAIST-NICT | en-ja | 2017/07/27 21:50:15 | 1507 | 0.770480 | 0.770480 | 0.770480 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Single Model: SPM16k/16k, BiLSTM Encoder 512*2*2, UniLSTM Decoder 512*2, Adjusted Search |
10 | srcb | en-ja | 2019/07/27 15:50:19 | 3212 | 0.770440 | 0.770440 | 0.770440 | - | - | - | - | - | - | - | NMT | No | Transformer (Big) with relative position, sentence-wise smooth, deep transformer, back-translation, ensemble of 10 models, rerank. |
11 | ORGANIZER | en-ja | 2017/08/02 01:04:49 | 1737 | 0.768630 | 0.768630 | 0.768630 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Google's "Attention Is All You Need" |
12 | NICT-2 | en-ja | 2017/07/26 14:03:00 | 1479 | 0.765580 | 0.765580 | 0.765580 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking |
13 | srcb | en-ja | 2019/07/27 15:28:30 | 3206 | 0.765330 | 0.765330 | 0.765330 | - | - | - | - | - | - | - | NMT | No | Transformer (Big) with relative position, sentence-wise smooth, deep transformer, back-translation, ensemble of 7 models, rerank. |
14 | NTT | en-ja | 2017/08/01 02:22:56 | 1673 | 0.763770 | 0.763770 | 0.763770 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data) |
15 | NAIST-NICT | en-ja | 2017/07/27 21:49:04 | 1506 | 0.763310 | 0.763310 | 0.763310 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Single Model: SPM16k/16k, BiLSTM Encoder 512*2*2, UniLSTM Decoder 512*2, One-best Search |
16 | Osaka-U | en-ja | 2018/09/15 23:02:05 | 2439 | 0.763140 | 0.763140 | 0.763140 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | rewarding model |
17 | NTT | en-ja | 2017/08/01 15:42:13 | 1729 | 0.762170 | 0.762170 | 0.762170 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking, add 1M pseudo training data (generated from provided training data) |
18 | NICT-2 | en-ja | 2019/07/27 10:57:26 | 3182 | 0.761340 | 0.761340 | 0.761340 | - | - | - | - | - | - | - | NMT | No | Transformer, ensemble of 4 models w/ long warm-up and self-training |
19 | AISTAI | en-ja | 2019/08/30 20:23:24 | 3357 | 0.760410 | 0.760410 | 0.760410 | - | - | - | - | - | - | - | NMT | No | Transformer, 1.5M sentences, relative position, ensemble of 3 models, by OpenNMT-py. |
20 | KNU_Hyundai | en-ja | 2019/07/27 09:19:31 | 3172 | 0.760050 | 0.760050 | 0.760050 | - | - | - | - | - | - | - | NMT | No | Transformer Base, relative position, BT, r2l reranking, ensemble of 3 models |
21 | AISTAI | en-ja | 2019/09/05 08:18:45 | 3373 | 0.759940 | 0.759940 | 0.759940 | - | - | - | - | - | - | - | NMT | No | Transformer (big), 1.5M sentences, train_steps=300000, Averaged the last 20 ckpts, by Tensor2Tensor. |
22 | ORGANIZER | en-ja | 2018/08/14 10:58:49 | 1900 | 0.759910 | 0.759910 | 0.759910 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | NMT with Attention |
23 | NICT-2 | en-ja | 2017/07/26 13:52:35 | 1475 | 0.759570 | 0.759570 | 0.759570 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder |
24 | srcb | en-ja | 2019/07/25 11:43:06 | 2918 | 0.759510 | 0.759510 | 0.759510 | - | - | - | - | - | - | - | NMT | No | Transformer (Big) with relative position, sentence-wise smooth |
25 | NICT-2 | en-ja | 2019/07/27 10:56:02 | 3181 | 0.759260 | 0.759260 | 0.759260 | - | - | - | - | - | - | - | NMT | No | Transformer, single model w/ long warm-up and self-training |
26 | EHR | en-ja | 2018/09/08 12:42:03 | 2245 | 0.758750 | 0.758750 | 0.758750 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | SMT reranked NMT |
27 | NAIST | en-ja | 2014/08/01 17:37:23 | 126 | 0.758740 | 0.758740 | 0.758740 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES) |
28 | NTT | en-ja | 2017/08/01 07:12:25 | 1684 | 0.757740 | 0.757740 | 0.757740 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble 8 Models: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking |
29 | AISTAI | en-ja | 2019/07/29 07:33:56 | 3251 | 0.757590 | 0.757590 | 0.757590 | - | - | - | - | - | - | - | NMT | No | Transformer (big). 1.5M sentences, train_steps=131000 only. Averaged the last 10 ckpts. |
30 | NTT | en-ja | 2017/07/30 16:16:25 | 1608 | 0.756480 | 0.756480 | 0.756480 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Single Model: joint BPE 16k, BiLSTM Encoder 512*2*2, LtoR LSTM Decoder 512*2, Beam Search 20 w/ length-based reranking |
31 | Kyoto-U | en-ja | 2017/08/01 15:49:51 | 1731 | 0.754220 | 0.754220 | 0.754220 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of 4, BPE, averaged |
32 | UT-KAY | en-ja | 2017/06/27 16:58:42 | 1369 | 0.753090 | 0.753090 | 0.753090 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Hashimoto and Tsuruoka (2017), https://arxiv.org/abs/1702.02265 |
33 | NICT-2 | en-ja | 2016/08/05 17:47:05 | 1097 | 0.753080 | 0.753080 | 0.753080 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM |
34 | TMU | en-ja | 2018/09/16 12:44:23 | 2469 | 0.753040 | 0.753040 | 0.753040 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Ensemble of 6 Baseline-NMT |
35 | NAIST | en-ja | 2014/07/31 11:38:37 | 118 | 0.752850 | 0.752850 | 0.752850 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar-based Forest-to-String SMT System |
36 | TMU | en-ja | 2018/09/16 17:25:36 | 2484 | 0.752420 | 0.752420 | 0.752420 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Ensemble of 6 NMT (2 Baseline + 2 Reconstructor + 2 GAN) |
37 | NAIST | en-ja | 2015/08/25 12:47:49 | 763 | 0.752260 | 0.752260 | 0.752260 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System Baseline |
38 | TMU | en-ja | 2018/09/16 16:56:49 | 2482 | 0.752100 | 0.752100 | 0.752100 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Reconstructor-NMT (Single) |
39 | TMU | en-ja | 2017/08/02 10:20:14 | 1741 | 0.751660 | 0.751660 | 0.751660 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of systems with different dropout rates. |
40 | TMU | en-ja | 2018/09/16 12:11:58 | 2468 | 0.750350 | 0.750350 | 0.750350 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | GAN-NMT (Single) |
41 | TMU | en-ja | 2017/08/04 11:06:51 | 1747 | 0.749410 | 0.749410 | 0.749410 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | beam_size: 10, ensemble of different dropout rates. |
42 | TMU | en-ja | 2018/09/16 12:10:49 | 2467 | 0.749190 | 0.749190 | 0.749190 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Baseline-NMT (Single) |
43 | NAIST | en-ja | 2015/08/25 12:39:23 | 761 | 0.748870 | 0.748870 | 0.748870 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System with NeuralMT Reranking |
44 | W | en-ja | 2014/08/26 16:17:15 | 202 | 0.747650 | 0.747650 | 0.747650 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Weblio Pre-reordering SMT System (with forest inputs) |
45 | Kyoto-U | en-ja | 2016/08/19 10:18:09 | 1201 | 0.747510 | 0.747510 | 0.747510 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT 2016 w/o reranking |
46 | UT-IIS | en-ja | 2017/08/01 12:03:22 | 1710 | 0.746910 | 0.746910 | 0.746910 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT 8 Ensembles with beam search, Sentence Piece, Embedding layer initialization |
47 | EHR | en-ja | 2016/08/15 11:21:51 | 1140 | 0.746720 | 0.746720 | 0.746720 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | PBSMT with preordering (DL=6) |
48 | W | en-ja | 2014/08/16 00:57:16 | 132 | 0.746570 | 0.746570 | 0.746570 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Weblio Pre-reordering SMT System Baseline |
49 | TMU | en-ja | 2017/08/01 11:56:31 | 1709 | 0.744890 | 0.744890 | 0.744890 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | baseline system with beam20 |
50 | TOSHIBA | en-ja | 2015/07/30 11:22:11 | 540 | 0.744660 | 0.744660 | 0.744660 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with RNNLM language model |
51 | naver | en-ja | 2015/08/31 14:18:18 | 837 | 0.744630 | 0.744630 | 0.744630 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | SMT t2s + Spell correction + NMT reranking |
52 | ORGANIZER | en-ja | 2014/07/11 20:03:21 | 12 | 0.744370 | 0.744370 | 0.744370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Tree-to-String SMT (2014) |
53 | ORGANIZER | en-ja | 2015/09/10 13:29:35 | 875 | 0.744370 | 0.744370 | 0.744370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Tree-to-String SMT (2015) |
54 | ORGANIZER | en-ja | 2014/09/16 13:36:35 | 367 | 0.743900 | 0.743900 | 0.743900 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Hierarchical Phrase-based SMT (2014) |
55 | Kyoto-U | en-ja | 2015/08/30 19:59:18 | 832 | 0.743710 | 0.743710 | 0.743710 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking |
56 | TOSHIBA | en-ja | 2015/07/28 16:24:37 | 524 | 0.742100 | 0.742100 | 0.742100 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system |
57 | Kyoto-U | en-ja | 2014/08/31 10:33:23 | 253 | 0.741710 | 0.741710 | 0.741710 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our new baseline system after several modifications. |
58 | Sense | en-ja | 2018/08/24 11:38:23 | 2092 | 0.741090 | 0.741090 | 0.741090 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | |
59 | Kyoto-U | en-ja | 2015/08/27 22:56:13 | 805 | 0.741050 | 0.741050 | 0.741050 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system without reranking |
60 | TMU | en-ja | 2017/08/01 11:42:03 | 1704 | 0.740620 | 0.740620 | 0.740620 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | our baseline system in 2017 |
61 | SAS_MT | en-ja | 2014/09/01 10:39:27 | 264 | 0.740580 | 0.740580 | 0.740580 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Syntactic reordering Hierarchical SMT (using part of data) |
62 | naver | en-ja | 2015/08/25 16:20:30 | 770 | 0.740210 | 0.740210 | 0.740210 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | SMT t2s + Spell correction |
63 | Kyoto-U | en-ja | 2016/08/18 03:45:51 | 1172 | 0.738700 | 0.738700 | 0.738700 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | BPE tgt/src: 52k, 2-layer LSTM, self-ensemble of 3 |
64 | Kyoto-U | en-ja | 2014/09/01 21:06:32 | 267 | 0.738680 | 0.738680 | 0.738680 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking |
65 | Kyoto-U | en-ja | 2014/08/19 10:16:02 | 134 | 0.737890 | 0.737890 | 0.737890 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our baseline system using 3M parallel sentences. |
66 | naver | en-ja | 2015/08/04 16:48:32 | 581 | 0.737590 | 0.737590 | 0.737590 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | SMT t2s |
67 | Sense | en-ja | 2015/08/28 19:22:24 | 821 | 0.737560 | 0.737560 | 0.737560 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-2015 (train1 only) |
68 | ORGANIZER | en-ja | 2014/07/11 19:49:03 | 5 | 0.736380 | 0.736380 | 0.736380 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT |
69 | Sense | en-ja | 2014/08/25 01:07:27 | 184 | 0.733360 | 0.733360 | 0.733360 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline SMT |
70 | Kyoto-U | en-ja | 2014/08/25 14:16:15 | 186 | 0.732380 | 0.732380 | 0.732380 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Using n-best parses and RNNLM. |
71 | Sense | en-ja | 2015/07/28 22:26:55 | 531 | 0.731620 | 0.731620 | 0.731620 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-2015 |
72 | UT-AKY | en-ja | 2016/08/20 12:34:43 | 1228 | 0.731440 | 0.731440 | 0.731440 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | tree-to-seq NMT model (word-based decoder) |
73 | Sense | en-ja | 2015/08/18 22:04:09 | 715 | 0.731400 | 0.731400 | 0.731400 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Passive JSTx3 |
74 | Sense | en-ja | 2015/08/18 21:52:01 | 700 | 0.729200 | 0.729200 | 0.729200 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-dictmt |
75 | ORGANIZER | en-ja | 2016/11/16 10:50:13 | 1334 | 0.727040 | 0.727040 | 0.727040 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | Yes | Online A (2016/11/14) |
76 | naver | en-ja | 2015/08/31 14:12:03 | 836 | 0.725860 | 0.725860 | 0.725860 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | NMT only |
77 | W | en-ja | 2015/08/26 16:00:46 | 786 | 0.725370 | 0.725370 | 0.725370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | NMT, LSTM Search, 5 ensembles, beam size 20, UNK replacement, System Combination with NMT score (pick top-1k results from NMT) |
78 | TOKYOMT | en-ja | 2016/08/19 23:18:46 | 1217 | 0.720810 | 0.720810 | 0.720810 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Combination of NMT and T2S |
79 | UT-AKY | en-ja | 2016/08/20 09:25:15 | 1224 | 0.708140 | 0.708140 | 0.708140 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | tree-to-seq NMT model (character-based decoder) |
80 | W | en-ja | 2015/08/28 14:30:56 | 813 | 0.705340 | 0.705340 | 0.705340 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | No | NMT, LSTM Search, Beam Size 20, Ensemble of 2 models, UNK replacement |
81 | TOKYOMT | en-ja | 2016/08/12 11:21:43 | 1131 | 0.705210 | 0.705210 | 0.705210 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | char 1, ens 2, version 1 |
82 | Osaka-U | en-ja | 2018/09/16 13:06:45 | 2470 | 0.705050 | 0.705050 | 0.705050 | - | - | - | - | - | 0.000000 | 0.000000 | SMT | No | preordering with neural network |
83 | bjtu_nlp | en-ja | 2016/08/16 11:21:07 | 1143 | 0.704340 | 0.704340 | 0.704340 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model |
84 | ORGANIZER | en-ja | 2014/07/18 11:02:25 | 34 | 0.695420 | 0.695420 | 0.695420 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2014) |
85 | ORGANIZER | en-ja | 2015/08/25 18:54:29 | 774 | 0.677200 | 0.677200 | 0.677200 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2015) |
86 | ORGANIZER | en-ja | 2016/07/26 11:31:47 | 1041 | 0.677020 | 0.677020 | 0.677020 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2016) |
87 | JAPIO | en-ja | 2016/08/17 12:50:50 | 1165 | 0.660790 | 0.660790 | 0.660790 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus + rule-based posteditor |
88 | ORGANIZER | en-ja | 2015/09/10 19:02:38 | 889 | 0.646160 | 0.646160 | 0.646160 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online B (2015) |
89 | ORGANIZER | en-ja | 2014/07/22 13:30:13 | 91 | 0.643070 | 0.643070 | 0.643070 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online B (2014) |
90 | EHR | en-ja | 2015/09/09 12:21:34 | 873 | 0.630410 | 0.630410 | 0.630410 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering; candidate selection by language model score. |
91 | ORGANIZER | en-ja | 2014/07/21 11:40:50 | 68 | 0.626940 | 0.626940 | 0.626940 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT A |
92 | ORGANIZER | en-ja | 2014/07/21 11:38:12 | 66 | 0.622930 | 0.622930 | 0.622930 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT B (2014) |
93 | ORGANIZER | en-ja | 2015/09/10 14:26:28 | 883 | 0.622930 | 0.622930 | 0.622930 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT B (2015) |
94 | EHR | en-ja | 2015/08/22 12:28:15 | 742 | 0.620500 | 0.620500 | 0.620500 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT with preordering. |
95 | EHR | en-ja | 2015/09/09 12:22:58 | 874 | 0.604090 | 0.604090 | 0.604090 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE. |
96 | ORGANIZER | en-ja | 2014/07/23 14:50:44 | 95 | 0.594380 | 0.594380 | 0.594380 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT C |
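For readers who want to work with these rankings programmatically, below is a minimal Python sketch. It assumes the rows above are saved verbatim to a pipe-delimited text file, one system per line; the filename and dictionary keys are illustrative, not part of the original data.

```python
def parse_rows(path):
    """Parse leaderboard rows into dicts; skips the header and malformed lines."""
    systems = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = [p.strip() for p in line.split("|")]
            # A full row has 18 columns: 5 identifying fields, 10 score fields,
            # then Method, Other Resources, and System Description.
            if len(parts) < 18 or not parts[0].isdigit():
                continue
            systems.append({
                "rank": int(parts[0]),
                "team": parts[1],
                "task": parts[2],
                "datetime": parts[3],
                "data_id": int(parts[4]),
                "amfm": float(parts[5]),  # the three AMFM columns repeat the same value
                "method": parts[15],
                "other_resources": parts[16] == "Yes",
                "description": parts[17],
            })
    return systems

# Example: best AMFM score per team, highest first (hypothetical filename).
rows = parse_rows("wat_aspec_enja_amfm.txt")
best = {}
for r in rows:
    if r["team"] not in best or r["amfm"] > best[r["team"]]["amfm"]:
        best[r["team"]] = r
for team, r in sorted(best.items(), key=lambda kv: -kv[1]["amfm"]):
    print(f'{team}: {r["amfm"]:.4f} ({r["method"]})')
```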