| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | srcb | ja-zh | 2018/09/16 14:47:09 | 2473 | 0.791120 | NMT | No | Transformer with relative position, ensemble of 10 models (see the ensemble-decoding sketch after this table). |
| 2 | Kyoto-U+ECNU | ja-zh | 2020/09/17 18:43:01 | 3814 | 0.787730 | NMT | Yes | Ensemble of 8 models varying in architecture (LSTM, Transformer, ConvS2S, LightConv), training data (back-translation, out-of-domain parallel), and seq2seq settings (deeper Transformer, deep encoder with shallow decoder). |
| 3 | srcb | ja-zh | 2018/08/26 11:37:12 | 2153 | 0.787570 | NMT | No | Transformer, averaged checkpoints (see the checkpoint-averaging sketch after this table). |
| 4 | ORGANIZER | ja-zh | 2017/08/02 01:06:05 | 1738 | 0.787250 | NMT | No | Google's "Attention Is All You Need". |
| 5 | srcb | ja-zh | 2019/07/27 15:34:31 | 3208 | 0.787220 | NMT | No | Transformer with num_units=768, relative position, sentence-wise smoothing, encoder-side word dropout, norm-based batch filtering, residual connection norm, ensemble of 8 models. |
| 6 | Kyoto-U+ECNU | ja-zh | 2020/09/19 16:56:52 | 4053 | 0.786870 | NMT | No | Without out-of-domain parallel data; otherwise the same as DataID 3814. |
| 7 | srcb | ja-zh | 2019/07/25 11:30:58 | 2916 | 0.786600 | NMT | No | Transformer with relative position, sentence-wise smoothing, encoder-side word dropout. |
| 8 | NICT-2 | ja-zh | 2017/07/26 14:11:42 | 1483 | 0.785820 | NMT | No | Ensemble of 6 NMT models with bidirectional reranking. |
| 9 | NICT-5 | ja-zh | 2018/08/27 15:00:25 | 2175 | 0.785440 | NMT | No | Vanilla Transformer model. |
| 10 | Kyoto-U | ja-zh | 2017/08/01 14:17:43 | 1722 | 0.785420 | NMT | No | Ensemble of 5 models, shared BPE, averaged. |
| 11 | NICT-5 | ja-zh | 2018/08/22 18:56:02 | 2055 | 0.784340 | NMT | No | Multi-layer softmax for the vanilla Transformer: train a 6-layer model but decode using only 3 layers, 2x faster than using all 6. |
| 12 | Kyoto-U+ECNU | ja-zh | 2020/09/10 23:54:57 | 3676 | 0.783970 | NMT | No | Forward-translation using Japanese monolingual data from ASPEC-JE; LightConv ("Pay Less Attention") single model without ensemble. |
| 13 | NICT-5 | ja-zh | 2018/09/10 14:09:18 | 2266 | 0.781410 | NMT | No | MLNMT. |
| 14 | KNU_Hyundai | ja-zh | 2019/07/27 08:54:03 | 3170 | 0.781350 | NMT | Yes | Transformer (base) using the JPC corpus, with relative position, back-translation, right-to-left reranking, and a 4-model ensemble. |
| 15 | NICT-2 | ja-zh | 2017/07/26 14:00:26 | 1478 | 0.779870 | NMT | No | Single NMT model: BPE 50k, Bi-LSTM (500x2) encoder, LSTM (1000) left-to-right decoder. |
| 16 | Kyoto-U | ja-zh | 2017/07/31 15:24:48 | 1642 | 0.779400 | NMT | No | KW replacement without KW in the test set, BPE, ensemble of 6. |
| 17 | ORGANIZER | ja-zh | 2018/08/14 11:39:11 | 1903 | 0.777600 | NMT | No | NMT with attention. |
| 18 | Kyoto-U | ja-zh | 2015/08/27 13:51:08 | 793 | 0.768470 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking. |
| 19 | Kyoto-U | ja-zh | 2015/08/26 02:17:25 | 778 | 0.765440 | EBMT | No | KyotoEBMT system without reranking. |
| 20 | Kyoto-U | ja-zh | 2015/07/03 11:01:45 | 457 | 0.765320 | EBMT | No | Kyoto-U team WAT2015 baseline with reranking. |
| 21 | Kyoto-U | ja-zh | 2015/07/03 11:09:12 | 458 | 0.764530 | EBMT | No | Kyoto-U team WAT2015 baseline. |
| 22 | Kyoto-U | ja-zh | 2016/08/05 23:26:20 | 1109 | 0.764230 | EBMT | No | KyotoEBMT 2016 without reranking. |
| 23 | NAIST | ja-zh | 2015/08/31 15:35:36 | 838 | 0.763390 | SMT | No | Travatar system with NeuralMT reranking. |
| 24 | Kyoto-U | ja-zh | 2016/08/02 01:25:11 | 1071 | 0.763290 | NMT | No | 2-layer LSTM, dropout 0.5, 200k source vocabulary, unk replacement. |
| 25 | Kyoto-U | ja-zh | 2015/07/31 00:35:46 | 545 | 0.763020 | EBMT | No | Added one reordering feature, with reranking. |
| 26 | TOSHIBA | ja-zh | 2015/08/17 16:29:35 | 676 | 0.762520 | SMT and RBMT | Yes | System combination of SMT and RBMT (SPE) with an RNNLM language model + post-processing. |
| 27 | TOSHIBA | ja-zh | 2015/07/23 14:49:40 | 505 | 0.762060 | SMT and RBMT | Yes | SPE (Statistical Post-Editing) system. |
| 28 | NAIST | ja-zh | 2014/08/01 17:18:51 | 122 | 0.759740 | SMT | No | Travatar-based forest-to-string SMT system. |
| 29 | NICT-2 | ja-zh | 2016/08/05 18:09:19 | 1105 | 0.759670 | SMT | Yes | Phrase-based SMT with preordering + domain adaptation (JPC and ASPEC). |
| 30 | NAIST | ja-zh | 2014/08/01 17:27:20 | 123 | 0.758480 | SMT | No | Travatar-based forest-to-string SMT system (tuned on BLEU+RIBES). |
| 31 | Kyoto-U | ja-zh | 2015/08/25 12:51:38 | 765 | 0.757440 | EBMT | No | Escaping, with reranking. |
| 32 | NAIST | ja-zh | 2015/08/31 15:38:17 | 839 | 0.756990 | SMT | No | Travatar system baseline. |
| 33 | ORGANIZER | ja-zh | 2014/07/11 20:00:28 | 10 | 0.755230 | SMT | No | String-to-tree SMT (2014). |
| 34 | ORGANIZER | ja-zh | 2015/09/10 14:12:41 | 881 | 0.755230 | SMT | No | String-to-tree SMT (2015). |
| 35 | bjtu_nlp | ja-zh | 2016/08/09 14:48:19 | 1120 | 0.754690 | NMT | No | RNN encoder-decoder with attention mechanism, single model. |
| 36 | Kyoto-U | ja-zh | 2014/08/31 23:38:07 | 257 | 0.754050 | EBMT | No | Our new baseline system after several modifications. |
| 37 | Kyoto-U | ja-zh | 2014/09/01 08:21:59 | 259 | 0.751740 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking. |
| 38 | WASUIPS | ja-zh | 2014/09/17 01:08:33 | 376 | 0.750240 | SMT | No | Our baseline system (segmentation tools: Urheen and MeCab; Moses 2.1.1). |
| 39 | WASUIPS | ja-zh | 2014/09/17 01:11:02 | 377 | 0.750220 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: Urheen and MeCab; Moses 2.1.1). |
| 40 | WASUIPS | ja-zh | 2014/09/17 10:29:24 | 385 | 0.749470 | SMT | No | Our baseline system (segmentation tool: KyTea; Moses 2.1.1). |
| 41 | ORGANIZER | ja-zh | 2014/07/11 19:50:50 | 7 | 0.749450 | SMT | No | Phrase-based SMT. |
| 42 | WASUIPS | ja-zh | 2014/09/17 10:32:13 | 386 | 0.748360 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tool: KyTea; Moses 2.1.1). |
| 43 | WASUIPS | ja-zh | 2014/09/17 12:07:07 | 390 | 0.747890 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: Stanford-CTB and Juman; Moses 2.1.1). |
| 44 | Kyoto-U | ja-zh | 2014/07/14 14:30:39 | 18 | 0.747090 | EBMT | No | Our baseline system. |
| 45 | Sense | ja-zh | 2014/08/26 15:19:02 | 201 | 0.746750 | SMT | No | Character-based SMT. |
| 46 | TOSHIBA | ja-zh | 2014/08/29 18:06:20 | 238 | 0.746000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system. |
| 47 | NICT | ja-zh | 2014/09/01 09:23:36 | 260 | 0.745980 | SMT | No | Pre-reordering for phrase-based SMT (dependency parsing + manual rules). |
| 48 | ORGANIZER | ja-zh | 2014/07/11 19:45:54 | 3 | 0.745100 | SMT | No | Hierarchical phrase-based SMT (2014). |
| 49 | WASUIPS | ja-zh | 2014/09/17 00:54:35 | 373 | 0.744150 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: Urheen and MeCab; Moses 1.0). |
| 50 | WASUIPS | ja-zh | 2014/09/17 12:04:30 | 389 | 0.741490 | SMT | No | Our baseline system (segmentation tools: Stanford-CTB and Juman; Moses 2.1.1). |
| 51 | WASUIPS | ja-zh | 2014/09/17 00:47:46 | 371 | 0.728650 | SMT | No | Our baseline system (segmentation tools: Urheen and MeCab; Moses 1.0). |
| 52 | WASUIPS | ja-zh | 2014/09/17 10:15:13 | 381 | 0.727920 | SMT | No | Our baseline system (segmentation tool: KyTea; Moses 1.0). |
| 53 | BJTUNLP | ja-zh | 2014/08/28 20:02:56 | 224 | 0.727700 | SMT | No | |
| 54 | WASUIPS | ja-zh | 2014/09/17 10:17:52 | 382 | 0.725500 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tool: KyTea; Moses 1.0). |
| 55 | TMU | ja-zh | 2017/08/03 01:02:47 | 1743 | 0.700030 | NMT | No | JP-CN reconstructor baseline. |
| 56 | TOSHIBA | ja-zh | 2014/08/29 17:59:06 | 236 | 0.685380 | RBMT | Yes | RBMT system. |
| 57 | ORGANIZER | ja-zh | 2016/11/16 10:58:30 | 1336 | 0.673730 | NMT | Yes | Online D (2016/11/14). |
| 58 | ORGANIZER | ja-zh | 2014/08/29 18:51:05 | 243 | 0.667960 | RBMT | No | RBMT B (2014). |
| 59 | ORGANIZER | ja-zh | 2015/09/10 14:32:38 | 886 | 0.667960 | Other | Yes | RBMT B (2015). |
| 60 | ORGANIZER | ja-zh | 2016/07/26 12:18:34 | 1045 | 0.639440 | Other | Yes | Online D (2016). |
| 61 | ORGANIZER | ja-zh | 2015/08/25 18:59:20 | 777 | 0.634090 | Other | Yes | Online D (2015). |
| 62 | ORGANIZER | ja-zh | 2014/07/18 11:10:37 | 37 | 0.625430 | Other | Yes | Online D (2014). |
| 63 | ORGANIZER | ja-zh | 2014/08/29 18:53:46 | 244 | 0.594900 | RBMT | No | RBMT C. |
| 64 | ORGANIZER | ja-zh | 2014/08/28 12:11:11 | 216 | 0.587820 | Other | Yes | Online C (2014). |
| 65 | ORGANIZER | ja-zh | 2015/09/11 10:11:23 | 891 | 0.566060 | Other | Yes | Online C (2015). |
| 66 | TMU | ja-zh | 2018/09/19 10:58:57 | 2505 | 0.545630 | NMT | Yes | Unsupervised NMT using sub-character-level information; JPO patent data used as monolingual data during training. |
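
Several of the strongest entries above rely on checkpoint averaging (e.g. DataID 2153, "averaged checkpoints", and DataID 1722, "averaged"). The submissions do not publish their scripts, so the following is only a minimal sketch of the technique, assuming PyTorch-style state dicts; the function name and file paths are hypothetical, not taken from any team's code.

```python
import torch

def average_checkpoints(paths):
    """Element-wise average of model parameters over several checkpoints.

    Assumes every file holds a state dict with identical keys, e.g. the
    last N checkpoints of one training run. (Hypothetical helper, not
    taken from any submission above.)
    """
    total = None
    for path in paths:
        state = torch.load(path, map_location="cpu")
        if total is None:
            total = {k: v.clone().float() for k, v in state.items()}
        else:
            for k, v in state.items():
                total[k] += v.float()
    return {k: v / len(paths) for k, v in total.items()}

# Hypothetical usage: average the last five checkpoints, then reload.
# model.load_state_dict(average_checkpoints(
#     [f"checkpoints/ckpt_{i}.pt" for i in range(45, 50)]))
```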
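
Many rows also report ensembles of 4 to 10 models. A common way to realize this is to average the models' per-step output distributions during search; the sketch below shows the greedy case under that assumption. `step()` is an assumed per-model interface (source and decoded prefix in, next-token logits out), not an API of any toolkit used by these teams.

```python
import torch

@torch.no_grad()
def ensemble_greedy_decode(models, src, bos_id, eos_id, max_len=128):
    """Greedy decoding with the next-token distribution averaged
    over an ensemble of seq2seq models.

    Each model is assumed to expose step(src, prefix) -> 1-D tensor of
    logits over the target vocabulary (a hypothetical interface).
    """
    prefix = [bos_id]
    for _ in range(max_len):
        # Average the ensemble's predictions in probability space.
        probs = torch.stack(
            [m.step(src, prefix).softmax(dim=-1) for m in models]
        ).mean(dim=0)
        next_id = int(probs.argmax())
        prefix.append(next_id)
        if next_id == eos_id:
            break
    return prefix
```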