# | Team | Task | Date/Time | DataID | AMFM | AMFM | AMFM | unuse | unuse | unuse | unuse | unuse | unuse | unuse | Method | Other Resources | System Description
1 | Kyoto-U+ECNU | zh-ja | 2020/09/17 18:41:34 | 3813 | 0.823390 | 0.823390 | 0.823390 | - | - | - | - | - | - | - | NMT | Yes | ensemble 9 models: structures(LSTM, Transformer, ConvS2S, Lightconv), training data(BT, out-of-domain parallel), S2S settings(deeper transformer, deep encoder shallow decoder) |
2 | Kyoto-U+ECNU | zh-ja | 2020/09/18 17:38:55 | 3933 | 0.821660 | 0.821660 | 0.821660 | - | - | - | - | - | - | - | NMT | No | without out-of-domain parallel data; others same as DataID:3813 |
3 | srcb | zh-ja | 2019/07/27 15:48:24 | 3210 | 0.819020 | 0.819020 | 0.819020 | - | - | - | - | - | - | - | NMT | No | Transformer(Big) with relative position, sentence-wise smooth, deep transformer, back translation, ensemble of 7 models. |
4 | Kyoto-U+ECNU | zh-ja | 2020/09/10 23:58:13 | 3677 | 0.817610 | 0.817610 | 0.817610 | - | - | - | - | - | - | - | NMT | No | back-translation by using ja monolingual data from ASPEC-JE; lightconv (pay less attention) single model without ensemble |
5 | srcb | zh-ja | 2019/07/25 11:37:44 | 2917 | 0.812450 | 0.812450 | 0.812450 | - | - | - | - | - | - | - | NMT | No | Transformer (Big) with relative position, layer attention, sentence-wise smooth. |
6 | KNU_Hyundai | zh-ja | 2019/07/27 10:30:04 | 3179 | 0.809990 | 0.809990 | 0.809990 | - | - | - | - | - | - | - | NMT | No | Transformer(base) + *Used ASPEC ja-en corpus* with relative position, bt, multi source, r2l rerank, 6-model ensemble |
7 | NICT-5 | zh-ja | 2018/08/27 14:40:35 | 2169 | 0.805750 | 0.805750 | 0.805750 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Combining En-Ja corpus with Zh-Ja as a multilingual model. *ADDITIONAL ASPEC CORPUS USED* |
8 | NICT-5 | zh-ja | 2018/09/10 14:14:05 | 2267 | 0.804920 | 0.804920 | 0.804920 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | MLNMT |
9 | NICT-5 | zh-ja | 2018/08/22 18:51:44 | 2052 | 0.800670 | 0.800670 | 0.800670 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Mixed fine tuning by first pretraining on En-Ja ASPEC data and then continue on the En-Ja+Zh-Ja data. Transformer. |
10 | Kyoto-U | zh-ja | 2017/08/01 14:14:49 | 1720 | 0.799840 | 0.799840 | 0.799840 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of 7, shared BPE, averaged |
11 | NICT-2 | zh-ja | 2017/07/26 14:08:45 | 1481 | 0.799680 | 0.799680 | 0.799680 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking |
12 | Kyoto-U | zh-ja | 2017/07/29 08:02:07 | 1577 | 0.799520 | 0.799520 | 0.799520 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of 5, shared BPE 40k |
13 | ORGANIZER | zh-ja | 2017/08/02 09:59:33 | 1740 | 0.798110 | 0.798110 | 0.798110 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Google's "Attention Is All You Need" |
14 | Kyoto-U | zh-ja | 2017/07/31 15:27:21 | 1643 | 0.793410 | 0.793410 | 0.793410 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble |
15 | NICT-2 | zh-ja | 2017/07/26 13:58:44 | 1477 | 0.788940 | 0.788940 | 0.788940 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder |
16 | Kyoto-U | zh-ja | 2016/10/11 10:46:03 | 1324 | 0.787930 | 0.787930 | 0.787930 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | voc: 32k, ensemble of 4 independent models + Chinese short unit |
17 | Kyoto-U | zh-ja | 2016/08/20 22:50:33 | 1256 | 0.785910 | 0.785910 | 0.785910 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | voc: 30k, ensemble of 3 independent models + reverse rescoring |
18 | Kyoto-U | zh-ja | 2016/08/20 22:48:16 | 1255 | 0.784380 | 0.784380 | 0.784380 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | src: 200k, tgt: 50k, 2 layers, self-ensembling |
19 | ORGANIZER | zh-ja | 2018/08/14 11:33:03 | 1902 | 0.782100 | 0.782100 | 0.782100 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | NMT with Attention |
20 | NAIST | zh-ja | 2015/08/31 08:23:30 | 834 | 0.771010 | 0.771010 | 0.771010 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System with NeuralMT Reranking |
21 | Kyoto-U | zh-ja | 2015/08/31 22:39:36 | 845 | 0.769700 | 0.769700 | 0.769700 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking |
22 | EHR | zh-ja | 2016/07/31 17:06:57 | 1063 | 0.769490 | 0.769490 | 0.769490 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | LM-based merging of outputs of preordered word-based PBSMT(DL=6) and preordered character-based PBSMT(DL=6). |
23 | NICT-2 | zh-ja | 2016/08/05 18:05:03 | 1099 | 0.768580 | 0.768580 | 0.768580 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM |
24 | NAIST | zh-ja | 2014/07/31 11:42:31 | 120 | 0.768190 | 0.768190 | 0.768190 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar-based Forest-to-String SMT System |
25 | Kyoto-U | zh-ja | 2016/08/07 18:28:23 | 1110 | 0.767120 | 0.767120 | 0.767120 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT 2016 w/o reranking |
26 | NAIST | zh-ja | 2014/08/01 17:33:01 | 124 | 0.766270 | 0.766270 | 0.766270 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES) |
27 | SAS_MT | zh-ja | 2014/09/01 10:38:13 | 263 | 0.765730 | 0.765730 | 0.765730 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Syntactic reordering Hierarchical SMT (using SAS token tool) |
28 | UT-KAY | zh-ja | 2016/08/20 07:12:52 | 1221 | 0.765530 | 0.765530 | 0.765530 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Ensemble of our NMT models with and without domain adaptation |
29 | EHR | zh-ja | 2015/08/19 11:23:36 | 720 | 0.765050 | 0.765050 | 0.765050 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering; candidate selection by language model score. |
30 | NAIST | zh-ja | 2015/08/31 08:26:31 | 835 | 0.764830 | 0.764830 | 0.764830 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Travatar System Baseline |
31 | Kyoto-U | zh-ja | 2015/08/07 13:24:55 | 597 | 0.762430 | 0.762430 | 0.762430 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Updated JUMAN and added one reordering feature, w/ reranking |
32 | Kyoto-U | zh-ja | 2015/07/17 09:04:22 | 491 | 0.762180 | 0.762180 | 0.762180 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | WAT2015 baseline with reranking |
33 | Kyoto-U | zh-ja | 2015/08/31 22:38:22 | 844 | 0.761960 | 0.761960 | 0.761960 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system without reranking |
34 | bjtu_nlp | zh-ja | 2016/08/12 12:50:38 | 1138 | 0.760840 | 0.760840 | 0.760840 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model |
35 | TOSHIBA | zh-ja | 2015/07/28 16:27:32 | 525 | 0.758110 | 0.758110 | 0.758110 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with SPE(Statistical Post Editing) system |
36 | Kyoto-U | zh-ja | 2014/09/01 21:33:23 | 268 | 0.757610 | 0.757610 | 0.757610 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking |
37 | Kyoto-U | zh-ja | 2015/07/17 09:01:42 | 490 | 0.757070 | 0.757070 | 0.757070 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | WAT2015 baseline |
38 | ORGANIZER | zh-ja | 2014/07/11 20:04:10 | 13 | 0.754870 | 0.754870 | 0.754870 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Tree-to-String SMT (2014) |
39 | ORGANIZER | zh-ja | 2015/09/10 14:00:33 | 879 | 0.754870 | 0.754870 | 0.754870 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Tree-to-String SMT (2015) |
40 | EHR | zh-ja | 2015/09/04 11:44:26 | 868 | 0.754180 | 0.754180 | 0.754180 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE. |
41 | UT-KAY | zh-ja | 2016/08/20 07:09:54 | 1220 | 0.753820 | 0.753820 | 0.753820 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | An end-to-end NMT with 512 dimensional single-layer LSTMs, UNK replacement, and domain adaptation |
42 | WASUIPS | zh-ja | 2014/09/17 10:24:50 | 383 | 0.753750 | 0.753750 | 0.753750 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1). |
43 | WASUIPS | zh-ja | 2014/09/17 10:26:43 | 384 | 0.753690 | 0.753690 | 0.753690 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1). |
44 | ORGANIZER | zh-ja | 2014/07/11 19:54:58 | 8 | 0.753010 | 0.753010 | 0.753010 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT |
45 | Sense | zh-ja | 2014/08/26 15:17:49 | 200 | 0.752890 | 0.752890 | 0.752890 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Character based SMT |
46 | TOSHIBA | zh-ja | 2015/07/23 15:14:53 | 508 | 0.752830 | 0.752830 | 0.752830 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | System combination SMT and RBMT(SPE) with RNNLM language model |
47 | SAS_MT | zh-ja | 2014/08/29 15:33:07 | 232 | 0.752170 | 0.752170 | 0.752170 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Syntactic reordering phrase-based SMT (SAS token tool) |
48 | ORGANIZER | zh-ja | 2014/07/11 19:47:27 | 4 | 0.750950 | 0.750950 | 0.750950 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Hierarchical Phrase-based SMT (2014) |
49 | Kyoto-U | zh-ja | 2014/08/31 23:42:41 | 258 | 0.750370 | 0.750370 | 0.750370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our new baseline system after several modifications. |
50 | Kyoto-U | zh-ja | 2014/08/19 09:31:08 | 133 | 0.750310 | 0.750310 | 0.750310 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Using n-best parses and RNNLM. |
51 | Kyoto-U | zh-ja | 2014/08/19 10:21:37 | 135 | 0.748200 | 0.748200 | 0.748200 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Our baseline system. |
52 | BJTUNLP | zh-ja | 2015/08/25 14:55:20 | 769 | 0.744130 | 0.744130 | 0.744130 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | |
53 | BJTUNLP | zh-ja | 2015/09/01 21:08:10 | 862 | 0.744130 | 0.744130 | 0.744130 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | a dependency-to-string model for SMT |
54 | WASUIPS | zh-ja | 2014/09/17 12:00:46 | 388 | 0.744040 | 0.744040 | 0.744040 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1). |
55 | WASUIPS | zh-ja | 2014/09/17 11:03:46 | 387 | 0.743140 | 0.743140 | 0.743140 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1). |
56 | WASUIPS | zh-ja | 2014/09/17 01:03:57 | 374 | 0.740650 | 0.740650 | 0.740650 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1). |
57 | WASUIPS | zh-ja | 2014/09/17 01:05:38 | 375 | 0.740640 | 0.740640 | 0.740640 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1). |
58 | WASUIPS | zh-ja | 2014/09/17 00:46:07 | 370 | 0.734620 | 0.734620 | 0.734620 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0). |
59 | Sense | zh-ja | 2015/07/29 07:20:20 | 533 | 0.733190 | 0.733190 | 0.733190 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-2015 |
60 | WASUIPS | zh-ja | 2014/09/17 10:07:44 | 379 | 0.725360 | 0.725360 | 0.725360 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0). |
61 | WASUIPS | zh-ja | 2014/09/17 10:10:47 | 380 | 0.725250 | 0.725250 | 0.725250 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0). |
62 | WASUIPS | zh-ja | 2014/09/17 00:43:38 | 369 | 0.711650 | 0.711650 | 0.711650 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0). |
63 | EHR | zh-ja | 2015/09/02 17:00:16 | 867 | 0.707310 | 0.707310 | 0.707310 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT with preordering. |
64 | JAPIO | zh-ja | 2016/08/19 16:44:49 | 1208 | 0.696770 | 0.696770 | 0.696770 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus + rule-based posteditor |
65 | EIWA | zh-ja | 2014/08/20 11:56:00 | 138 | 0.693330 | 0.693330 | 0.693330 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE(statistical post editing) |
66 | ORGANIZER | zh-ja | 2016/11/16 11:28:00 | 1342 | 0.692820 | 0.692820 | 0.692820 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | Yes | Online A (2016/11/14) |
67 | ORGANIZER | zh-ja | 2016/07/26 11:54:14 | 1043 | 0.659540 | 0.659540 | 0.659540 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2016) |
68 | ORGANIZER | zh-ja | 2014/07/18 11:09:12 | 36 | 0.658060 | 0.658060 | 0.658060 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2014) |
69 | TOSHIBA | zh-ja | 2015/08/17 12:11:52 | 669 | 0.654080 | 0.654080 | 0.654080 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | Yes | RBMT |
70 | ORGANIZER | zh-ja | 2015/08/25 18:58:08 | 776 | 0.649860 | 0.649860 | 0.649860 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2015) |
71 | ORGANIZER | zh-ja | 2014/08/28 12:10:13 | 215 | 0.636930 | 0.636930 | 0.636930 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online B (2014) |
72 | ORGANIZER | zh-ja | 2015/09/11 10:09:23 | 890 | 0.628290 | 0.628290 | 0.628290 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online B (2015) |
73 | ORGANIZER | zh-ja | 2014/08/29 18:45:03 | 239 | 0.626070 | 0.626070 | 0.626070 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | No | RBMT A (2014) |
74 | ORGANIZER | zh-ja | 2015/09/10 14:30:56 | 885 | 0.626070 | 0.626070 | 0.626070 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | RBMT A (2015) |
75 | EIWA | zh-ja | 2014/08/20 11:52:45 | 137 | 0.613730 | 0.613730 | 0.613730 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | Yes | RBMT plus user dictionary |
76 | ORGANIZER | zh-ja | 2014/08/29 18:48:29 | 242 | 0.586790 | 0.586790 | 0.586790 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | No | RBMT D |
77 | TMU | zh-ja | 2018/09/14 17:30:33 | 2343 | 0.512430 | 0.512430 | 0.512430 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | Unsupervised NMT with sub-character information. Both ASPEC and JPC 4.0 data (zh-ja) were also used as monolingual data in the training. |