# | Team | Task | Date/Time | DataID | AMFM | AMFM | AMFM | unused | unused | unused | unused | unused | unused | unused | Method | Other Resources | System Description |
1 | ORGANIZER | JPCzh-ja | 2016/07/15 11:22:35 | 998 | 0.728520 | 0.728520 | 0.728520 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Tree-to-String SMT (2016) |
2 | EHR | JPCzh-ja | 2016/07/18 15:25:53 | 1007 | 0.745080 | 0.745080 | 0.745080 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Combination of word-based PBSMT and character-based PBSMT with DL=6. |
3 | EHR | JPCzh-ja | 2016/07/18 15:33:03 | 1009 | 0.735010 | 0.735010 | 0.735010 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | Combination of word-based PBSMT, character-based PBSMT and RBMT+PBSPE with DL=6. |
4 | ORGANIZER | JPCzh-ja | 2016/07/26 11:18:45 | 1040 | 0.693720 | 0.693720 | 0.693720 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2016) |
5 | NICT-2 | JPCzh-ja | 2016/08/04 17:34:38 | 1079 | 0.733020 | 0.733020 | 0.733020 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation |
6 | NICT-2 | JPCzh-ja | 2016/08/05 18:06:47 | 1100 | 0.739890 | 0.739890 | 0.739890 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM |
7 | bjtu_nlp | JPCzh-ja | 2016/08/09 18:44:56 | 1128 | 0.721460 | 0.721460 | 0.721460 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model |
8 | JAPIO | JPCzh-ja | 2016/08/17 11:48:56 | 1161 | 0.808090 | 0.808090 | 0.808090 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset |
9 | JAPIO | JPCzh-ja | 2016/08/18 14:15:46 | 1180 | 0.748330 | 0.748330 | 0.748330 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus |
10 | NTT | JPCzh-ja | 2016/08/19 08:26:18 | 1191 | 0.720260 | 0.720260 | 0.720260 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline PBMT (Moses) |
11 | JAPIO | JPCzh-ja | 2016/08/19 08:26:57 | 1192 | 0.751200 | 0.751200 | 0.751200 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus |
12 | NTT | JPCzh-ja | 2016/08/19 08:28:00 | 1193 | 0.730190 | 0.730190 | 0.730190 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | PBMT with pre-ordering on dependency structures |
13 | NTT | JPCzh-ja | 2016/08/19 08:53:34 | 1199 | 0.752200 | 0.752200 | 0.752200 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Baseline NMT with attention over bidirectional LSTMs (by Harvard NMT) |
14 | NTT | JPCzh-ja | 2016/08/19 08:55:20 | 1200 | 0.749270 | 0.749270 | 0.749270 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT with pre-ordering and attention over bidirectional LSTMs (pre-ordering module is the same as the PBMT submission) |
15 | Sense | JPCzh-ja | 2016/08/29 01:06:19 | 1281 | 0.718590 | 0.718590 | 0.718590 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Clustercat-C10-PBMT |
16 | Sense | JPCzh-ja | 2016/08/29 09:55:33 | 1284 | 0.715450 | 0.715450 | 0.715450 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-C10-PBMT |
17 | Sense | JPCzh-ja | 2016/08/29 23:08:29 | 1292 | 0.719330 | 0.719330 | 0.719330 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Baseline-C50-PBMT |
18 | Sense | JPCzh-ja | 2016/08/30 07:37:39 | 1294 | 0.715290 | 0.715290 | 0.715290 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Clustercat-C50-PBMT |
19 | WASUIPS | JPCzh-ja | 2016/10/12 21:04:52 | 1325 | 0.679110 | 0.679110 | 0.679110 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our baseline system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 32.29. |
20 | WASUIPS | JPCzh-ja | 2016/10/12 21:06:36 | 1326 | 0.686030 | 0.686030 | 0.686030 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Our improved system: train=100,000 tune=1,000 test=2,000 BLEU (our PC): 33.61. Using bilingual term extraction and re-tokenization for Chinese–Japanese. |
21 | JAPIO | JPCzh-ja | 2016/10/27 13:01:42 | 1329 | 0.733300 | 0.733300 | 0.733300 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT with Preordering |
22 | ORGANIZER | JPCzh-ja | 2016/11/16 11:19:58 | 1341 | 0.747240 | 0.747240 | 0.747240 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | Yes | Online A (2016/11/14) |
23 | EHR | JPCzh-ja | 2017/07/19 19:28:31 | 1408 | 0.756350 | 0.756350 | 0.756350 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | SMT reranked NMT (word based, by Moses and OpenNMT) |
24 | EHR | JPCzh-ja | 2017/07/19 19:35:03 | 1409 | 0.757130 | 0.757130 | 0.757130 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Simple NMT (character based, by OpenNMT) |
25 | EHR | JPCzh-ja | 2017/07/19 20:41:27 | 1414 | 0.761370 | 0.761370 | 0.761370 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | SMT reranked NMT (character based, by Moses and OpenNMT) |
26 | EHR | JPCzh-ja | 2017/07/19 20:45:00 | 1415 | 0.755900 | 0.755900 | 0.755900 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | Simple NMT (word based, by OpenNMT) |
27 | JAPIO | JPCzh-ja | 2017/07/25 12:22:07 | 1447 | 0.774660 | 0.774660 | 0.774660 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus |
28 | JAPIO | JPCzh-ja | 2017/07/25 18:26:52 | 1458 | 0.754970 | 0.754970 | 0.754970 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | OpenNMT(dbrnn) |
29 | u-tkb | JPCzh-ja | 2017/07/26 12:44:18 | 1468 | 0.729580 | 0.729580 | 0.729580 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT) |
30 | JAPIO | JPCzh-ja | 2017/07/26 14:09:22 | 1482 | 0.777460 | 0.777460 | 0.777460 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor |
31 | JAPIO | JPCzh-ja | 2017/07/26 14:21:18 | 1484 | 0.779420 | 0.779420 | 0.779420 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor) |
32 | EHR | JPCzh-ja | 2018/05/04 14:17:39 | 1803 | 0.623300 | 0.623300 | 0.623300 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | RBMT | Yes | RBMT system for WAT2015's submission |
33 | ORGANIZER | JPCzh-ja | 2018/08/15 18:29:31 | 1963 | 0.761820 | 0.761820 | 0.761820 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | NMT with Attention |
34 | USTC | JPCzh-ja | 2018/08/31 17:24:35 | 2206 | 0.771310 | 0.771310 | 0.771310 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | tensor2tensor, 4 model average, r2l rerank |
35 | EHR | JPCzh-ja | 2018/08/31 18:51:15 | 2210 | 0.764670 | 0.764670 | 0.764670 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | SMT reranked NMT |
36 | ryan | JPCzh-ja | 2019/07/25 22:12:26 | 2954 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Base Transformer |
37 | sarah | JPCzh-ja | 2019/07/26 11:32:28 | 2976 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer, ensemble of 4 models |
38 | KNU_Hyundai | JPCzh-ja | 2019/07/27 08:29:23 | 3153 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | Yes | Transformer(base) + *Used ASPEC corpus* with relative position, bt, multi source, r2l rerank, 5-model ensemble |
39 | goku20 | JPCzh-ja | 2020/09/21 11:54:22 | 4087 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | mBART pre-training transformer, single model |
40 | goku20 | JPCzh-ja | 2020/09/22 00:04:26 | 4105 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | mBART pre-training transformer, ensemble of 3 models |
41 | tpt_wat | JPCzh-ja | 2021/04/27 01:42:09 | 5690 | 0.886918 | 0.886918 | 0.886918 | - | - | - | - | - | - | - | NMT | No | Base Transformer model with shared vocab 8k size |
42 | Bering Lab | JPCzh-ja | 2021/05/04 06:53:17 | 6171 | 0.896405 | 0.896405 | 0.896405 | - | - | - | - | - | - | - | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
43 | sakura | JPCzh-ja | 2024/08/08 19:35:56 | 7264 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
44 | sakura | JPCzh-ja | 2024/08/09 00:24:08 | 7296 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
45 | ORGANIZER | JPCzh-ja | 2015/05/14 17:55:51 | 430 | 0.729370 | 0.729370 | 0.729370 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Hierarchical Phrase-based SMT |
46 | ORGANIZER | JPCzh-ja | 2015/05/14 17:58:14 | 431 | 0.723110 | 0.723110 | 0.723110 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT |
47 | ORGANIZER | JPCzh-ja | 2015/05/14 18:00:16 | 432 | 0.725920 | 0.725920 | 0.725920 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Tree-to-String SMT (2015) |
48 | TOSHIBA | JPCzh-ja | 2015/07/23 14:43:30 | 504 | 0.740180 | 0.740180 | 0.740180 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | Combination of phrase-based SMT and SPE systems. |
49 | TOSHIBA | JPCzh-ja | 2015/07/28 16:30:41 | 526 | 0.741990 | 0.741990 | 0.741990 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post-Editing) system |
50 | ORGANIZER | JPCzh-ja | 2015/08/14 16:52:02 | 647 | 0.693840 | 0.693840 | 0.693840 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2015) |
51 | ORGANIZER | JPCzh-ja | 2015/08/14 16:55:19 | 648 | 0.588380 | 0.588380 | 0.588380 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online B (2015) |
52 | TOSHIBA | JPCzh-ja | 2015/08/17 11:53:34 | 667 | 0.668780 | 0.668780 | 0.668780 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | Yes | RBMT |
53 | EHR | JPCzh-ja | 2015/08/17 14:05:20 | 671 | 0.721400 | 0.721400 | 0.721400 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase-based SMT with preordering; candidate selection by language model score. |
54 | NTT | JPCzh-ja | 2015/08/21 08:07:18 | 736 | 0.732450 | 0.732450 | 0.732450 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing and phrase table smoothing. |
55 | ORGANIZER | JPCzh-ja | 2015/08/25 11:42:02 | 759 | 0.557130 | 0.557130 | 0.557130 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | No | RBMT A (2015) |
56 | ORGANIZER | JPCzh-ja | 2015/08/25 11:53:50 | 760 | 0.502100 | 0.502100 | 0.502100 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | RBMT | No | RBMT B |
57 | Kyoto-U | JPCzh-ja | 2015/08/26 13:10:44 | 781 | 0.731420 | 0.731420 | 0.731420 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | Baseline w/o reranking |
58 | NTT | JPCzh-ja | 2015/08/28 09:53:24 | 811 | 0.723200 | 0.723200 | 0.723200 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | A pre-ordering-based PBMT with patent-tuned dependency parsing, learning-based pre-ordering, and phrase table smoothing. |
59 | EHR | JPCzh-ja | 2015/08/30 12:42:52 | 828 | 0.701880 | 0.701880 | 0.701880 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE |
60 | EHR | JPCzh-ja | 2015/08/30 15:22:25 | 830 | 0.706550 | 0.706550 | 0.706550 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase based SMT with preordering |
61 | WASUIPS | JPCzh-ja | 2015/09/01 14:16:16 | 853 | 0.709700 | 0.709700 | 0.709700 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | SMT | No | Combining sampling-based alignment and bilingual hierarchical sub-sentential alignment methods. |
62 | Kyoto-U | JPCzh-ja | 2015/09/02 09:25:04 | 864 | 0.744190 | 0.744190 | 0.744190 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking (character-based model only) |
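The rows above follow a fixed 18-column, pipe-delimited layout (rank, team, task, date/time, DataID, three AMFM variants, seven unused score columns, method, other-resources flag, description). A minimal parsing sketch, assuming that layout — `parse_row` and the two embedded sample rows are illustrative, not part of the original data dump:

```python
# Sketch: parse pipe-delimited leaderboard rows and sort by AMFM score.
# Column indices assume the 18-column layout reconstructed in the header row.

rows = [
    "8 | JAPIO | JPCzh-ja | 2016/08/17 11:48:56 | 1161 | 0.808090 | 0.808090 | 0.808090 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset",
    "7 | bjtu_nlp | JPCzh-ja | 2016/08/09 18:44:56 | 1128 | 0.721460 | 0.721460 | 0.721460 | - | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model",
]

def parse_row(line: str) -> dict:
    # Strip any leading/trailing pipe, then split into the 18 cells.
    fields = [f.strip() for f in line.strip().strip("|").split("|")]
    return {
        "rank": int(fields[0]),          # column 1: submission number
        "team": fields[1],               # column 2: team name
        "task": fields[2],               # column 3: task ID
        "amfm": float(fields[5]),        # column 6: first AMFM variant
        "method": fields[15],            # column 16: SMT / NMT / RBMT / ...
        "description": fields[17],       # column 18: system description
    }

# Highest AMFM first.
parsed = sorted((parse_row(r) for r in rows), key=lambda d: -d["amfm"])
for p in parsed:
    print(f'{p["team"]:10s} {p["amfm"]:.6f} {p["method"]}')
```

Rows with no AMFM evaluation (the 0.000000 entries from 2019 onward) would need a guard before the `float` conversion if they are to be filtered rather than sorted to the bottom.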