| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|---------------------|
| 1 | JAPIO | JPCja-en | 2017/07/28 22:22:15 | 1574 | 0.724710 | NMT | Yes | Combination of 3 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor) |
| 2 | ORGANIZER | JPCja-en | 2016/11/16 11:06:50 | 1338 | 0.722590 | NMT | Yes | Online A (2016/11/14) |
| 3 | JAPIO | JPCja-en | 2017/07/29 10:49:01 | 1578 | 0.715560 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor |
| 4 | JAPIO | JPCja-en | 2017/07/25 18:17:30 | 1455 | 0.699930 | NMT | No | OpenNMT(dbrnn) |
| 5 | ORGANIZER | JPCja-en | 2018/08/15 18:38:51 | 1965 | 0.699460 | NMT | No | NMT with Attention |
| 6 | u-tkb | JPCja-en | 2017/07/26 12:53:50 | 1472 | 0.697290 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT) |
| 7 | bjtu_nlp | JPCja-en | 2016/08/16 12:34:36 | 1149 | 0.690750 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model |
| 8 | CUNI | JPCja-en | 2017/07/31 22:34:51 | 1666 | 0.681520 | SMT | No | Bahdanau (2014) seq2seq with conditional GRU on byte-pair encoding |
| 9 | ORGANIZER | JPCja-en | 2016/07/26 10:15:25 | 1035 | 0.673950 | Other | Yes | Online A (2016) |
| 10 | NICT-2 | JPCja-en | 2016/08/05 17:58:40 | 1103 | 0.672890 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM |
| 11 | ORGANIZER | JPCja-en | 2016/07/13 17:12:09 | 980 | 0.672760 | SMT | No | String-to-Tree SMT |
| 12 | ORGANIZER | JPCja-en | 2016/07/13 17:00:31 | 979 | 0.672500 | SMT | No | Hierarchical Phrase-based SMT |
| 13 | Kyoto-U | JPCja-en | 2016/07/27 17:15:10 | 1057 | 0.672040 | EBMT | No | KyotoEBMT 2016 w/o reranking |
| 14 | NICT-2 | JPCja-en | 2016/08/04 17:26:27 | 1080 | 0.667540 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation |
| 15 | ORGANIZER | JPCja-en | 2016/07/13 16:54:09 | 977 | 0.664830 | SMT | No | Phrase-based SMT |
| 16 | Bering Lab | JPCja-en | 2021/04/23 13:05:24 | 5419 | 0.576681 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 17 | tpt_wat | JPCja-en | 2021/04/27 02:32:27 | 5710 | 0.569295 | NMT | No | Base Transformer model with separate vocab, size 8k |
| 18 | ORGANIZER | JPCja-en | 2016/08/05 15:18:46 | 1090 | 0.521230 | Other | Yes | RBMT A (2016) |
| 19 | ORGANIZER | JPCja-en | 2016/08/05 14:51:27 | 1088 | 0.519210 | Other | Yes | RBMT C (2016) |
| 20 | ORGANIZER | JPCja-en | 2016/08/05 15:59:14 | 1095 | 0.518110 | Other | Yes | RBMT B (2016) |
| 21 | ORGANIZER | JPCja-en | 2016/07/26 13:43:21 | 1051 | 0.486450 | Other | Yes | Online B (2016) |
| 22 | sarah | JPCja-en | 2019/07/26 11:18:48 | 2969 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 23 | KNU_Hyundai | JPCja-en | 2019/07/27 12:27:32 | 3189 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble |
| 24 | goku20 | JPCja-en | 2020/09/21 12:15:47 | 4094 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 25 | goku20 | JPCja-en | 2020/09/22 00:08:48 | 4108 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 26 | sakura | JPCja-en | 2024/08/08 19:55:46 | 7281 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 27 | sakura | JPCja-en | 2024/08/09 00:16:39 | 7293 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
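
The AMFM column reports an AM-FM score (Banchs et al.), which combines an adequacy component (AM, a cross-lingual semantic-similarity score) with a fluency component (FM, a target-side language-model score) into a single value on a 0-1 scale; the 0.000000 entries in later years most likely mark submissions for which the metric was not computed rather than genuinely zero-scoring systems. The sketch below shows only the combination step, assuming a weighted arithmetic mean with an illustrative weight `alpha`; the exact combination rule and weight used for this leaderboard may differ.

```python
def am_fm_score(am: float, fm: float, alpha: float = 0.5) -> float:
    """Combine an adequacy score (AM) and a fluency score (FM), both
    assumed to lie in [0, 1], into a single AM-FM score.

    NOTE: the weighted arithmetic mean and the default alpha=0.5 are
    illustrative assumptions, not the exact WAT configuration.
    """
    if not (0.0 <= am <= 1.0 and 0.0 <= fm <= 1.0):
        raise ValueError("AM and FM must lie in [0, 1]")
    if not (0.0 <= alpha <= 1.0):
        raise ValueError("alpha must lie in [0, 1]")
    return alpha * am + (1.0 - alpha) * fm


# Example: AM = 0.80 and FM = 0.65 combine to ~0.725, the same scale
# as the table above (e.g. 0.724710 for the rank-1 JAPIO system).
print(am_fm_score(0.80, 0.65))  # ~0.725
```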