| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | ORGANIZER | JPCja-zh | 2016/07/13 15:11:50 | 966 | 0.710940 | SMT | No | Phrase-based SMT |
| 2 | ORGANIZER | JPCja-zh | 2016/07/13 15:16:28 | 967 | 0.718360 | SMT | No | Hierarchical Phrase-based SMT |
| 3 | ORGANIZER | JPCja-zh | 2016/07/13 15:31:58 | 968 | 0.720030 | SMT | No | String-to-Tree SMT |
| 4 | ORGANIZER | JPCja-zh | 2016/07/26 10:35:57 | 1038 | 0.702350 | Other | Yes | Online A (2016) |
| 5 | ORGANIZER | JPCja-zh | 2016/08/01 18:33:20 | 1069 | 0.527180 | Other | Yes | Online B (2016) |
| 6 | NICT-2 | JPCja-zh | 2016/08/04 17:37:25 | 1081 | 0.723270 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation |
| 7 | NICT-2 | JPCja-zh | 2016/08/05 18:13:16 | 1106 | 0.731520 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) |
| 8 | ORGANIZER | JPCja-zh | 2016/08/08 17:57:51 | 1118 | 0.475430 | Other | Yes | RBMT C (2016) |
| 9 | bjtu_nlp | JPCja-zh | 2016/08/16 12:45:01 | 1150 | 0.701490 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model |
| 10 | Sense | JPCja-zh | 2016/08/29 01:08:54 | 1282 | 0.709390 | SMT | No | Clustercat-C10-PBMT |
| 11 | Sense | JPCja-zh | 2016/08/29 09:52:57 | 1283 | 0.707590 | SMT | No | Baseline-C10-PBMT |
| 12 | Sense | JPCja-zh | 2016/08/29 23:10:28 | 1293 | 0.710030 | SMT | No | Baseline-C50-PBMT |
| 13 | Sense | JPCja-zh | 2016/08/30 08:15:19 | 1295 | 0.710150 | SMT | No | Clustercat-C50-PBMT |
| 14 | ORGANIZER | JPCja-zh | 2016/11/16 11:16:25 | 1340 | 0.735470 | NMT | Yes | Online A (2016/11/14) |
| 15 | u-tkb | JPCja-zh | 2017/07/26 12:33:41 | 1465 | 0.706720 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT) |
| 16 | ORGANIZER | JPCja-zh | 2018/08/15 18:14:01 | 1960 | 0.752360 | NMT | No | NMT with Attention |
| 17 | USTC | JPCja-zh | 2018/08/31 16:54:46 | 2202 | 0.757690 | NMT | No | tensor2tensor, 4-model average, r2l rerank |
| 18 | ryan | JPCja-zh | 2019/07/25 22:04:25 | 2950 | 0.000000 | NMT | No | Base Transformer |
| 19 | sarah | JPCja-zh | 2019/07/26 11:39:52 | 2982 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 20 | KNU_Hyundai | JPCja-zh | 2019/07/27 08:37:11 | 3162 | 0.000000 | NMT | Yes | Transformer (base) with relative position, back-translation, r2l rerank, 4-model ensemble (9 checkpoints); used ASPEC corpus |
| 21 | goku20 | JPCja-zh | 2020/09/21 11:55:29 | 4088 | 0.000000 | NMT | No | mBART-pretrained Transformer, single model |
| 22 | goku20 | JPCja-zh | 2020/09/22 00:06:06 | 4106 | 0.000000 | NMT | No | mBART-pretrained Transformer, ensemble of 3 models |
| 23 | tpt_wat | JPCja-zh | 2021/04/27 01:49:08 | 5694 | 0.885218 | NMT | No | Base Transformer model with separate 8k-size vocabularies |
| 24 | Bering Lab | JPCja-zh | 2021/04/30 14:56:11 | 5853 | 0.899221 | NMT | Yes | Transformer ensemble with additional crawled parallel corpus |
| 25 | sakura | JPCja-zh | 2024/08/08 19:34:30 | 7263 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 26 | sakura | JPCja-zh | 2024/08/09 00:26:12 | 7297 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
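AMFM scores of 0.000000 appear only in later submissions and presumably indicate that the metric was not computed for those runs, so they should be filtered out before ranking. Below is a minimal sketch, in Python, of how one might rank these entries programmatically; the tuples are a hand-copied subset of the table above, and the helper is illustrative only, not part of any WAT tooling.

```python
# Minimal sketch: rank JPCja-zh submissions by AMFM score.
# Illustrative only; the tuples below are a hand-copied subset of the
# table above, and 0.000000 is assumed to mean "AMFM not computed".
rows = [
    (17, "USTC",       2202, 0.757690, "NMT", "tensor2tensor, 4-model average, r2l rerank"),
    (18, "ryan",       2950, 0.000000, "NMT", "Base Transformer"),
    (23, "tpt_wat",    5694, 0.885218, "NMT", "Base Transformer, separate 8k vocabularies"),
    (24, "Bering Lab", 5853, 0.899221, "NMT", "Transformer ensemble + crawled parallel data"),
]

# Drop unevaluated entries, then sort in descending order of AMFM.
ranked = sorted((r for r in rows if r[3] > 0.0), key=lambda r: r[3], reverse=True)
for rank, (num, team, data_id, amfm, method, desc) in enumerate(ranked, start=1):
    print(f"{rank}. {team} (DataID {data_id}) AMFM={amfm:.6f} [{method}] {desc}")
```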