| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | ryan | JPCN2ja-zh | 2019/07/25 22:06:17 | 2951 | 0.000000 | NMT | No | Base Transformer |
| 2 | sarah | JPCN2ja-zh | 2019/07/26 11:40:15 | 2983 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 3 | KNU_Hyundai | JPCN2ja-zh | 2019/07/27 08:39:21 | 3165 | 0.000000 | NMT | Yes | Transformer (base) + *Used ASPEC corpus* with relative position, back-translation, r2l rerank, 4-model ensemble (9 checkpoints) |
| 4 | goku20 | JPCN2ja-zh | 2020/09/21 12:09:42 | 4090 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 5 | goku20 | JPCN2ja-zh | 2020/09/22 00:10:38 | 4110 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 6 | goku20 | JPCN2ja-zh | 2020/09/22 00:12:52 | 4111 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 7 | tpt_wat | JPCN2ja-zh | 2021/04/27 01:50:28 | 5695 | 0.888029 | SMT | No | Base Transformer model with separate vocabularies, 8k size |
| 8 | tpt_wat | JPCN2ja-zh | 2021/04/27 01:51:04 | 5696 | 0.888029 | SMT | No | Base Transformer model with separate vocabularies, 8k size |
| 9 | Bering Lab | JPCN2ja-zh | 2021/04/30 12:39:49 | 5848 | 0.893837 | NMT | Yes | Transformer ensemble with additional crawled parallel corpus |
| 10 | sakura | JPCN2ja-zh | 2024/08/09 00:42:32 | 7304 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 11 | sakura | JPCN2ja-zh | 2024/08/09 00:44:49 | 7305 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
| 12 | ORGANIZER | JPCN2ja-zh | 2018/08/15 18:23:17 | 1961 | 0.000000 | NMT | No | NMT with Attention |
| 13 | USTC | JPCN2ja-zh | 2018/08/31 17:00:53 | 2203 | 0.000000 | NMT | No | tensor2tensor, 4-model average, r2l rerank |