| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | ORGANIZER | JPCN2ja-en | 2018/08/15 12:50:42 | 1938 | 0.000000 | NMT | No | NMT with Attention |
| 2 | sarah | JPCN2ja-en | 2019/07/26 11:19:20 | 2970 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 3 | KNU_Hyundai | JPCN2ja-en | 2019/07/27 12:32:52 | 3190 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble |
| 4 | goku20 | JPCN2ja-en | 2020/09/21 12:21:24 | 4098 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 5 | goku20 | JPCN2ja-en | 2020/09/22 00:15:36 | 4113 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 6 | tpt_wat | JPCN2ja-en | 2021/04/27 02:34:04 | 5711 | 0.572779 | NMT | No | Base Transformer model with separate vocab, size 8k |
| 7 | Bering Lab | JPCN2ja-en | 2021/04/28 12:09:55 | 5737 | 0.576823 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 8 | sakura | JPCN2ja-en | 2024/08/08 19:56:22 | 7282 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 9 | sakura | JPCN2ja-en | 2024/08/09 00:33:19 | 7300 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
9 | sakura | JPCN2ja-en | 2024/08/09 00:33:19 | 7300 | - | - | - | 0.000000 | - | - | - | - | - | - | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six direction (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |