| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | sarah | JPCN3ja-en | 2019/07/26 11:19:52 | 2971 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 2 | KNU_Hyundai | JPCN3ja-en | 2019/07/27 12:33:32 | 3191 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble |
| 3 | goku20 | JPCN3ja-en | 2020/09/21 12:30:49 | 4104 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 4 | goku20 | JPCN3ja-en | 2020/09/22 00:19:58 | 4117 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 5 | tpt_wat | JPCN3ja-en | 2021/04/27 02:34:43 | 5712 | 0.599098 | NMT | No | Base Transformer model with separate vocab, size 8k |
| 6 | Bering Lab | JPCN3ja-en | 2021/04/28 13:38:13 | 5739 | 0.600990 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 7 | sakura | JPCN3ja-en | 2024/08/08 19:56:51 | 7283 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 8 | sakura | JPCN3ja-en | 2024/08/09 00:54:09 | 7309 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
| 9 | ORGANIZER | JPCN3ja-en | 2018/08/15 12:58:46 | 1939 | 0.000000 | NMT | No | NMT with Attention |