| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | ORGANIZER | JPCN3en-ja | 2018/08/15 12:49:22 | 1937 | 0.000000 | NMT | No | NMT with Attention |
| 2 | EHR | JPCN3en-ja | 2018/09/08 20:37:52 | 2250 | 0.000000 | NMT | No | SMT reranked NMT |
| 3 | EHR | JPCN3en-ja | 2018/09/13 12:53:28 | 2286 | 0.000000 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 13) |
| 4 | EHR | JPCN3en-ja | 2018/09/15 15:55:42 | 2397 | 0.000000 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 18) |
| 5 | EHR | JPCN3en-ja | 2018/09/16 15:12:04 | 2478 | 0.000000 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 20) |
| 6 | sarah | JPCN3en-ja | 2019/07/26 11:26:55 | 2975 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 7 | KNU_Hyundai | JPCN3en-ja | 2019/07/27 12:09:01 | 3187 | 0.000000 | NMT | No | Transformer Base, relative position, BT, r2l reranking, checkpoint ensemble |
| 8 | KNU_Hyundai | JPCN3en-ja | 2019/07/27 12:53:49 | 3195 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble |
| 9 | goku20 | JPCN3en-ja | 2020/09/21 12:28:46 | 4103 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 10 | goku20 | JPCN3en-ja | 2020/09/22 00:19:09 | 4116 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 11 | tpt_wat | JPCN3en-ja | 2021/04/27 02:29:11 | 5708 | 0.893443 | NMT | No | Base Transformer model with joint vocab, size 8k |
| 12 | Bering Lab | JPCN3en-ja | 2021/05/04 20:03:17 | 6375 | 0.896949 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 13 | sakura | JPCN3en-ja | 2024/08/08 19:53:13 | 7278 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 14 | sakura | JPCN3en-ja | 2024/08/09 00:52:14 | 7308 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
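For quick analysis, a table like the one above can be parsed into records with plain string handling; a minimal sketch, assuming the rows are available as a Markdown string (only two rows are embedded here to keep the example short):

```python
# Minimal sketch: parse a Markdown leaderboard table into dictionaries.
# The two embedded rows are copied from the table above.
TABLE = """\
| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 11 | tpt_wat | JPCN3en-ja | 2021/04/27 02:29:11 | 5708 | 0.893443 | NMT | No | Base Transformer model with joint vocab, size 8k |
| 12 | Bering Lab | JPCN3en-ja | 2021/05/04 20:03:17 | 6375 | 0.896949 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
"""

def parse_rows(md: str) -> list[dict]:
    """Split each pipe-delimited line into cells keyed by the header row."""
    lines = [ln for ln in md.strip().splitlines() if ln.startswith("|")]
    header = [c.strip() for c in lines[0].strip("|").split("|")]
    rows = []
    for line in lines[2:]:  # skip the header and the |---| separator
        cells = [c.strip() for c in line.strip("|").split("|")]
        rows.append(dict(zip(header, cells)))
    return rows

# Example: find the submission with the highest AMFM score in the sample.
best = max(parse_rows(TABLE), key=lambda r: float(r["AMFM"]))
print(best["Team"], best["AMFM"])  # prints "Bering Lab 0.896949"
```

Note the split on `|` is safe here only because no cell in this table contains a literal pipe; a real Markdown parser would be needed for escaped pipes.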