| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|--------------------|
| 1 | ORGANIZER | JPCN2en-ja | 2018/08/15 12:47:37 | 1936 | 0.000000 | NMT | No | NMT with Attention |
| 2 | EHR | JPCN2en-ja | 2018/09/08 20:34:49 | 2249 | 0.000000 | NMT | No | SMT reranked NMT |
| 3 | EHR | JPCN2en-ja | 2018/09/13 12:52:17 | 2285 | 0.000000 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 13) |
| 4 | EHR | JPCN2en-ja | 2018/09/15 15:52:37 | 2396 | 0.000000 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 18) |
| 5 | EHR | JPCN2en-ja | 2018/09/16 15:10:44 | 2477 | 0.000000 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 20) |
| 6 | sarah | JPCN2en-ja | 2019/07/26 11:26:34 | 2974 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 7 | KNU_Hyundai | JPCN2en-ja | 2019/07/27 12:07:31 | 3186 | 0.000000 | NMT | No | Transformer Base, relative position, BT, r2l reranking, checkpoint ensemble |
| 8 | KNU_Hyundai | JPCN2en-ja | 2019/07/27 12:52:40 | 3194 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble |
| 9 | goku20 | JPCN2en-ja | 2020/09/21 12:19:14 | 4097 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 10 | goku20 | JPCN2en-ja | 2020/09/22 00:14:35 | 4112 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 11 | tpt_wat | JPCN2en-ja | 2021/04/27 02:28:01 | 5707 | 0.870086 | NMT | No | Base Transformer model with joint vocab, size 8k |
| 12 | Bering Lab | JPCN2en-ja | 2021/05/04 19:43:01 | 6350 | 0.872870 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 13 | sakura | JPCN2en-ja | 2024/08/08 19:52:29 | 7277 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 14 | sakura | JPCN2en-ja | 2024/08/09 00:30:38 | 7299 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |