| # | Team | Task | Date/Time | DataID | AMFM | unuse | unuse | unuse | unuse | unuse | unuse | unuse | unuse | unuse | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|-------|-------|-------|-------|-------|-------|-------|-------|-------|--------|-----------------|--------------------|
| 1 | ryan | JPCN3ko-ja | 2019/07/23 12:03:36 | 2840 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Base Transformer |
| 2 | sarah | JPCN3ko-ja | 2019/07/26 16:28:31 | 3017 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer, ensemble of 4 models |
| 3 | KNU_Hyundai | JPCN3ko-ja | 2019/07/27 00:12:31 | 3096 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer + Relative Position + ensemble |
| 4 | TMU | JPCN3ko-ja | 2020/09/17 22:51:54 | 3834 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer, domain adaptation (BERT Japanese), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 5 | TMU | JPCN3ko-ja | 2020/09/17 22:53:20 | 3835 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer, domain adaptation (BERT Japanese), shared EMB, shared BPE, ensemble of 4 models |
| 6 | goku20 | JPCN3ko-ja | 2020/09/21 12:27:10 | 4101 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | mBART pre-training transformer, single model |
| 7 | goku20 | JPCN3ko-ja | 2020/09/22 10:22:53 | 4124 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 8 | TMU | JPCN3ko-ja | 2020/10/13 09:48:09 | 4140 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer, domain adaptation (BERT Korean), hanja loss, shared EMB, shared BPE, ensemble of 4 models |
| 9 | TMU | JPCN3ko-ja | 2020/10/13 10:05:00 | 4144 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Transformer, domain adaptation (BERT Korean), shared EMB, shared BPE, ensemble of 4 models |
| 10 | Bering Lab | JPCN3ko-ja | 2021/04/25 03:01:59 | 5515 | 0.908463 | 0.908463 | 0.908463 | - | - | - | - | - | - | - | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 11 | tpt_wat | JPCN3ko-ja | 2021/04/27 02:09:34 | 5701 | 0.898732 | 0.898732 | 0.898732 | - | - | - | - | - | - | - | NMT | No | Base Transformer model with joint vocab, size 8k |
| 12 | GenAI | JPCN3ko-ja | 2024/07/23 22:43:03 | 7145 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Terminology based LLM Translator |
| 13 | GenAI | JPCN3ko-ja | 2024/07/24 13:08:38 | 7148 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | only for test (by chatgpt) |
| 14 | GenAI | JPCN3ko-ja | 2024/08/01 18:00:27 | 7180 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Terminology based LLM Translator |
| 15 | GenAI | JPCN3ko-ja | 2024/08/06 20:36:49 | 7234 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | Terminology based LLM Translator (mistral-nemo lora) |
| 16 | sakura | JPCN3ko-ja | 2024/08/09 00:56:13 | 7310 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
| 17 | sakura | JPCN3ko-ja | 2024/08/09 00:57:56 | 7311 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | - | - | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 18 | ORGANIZER | JPCN3ko-ja | 2018/08/15 16:26:48 | 1948 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | NMT with Attention |
| 19 | EHR | JPCN3ko-ja | 2018/08/31 19:05:13 | 2217 | 0.000000 | 0.000000 | 0.000000 | - | - | - | - | - | 0.000000 | 0.000000 | NMT | No | SMT reranked NMT |
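The pipe-delimited rows above can be machine-read directly. Below is a minimal sketch that parses two rows from the table and ranks them by their best reported AMFM score. It assumes the 18-column layout shown above (five identifier columns, ten score columns where `-` marks an unused cell, then Method, Other Resources, and System Description); `parse_row` and the column slice are illustrative helpers, not part of any official WAT tooling.

```python
# Sketch: parse leaderboard rows and rank submissions by AMFM score.
# Column layout (assumed from the table): cells 0-4 are #, Team, Task,
# Date/Time, DataID; cells 5-14 are the ten score columns ("-" = unused);
# cells 15-17 are Method, Other Resources, System Description.

ROWS = [
    "10 | Bering Lab | JPCN3ko-ja | 2021/04/25 03:01:59 | 5515 | 0.908463 | 0.908463 | 0.908463 | - | - | - | - | - | - | - | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus",
    "11 | tpt_wat | JPCN3ko-ja | 2021/04/27 02:09:34 | 5701 | 0.898732 | 0.898732 | 0.898732 | - | - | - | - | - | - | - | NMT | No | Base Transformer model with joint vocab, size 8k",
]

def parse_row(line):
    """Split one pipe-delimited row into a small record (illustrative)."""
    cells = [c.strip() for c in line.split("|")]
    # Keep only the score cells that actually hold a number.
    scores = [float(c) for c in cells[5:15] if c != "-"]
    return {
        "team": cells[1],
        "amfm": max(scores) if scores else 0.0,
        "description": cells[17],
    }

# Rank by best AMFM score, highest first.
entries = sorted((parse_row(r) for r in ROWS),
                 key=lambda e: e["amfm"], reverse=True)
for e in entries:
    print(f'{e["team"]}: {e["amfm"]:.6f}')
```

Using `max` over the non-empty score cells sidesteps guessing which of the ten sub-columns is canonical; for these rows all populated cells carry the same value anyway.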