| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | Bering Lab | JPCja-ko | 2021/04/09 16:32:49 | 4861 | 0.948440 | NMT | No | Transformer ensemble |
| 2 | Bering Lab | JPCja-ko | 2021/04/25 03:14:30 | 5518 | 0.946778 | NMT | Yes | Transformer ensemble with additional crawled parallel corpus |
| 3 | ORGANIZER | JPCja-ko | 2018/08/15 18:46:18 | 1967 | 0.848830 | NMT | No | NMT with Attention |
| 4 | transictkt | JPCja-ko | 2016/08/21 19:14:38 | 1268 | 0.846880 | SMT | No | Phrase-based SMT |
| 5 | ORGANIZER | JPCja-ko | 2016/07/21 14:03:13 | 1020 | 0.844950 | SMT | No | Phrase-based SMT |
| 6 | ORGANIZER | JPCja-ko | 2016/07/21 14:05:38 | 1021 | 0.844550 | SMT | No | Hierarchical phrase-based SMT |
| 7 | ORGANIZER | JPCja-ko | 2016/07/26 10:22:26 | 1037 | 0.791320 | Other | Yes | Online A (2016) |
| 8 | ORGANIZER | JPCja-ko | 2016/08/05 13:56:04 | 1083 | 0.766520 | Other | Yes | RBMT C (2016) |
| 9 | ORGANIZER | JPCja-ko | 2016/08/05 14:26:14 | 1089 | 0.765530 | Other | Yes | RBMT D (2016) |
| 10 | ORGANIZER | JPCja-ko | 2016/08/01 17:06:28 | 1068 | 0.692980 | Other | Yes | Online B (2016) |
| 11 | ryan | JPCja-ko | 2019/07/23 12:27:29 | 2843 | 0.000000 | NMT | No | Base Transformer |
| 12 | goku20 | JPCja-ko | 2020/09/21 12:12:30 | 4092 | 0.000000 | NMT | No | mBART pre-training Transformer, single model |
| 13 | goku20 | JPCja-ko | 2020/09/22 10:20:02 | 4121 | 0.000000 | NMT | No | mBART pre-training Transformer, ensemble of 3 models |
| 14 | sakura | JPCja-ko | 2024/08/08 19:45:25 | 7271 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) - Best |
| 15 | sakura | JPCja-ko | 2024/08/09 00:22:30 | 7295 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat fine-tuned with the JPC corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
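Rows like the ones above are easy to work with programmatically. A minimal sketch, assuming a simplified nine-column pipe-delimited layout (the field names and the two sample rows below are illustrative, not the full table): parse each row into a record and sort by the AMFM score, descending.

```python
# Parse pipe-delimited leaderboard rows and rank them by AMFM score.
# The rows and field layout here are a hypothetical subset of the table above.
rows = [
    "3 | ORGANIZER | JPCja-ko | 2018/08/15 18:46:18 | 1967 | 0.848830 | NMT | No | NMT with Attention",
    "1 | Bering Lab | JPCja-ko | 2021/04/09 16:32:49 | 4861 | 0.948440 | NMT | No | Transformer ensemble",
]

def parse(row: str) -> dict:
    # Split on "|" and strip surrounding whitespace from every field.
    rank, team, task, when, data_id, amfm, method, other, desc = (
        field.strip() for field in row.split("|")
    )
    return {
        "rank": int(rank),
        "team": team,
        "task": task,
        "datetime": when,
        "data_id": int(data_id),
        "amfm": float(amfm),
        "method": method,
        "other_resources": other,
        "description": desc,
    }

# Highest AMFM first, matching the ordering of the leaderboard.
records = sorted((parse(r) for r in rows), key=lambda r: r["amfm"], reverse=True)
print([r["team"] for r in records])
```

Sorting on the parsed float rather than the rank column keeps the ordering correct even when rows arrive out of order, as in the sample list above.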