| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | ORGANIZER | JPCN3zh-ja | 2018/08/15 15:03:39 | 1942 | 0.000000 | NMT | No | NMT with Attention |
| 2 | USTC | JPCN3zh-ja | 2018/08/31 17:35:32 | 2208 | 0.000000 | NMT | No | tensor2tensor, 4 model average, r2l rerank |
| 3 | EHR | JPCN3zh-ja | 2018/08/31 18:56:18 | 2212 | 0.000000 | NMT | No | SMT reranked NMT |
| 4 | ryan | JPCN3zh-ja | 2019/07/25 22:09:15 | 2952 | 0.000000 | NMT | No | Base Transformer |
| 5 | sarah | JPCN3zh-ja | 2019/07/26 11:35:32 | 2981 | 0.000000 | NMT | No | Transformer, ensemble of 4 models |
| 6 | goku20 | JPCN3zh-ja | 2020/09/21 12:24:37 | 4099 | 0.000000 | NMT | No | mBART pre-training transformer, single model |
| 7 | goku20 | JPCN3zh-ja | 2020/09/22 00:16:13 | 4114 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models |
| 8 | tpt_wat | JPCN3zh-ja | 2021/04/27 01:45:05 | 5692 | 0.955358 | NMT | No | Base Transformer model with shared vocab 8k size |
| 9 | Bering Lab | JPCN3zh-ja | 2021/05/04 10:55:07 | 6198 | 0.961969 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus |
| 10 | sakura | JPCN3zh-ja | 2024/08/09 00:35:08 | 7301 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) |
| 11 | sakura | JPCN3zh-ja | 2024/08/09 00:37:23 | 7302 | 0.000000 | NMT | No | LLM: Rakuten/RakutenAI-7B-chat Fine-Tuned with JPC Corpus in six directions (En-Ja, Ja-En, Ko-Ja, Ja-Ko, Zh-Ja, Ja-Zh) -Best |