| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | sakura | SOFTWAREms-en | 2021/04/29 13:48:03 | 5823 | 0.849354 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model; ensemble of 3. |
| 2 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:16:10 | 5905 | 0.843418 | NMT | Yes | Extended mBART model; mixed-domain training with domain fine-tuning. |
| 3 | JBJBJB | SOFTWAREms-en | 2021/05/02 23:02:00 | 5980 | 0.820152 | NMT | Yes | mBART50 (fairseq). |
| 4 | Bering Lab | SOFTWAREms-en | 2021/04/16 12:15:31 | 5182 | 0.810730 | NMT | No | Transformer trained on OPUS, with GNOME, KDE4, and Ubuntu data weighted. |
| 5 | NICT-2 | SOFTWAREms-en | 2021/05/01 12:03:04 | 5897 | 0.777501 | NMT | No | Transformer base model; multilingual + mixed-domain training with domain fine-tuning. |
| 6 | ORGANIZER | SOFTWAREms-en | 2020/09/01 15:49:38 | 3602 | 0.767758 | SMT | No | Baseline MLNMT XX-to-En model using ALT, Ubuntu, GNOME, and KDE4 data from OPUS. Transformer big model, default settings. |
| 7 | jyjy | SOFTWAREms-en | 2021/04/26 16:32:43 | 5642 | 0.713430 | NMT | No | |
| 8 | NICT-5 | SOFTWAREms-en | 2020/09/18 19:09:05 | 3949 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME, and Ubuntu data from OPUS; corpora were size-balanced. |
| 9 | NICT-5 | SOFTWAREms-en | 2020/09/18 20:39:09 | 3975 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME, and Ubuntu data from OPUS; corpora were size-unbalanced. |
| 10 | HwTscSU | SOFTWAREms-en | 2022/07/11 13:19:39 | 6752 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on GNOME, KDE4, and Ubuntu as well as other data from OPUS, fine-tuned on the dev set. |