| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | ORGANIZER | SOFTWAREth-en | 2020/09/01 15:50:20 | 3603 | 0.679817 | NMT | No | Baseline MLNMT XX-to-En model using ALT, Ubuntu, GNOME and KDE4 data from OPUS. Transformer big model, default settings. |
| 2 | NICT-5 | SOFTWAREth-en | 2020/09/18 19:10:49 | 3952 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME and Ubuntu data from OPUS. Corpora were size-balanced. |
| 3 | NICT-5 | SOFTWAREth-en | 2020/09/18 20:40:14 | 3977 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME and Ubuntu data from OPUS. Corpora were not size-balanced. |
| 4 | Bering Lab | SOFTWAREth-en | 2021/04/16 11:25:55 | 5181 | 0.729468 | NMT | No | Transformer trained on OPUS with GNOME, KDE4 and Ubuntu data weighted. |
| 5 | jyjy | SOFTWAREth-en | 2021/04/26 16:34:18 | 5645 | 0.680060 | NMT | No | |
| 6 | sakura | SOFTWAREth-en | 2021/04/29 17:30:21 | 5846 | 0.809105 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model; ensemble of 3. |
| 7 | NICT-2 | SOFTWAREth-en | 2021/05/01 12:04:58 | 5899 | 0.720249 | NMT | No | Transformer base model, multilingual + mixed-domain training with domain fine-tuning. |
| 8 | NICT-2 | SOFTWAREth-en | 2021/05/01 12:18:22 | 5907 | 0.787909 | NMT | Yes | The extended mBART model, mixed-domain training with domain fine-tuning. |
| 9 | JBJBJB | SOFTWAREth-en | 2021/05/02 23:03:45 | 5982 | 0.782966 | NMT | Yes | mBART50 (fairseq). |
| 10 | HwTscSU | SOFTWAREth-en | 2022/07/11 13:22:45 | 6755 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on GNOME, KDE4 and Ubuntu as well as other data from OPUS, fine-tuned on the dev set. |
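
Several of the listed systems (the sakura ensemble, the NICT-2 extended mBART run, and the JBJBJB fairseq submission) build on the mBART-50 many-to-many model. The table does not record their exact toolkits or decoding settings, so the sketch below only illustrates the general approach: it assumes the publicly released Hugging Face checkpoint `facebook/mbart-large-50-many-to-many-mmt` and default beam-search settings, which are not necessarily what the teams used, to translate a Thai software-domain string into English (the SOFTWAREth-en direction).

```python
# Minimal sketch, not the submitted systems' exact pipeline: translating a Thai
# software-domain string to English with the public mBART-50 many-to-many checkpoint.
# The checkpoint name, library, and decoding settings are assumptions for illustration;
# the table only states that mBART50 (in one case via fairseq) was used.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"  # assumed public checkpoint
model = MBartForConditionalGeneration.from_pretrained(model_name)
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)

# Source side is Thai for the SOFTWAREth-en task.
tokenizer.src_lang = "th_TH"
src_text = "บันทึกไฟล์ก่อนปิดโปรแกรม"  # example string: "Save the file before closing the program"

inputs = tokenizer(src_text, return_tensors="pt")
# Force the decoder to start with the English language token.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
    num_beams=5,
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Domain fine-tuning on ALT and the GNOME/KDE4/Ubuntu corpora from OPUS, as several system descriptions above mention, would precede such inference; that training step is omitted here.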