| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | sakura | ALT20ms-en | 2021/04/29 13:46:42 | 5821 | 0.851471 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model; ensemble of 3. |
| 2 | NICT-2 | ALT20ms-en | 2021/05/01 13:35:25 | 5921 | 0.841632 | NMT | Yes | Extended mBART model; mixed-domain training with domain fine-tuning. |
| 3 | NICT-2 | ALT20ms-en | 2021/05/01 13:21:03 | 5913 | 0.697618 | NMT | No | Transformer base model; multilingual plus mixed-domain training with domain fine-tuning. |
| 4 | ORGANIZER | ALT20ms-en | 2020/09/01 15:52:35 | 3606 | 0.655865 | NMT | No | Baseline MLNMT XX-to-En model using ALT, Ubuntu, GNOME, and KDE4 data from OPUS; Transformer big model with default settings. |
| 5 | NICT-5 | ALT20ms-en | 2020/09/18 19:17:10 | 3959 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME, and Ubuntu data from OPUS; corpora were size-balanced. |
| 6 | NICT-5 | ALT20ms-en | 2020/09/18 21:53:32 | 4016 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME, and Ubuntu data from OPUS; corpora were size-unbalanced. |
| 7 | HwTscSU | ALT20ms-en | 2022/07/11 18:27:45 | 6775 | 0.000000 | NMT | No | XX-to-XX Transformer model fine-tuned from the baseline on IT-domain data. |
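
The strongest entries above are built on multilingual fine-tuning of mBART-50. As a minimal, purely illustrative sketch (not any listed team's actual pipeline), the snippet below loads the public `facebook/mbart-large-50-many-to-many-mmt` checkpoint with HuggingFace Transformers and decodes one sentence into English. Malay is not among mBART-50's pretrained language codes, so the `id_ID` source code here is a hypothetical stand-in; the submitted systems presumably extended or further fine-tuned the model to cover it.

```python
# Illustrative only: decoding with the public mBART-50 many-to-many checkpoint.
# This is NOT any team's submission pipeline; see the hedges in the text above.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

ckpt = "facebook/mbart-large-50-many-to-many-mmt"
model = MBartForConditionalGeneration.from_pretrained(ckpt)
tokenizer = MBart50TokenizerFast.from_pretrained(ckpt)

# Malay (ms) is not a pretrained mBART-50 language code; id_ID is a stand-in.
tokenizer.src_lang = "id_ID"
inputs = tokenizer("Apa khabar?", return_tensors="pt")

# Force the decoder to start with the English language token (en_XX).
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("en_XX"),
    num_beams=5,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

The "ensemble of 3" reported in the top row would combine several such fine-tuned checkpoints at decoding time rather than using a single model as shown here.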