| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | ORGANIZER | SOFTWAREen-hi | 2020/09/01 15:58:44 | 3608 | 0.713394 | NMT | No | Baseline MLNMT En-to-XX model using ALT, Ubuntu, GNOME, and KDE4 data from OPUS. Transformer big model with default settings. |
| 2 | NICT-5 | SOFTWAREen-hi | 2020/09/18 19:05:09 | 3944 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME, and Ubuntu data from OPUS. Corpora were size-balanced. |
| 3 | NICT-5 | SOFTWAREen-hi | 2020/09/18 19:12:51 | 3955 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on ALT as well as KDE, GNOME, and Ubuntu data from OPUS. Corpora were size-unbalanced. |
| 4 | Bering Lab | SOFTWAREen-hi | 2021/04/17 09:52:42 | 5190 | 0.810847 | NMT | No | Transformer trained on OPUS with GNOME, KDE4, and Ubuntu weighted. |
| 5 | jyjy | SOFTWAREen-hi | 2021/04/23 20:03:06 | 5440 | 0.728430 | NMT | No | |
| 6 | sakura | SOFTWAREen-hi | 2021/04/29 11:59:47 | 5792 | 0.826771 | NMT | No | Multilingual fine-tuning of the mBART50 fine-tuned many-to-many model; ensemble of 3. |
| 7 | NICT-2 | SOFTWAREen-hi | 2021/05/01 11:54:05 | 5892 | 0.777014 | NMT | No | Transformer base model, multilingual + mixed-domain training with domain fine-tuning. |
| 8 | NICT-2 | SOFTWAREen-hi | 2021/05/01 12:07:33 | 5900 | 0.821077 | NMT | Yes | The extended mBART model, mixed-domain training with domain fine-tuning. |
| 9 | JBJBJB | SOFTWAREen-hi | 2021/05/02 22:57:23 | 5975 | 0.827720 | NMT | Yes | mBART50 with fairseq. |
| 10 | HwTscSU | SOFTWAREen-hi | 2022/07/11 13:16:17 | 6747 | 0.000000 | NMT | No | XX-to-XX Transformer model trained on GNOME, KDE4, and Ubuntu as well as other data from OPUS, fine-tuned on the dev set. |