| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|---------------------|
| 1 | mcairt | INDIC21en-bn | 2021/05/03 17:45:45 | 6000 | 0.779592 | NMT | No | Multilingual (one-to-many) model trained on all WAT 2021 data using a base Transformer. |
| 2 | CFILT | INDIC21en-bn | 2021/05/04 00:54:24 | 6041 | 0.777074 | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on the Transformer with a shared encoder and decoder. |
| 3 | SRPOL | INDIC21en-bn | 2021/05/04 16:22:25 | 6258 | 0.772309 | NMT | No | One-to-many model on all data. Pretrained on back-translated (BT) data, fine-tuned on PMI. |
| 4 | SRPOL | INDIC21en-bn | 2021/05/04 15:11:17 | 6232 | 0.771033 | NMT | No | Ensemble of one-to-many models on all data. Pretrained on BT data, fine-tuned on PMI. |
| 5 | NICT-5 | INDIC21en-bn | 2021/04/22 11:50:36 | 5348 | 0.767497 | NMT | No | mBART + MNMT. Beam 4. |
| 6 | SRPOL | INDIC21en-bn | 2021/04/21 19:16:34 | 5316 | 0.764717 | NMT | No | Base Transformer on all WAT21 data. |
| 7 | sakura | INDIC21en-bn | 2021/05/04 04:04:48 | 6150 | 0.764714 | NMT | No | Pre-training a multilingual mBART one-to-many model on the training corpus, followed by fine-tuning on the PMI parallel corpus. |
| 8 | coastal | INDIC21en-bn | 2021/05/04 01:34:07 | 6074 | 0.763665 | NMT | No | Seq2seq model trained on all WAT2021 data. |
| 9 | IIIT-H | INDIC21en-bn | 2021/05/03 18:03:37 | 6005 | 0.759513 | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation on PMI monolingual data, followed by fine-tuning. |
| 10 | NICT-5 | INDIC21en-bn | 2021/04/21 15:40:02 | 5273 | 0.755363 | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
| 11 | sakura | INDIC21en-bn | 2021/05/01 11:23:46 | 5882 | 0.737663 | NMT | No | Fine-tuning of a multilingual mBART one-to-many model on the training corpus. |
| 12 | IITP-MT | INDIC21en-bn | 2021/05/04 17:33:40 | 6278 | 0.731181 | NMT | No | One-to-many model trained on all training data with a base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual data. |
| 13 | ORGANIZER | INDIC21en-bn | 2021/04/08 17:18:12 | 4788 | 0.701527 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR = 10^-3. |
| 14 | gaurvar | INDIC21en-bn | 2021/05/01 19:42:11 | 5938 | 0.641712 | NMT | No | |
| 15 | gaurvar | INDIC21en-bn | 2021/05/01 19:27:06 | 5926 | 0.641316 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 16 | gaurvar | INDIC21en-bn | 2021/05/01 19:40:58 | 5937 | 0.641316 | NMT | No | |
| 17 | gaurvar | INDIC21en-bn | 2021/04/25 19:55:11 | 5579 | 0.628843 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 18 | gaurvar | INDIC21en-bn | 2021/04/25 20:05:03 | 5588 | 0.628707 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 19 | NICT-5 | INDIC21en-bn | 2021/06/25 11:35:23 | 6483 | 0.000000 | NMT | No | Fine-tuning an mBART model (trained for over 5 epochs) on PMI and PIB data. MNMT model. |
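
Many of the entries above (e.g., mcairt, CFILT, SRPOL, IITP-MT) describe one-to-many (En-XX) multilingual models, where a single model translates English into several Indic languages. A standard way to make one model serve all En-XX directions is to prepend a target-language tag to each English source sentence (Johnson et al., 2017). The Python sketch below illustrates only that data-preparation step; the `<2bn>`-style tag format, the `tag_one_to_many` helper, and the example sentences are assumptions for illustration, not details taken from any listed submission.

```python
# Illustrative sketch of one-to-many (En-XX) data preparation via
# target-language tags (Johnson et al., 2017). The tag format and corpus
# layout are assumptions, not details from any submission above.
from typing import Iterable, Iterator, Tuple


def tag_one_to_many(
    pairs: Iterable[Tuple[str, str, str]],
) -> Iterator[Tuple[str, str]]:
    """Yield (tagged_source, target) pairs for a single En-XX model.

    Each input triple is (target_lang_code, english_source, target_sentence).
    The prepended tag (e.g. "<2bn>") tells the shared decoder which
    language to emit, so one model covers every En-XX direction.
    """
    for lang, src, tgt in pairs:
        yield f"<2{lang}> {src}", tgt


# Example: mixing English-Bengali and English-Hindi pairs into one training set.
corpus = [
    ("bn", "The parliament passed the bill.", "সংসদ বিলটি পাস করেছে।"),
    ("hi", "The parliament passed the bill.", "संसद ने विधेयक पारित किया।"),
]

for src, tgt in tag_one_to_many(corpus):
    print(src, "=>", tgt)
```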