| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|---------------------|
| 1 | SRPOL | INDIC21ml-en | 2021/05/04 15:26:45 | 6246 | 0.823006 | NMT | No | Ensemble of many-to-one on all data. Pretrained on BT, finetuned on PMI. |
| 2 | SRPOL | INDIC21ml-en | 2021/05/04 16:31:48 | 6272 | 0.820716 | NMT | No | Many-to-one on all data. Pretrained on BT, finetuned on PMI. |
| 3 | sakura | INDIC21ml-en | 2021/05/04 13:15:20 | 6206 | 0.806774 | NMT | No | Pre-training multilingual mBART many2many model with training corpus followed by finetuning on PMI Parallel. |
| 4 | sakura | INDIC21ml-en | 2021/04/30 22:50:41 | 5874 | 0.805450 | NMT | No | Fine-tuning of multilingual mBART many2many model with training corpus. |
| 5 | coastal | INDIC21ml-en | 2021/05/04 05:42:24 | 6166 | 0.805091 | NMT | No | mT5 trained only on PMI. |
| 6 | IITP-MT | INDIC21ml-en | 2021/05/04 17:58:18 | 6289 | 0.798550 | NMT | No | Many-to-one model trained on all training data with a base Transformer. All Indic language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
| 7 | mcairt | INDIC21ml-en | 2021/05/04 19:31:47 | 6344 | 0.794932 | NMT | No | Multilingual model (many-to-one) trained on all WAT 2021 data using a base Transformer. |
| 8 | SRPOL | INDIC21ml-en | 2021/04/21 19:32:10 | 5329 | 0.793574 | NMT | No | Base Transformer on all WAT21 data. |
| 9 | CFILT | INDIC21ml-en | 2021/05/04 01:14:00 | 6056 | 0.789095 | NMT | No | Multilingual (Many-to-One (XX-En)) NMT model based on Transformer with shared encoder and decoder. |
| 10 | NICT-5 | INDIC21ml-en | 2021/04/22 11:52:33 | 5357 | 0.786909 | NMT | No | MBART+MNMT. Beam 4. |
| 11 | NICT-5 | INDIC21ml-en | 2021/04/21 15:43:35 | 5282 | 0.772691 | NMT | No | Pretrain MBART on IndicCorp and FT on bilingual PMI data. Beam search. Model is bilingual. |
| 12 | coastal | INDIC21ml-en | 2021/05/04 01:53:04 | 6116 | 0.771766 | NMT | No | seq2seq model trained on all WAT2021 data. |
| 13 | IIIT-H | INDIC21ml-en | 2021/05/03 18:14:20 | 6020 | 0.748518 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation on PMI monolingual data followed by fine-tuning. |
| 14 | CFILT-IITB | INDIC21ml-en | 2021/05/04 02:00:25 | 6130 | 0.745908 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained using all Dravidian language data converted to the same script. |
| 15 | CFILT-IITB | INDIC21ml-en | 2021/05/04 01:53:21 | 6117 | 0.744459 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained using all Indic language data converted to the same script. |
| 16 | gaurvar | INDIC21ml-en | 2021/04/25 19:00:00 | 5569 | 0.684483 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 17 | gaurvar | INDIC21ml-en | 2021/04/25 18:46:34 | 5559 | 0.684370 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 18 | gaurvar | INDIC21ml-en | 2021/04/25 18:33:28 | 5548 | 0.678349 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 19 | gaurvar | INDIC21ml-en | 2021/04/25 18:16:15 | 5538 | 0.669241 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 20 | NLPHut | INDIC21ml-en | 2021/03/21 12:45:25 | 4634 | 0.668778 | NMT | No | Transformer with source and target language tags trained using all languages' PMI data, then fine-tuned using ml-en PMI data. |
| 21 | ORGANIZER | INDIC21ml-en | 2021/04/08 17:23:30 | 4797 | 0.646559 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10⁻³. |
| 22 | NICT-5 | INDIC21ml-en | 2021/06/21 12:00:00 | 6476 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. |
| 23 | NICT-5 | INDIC21ml-en | 2021/06/25 11:49:02 | 6497 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |