| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | SRPOL | INDIC21gu-en | 2021/05/04 15:25:07 | 6243 | 0.835789 | NMT | No | Ensemble of many-to-one models on all data. Pretrained on BT, fine-tuned on PMI. |
| 2 | SRPOL | INDIC21gu-en | 2021/05/04 16:29:47 | 6269 | 0.833146 | NMT | No | Many-to-one model on all data. Pretrained on BT, fine-tuned on PMI. |
| 3 | sakura | INDIC21gu-en | 2021/04/30 22:37:18 | 5871 | 0.820654 | NMT | No | Fine-tuning of a multilingual mBART many-to-many model on the training corpus. |
| 4 | mcairt | INDIC21gu-en | 2021/05/04 19:15:33 | 6334 | 0.819546 | NMT | No | Multilingual (many-to-one) model trained on all WAT 2021 data using the base Transformer. |
| 5 | sakura | INDIC21gu-en | 2021/05/04 13:12:11 | 6203 | 0.818623 | NMT | No | Pre-training of a multilingual mBART many-to-many model on the training corpus, followed by fine-tuning on PMI parallel data. |
| 6 | IITP-MT | INDIC21gu-en | 2021/05/04 17:44:27 | 6282 | 0.814556 | NMT | No | Many-to-one model trained on all training data with the base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
| 7 | coastal | INDIC21gu-en | 2021/05/04 05:41:08 | 6163 | 0.814168 | NMT | No | mT5 trained only on PMI. |
| 8 | SRPOL | INDIC21gu-en | 2021/04/21 19:30:48 | 5326 | 0.812870 | NMT | No | Base Transformer on all WAT21 data. |
| 9 | IIIT-H | INDIC21gu-en | 2021/05/03 18:12:36 | 6016 | 0.806061 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation on PMI monolingual data, followed by fine-tuning. |
| 10 | CFILT | INDIC21gu-en | 2021/05/04 01:11:08 | 6053 | 0.797069 | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer with shared encoder and decoder. |
| 11 | NICT-5 | INDIC21gu-en | 2021/04/22 11:51:12 | 5351 | 0.796604 | NMT | No | mBART+MNMT. Beam 4. |
| 12 | NICT-5 | INDIC21gu-en | 2021/04/21 15:41:23 | 5276 | 0.793874 | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
| 13 | coastal | INDIC21gu-en | 2021/05/04 01:43:37 | 6096 | 0.779452 | NMT | No | Seq2seq model trained on all WAT2021 data. |
| 14 | CFILT-IITB | INDIC21gu-en | 2021/05/04 01:57:18 | 6125 | 0.776935 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained on all Indo-Aryan language data converted to the same script. |
| 15 | CFILT-IITB | INDIC21gu-en | 2021/05/04 01:52:25 | 6114 | 0.765441 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained on all Indic language data converted to the same script. |
| 16 | ORGANIZER | INDIC21gu-en | 2021/04/08 17:20:38 | 4791 | 0.726576 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3. |
| 17 | gaurvar | INDIC21gu-en | 2021/04/25 18:57:22 | 5566 | 0.698257 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 18 | gaurvar | INDIC21gu-en | 2021/04/25 18:44:52 | 5557 | 0.696879 | NMT | No | Multi-task multilingual T5 model trained for multiple Indic languages. |
| 19 | gaurvar | INDIC21gu-en | 2021/04/25 18:31:36 | 5546 | 0.696278 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 20 | gaurvar | INDIC21gu-en | 2021/04/25 18:13:05 | 5535 | 0.687977 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 21 | NICT-5 | INDIC21gu-en | 2021/06/21 11:53:21 | 6474 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. |
| 22 | NICT-5 | INDIC21gu-en | 2021/06/25 11:47:19 | 6494 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |