# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
--- | --- | --- | --- | --- | --- | --- | --- | --- |
1 | SRPOL | INDIC21mr-en | 2021/05/04 15:27:14 | 6247 | 0.812258 | NMT | No | Ensemble of many-to-one models on all data. Pretrained on BT, fine-tuned on PMI. |
2 | SRPOL | INDIC21mr-en | 2021/05/04 16:32:12 | 6273 | 0.810290 | NMT | No | Many-to-one model on all data. Pretrained on BT, fine-tuned on PMI. |
3 | IITP-MT | INDIC21mr-en | 2021/05/04 18:03:10 | 6292 | 0.797333 | NMT | No | Many-to-one model trained on all training data with base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
4 | sakura | INDIC21mr-en | 2021/04/30 22:53:25 | 5875 | 0.795844 | NMT | No | Fine-tuning of a multilingual mBART many-to-many model on the training corpus. |
5 | sakura | INDIC21mr-en | 2021/05/04 13:18:37 | 6207 | 0.795492 | NMT | No | Pre-training of a multilingual mBART many-to-many model on the training corpus, followed by fine-tuning on the PMI parallel data. |
6 | IIIT-H | INDIC21mr-en | 2021/05/03 18:14:42 | 6021 | 0.792878 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation of PMI monolingual data, followed by fine-tuning. |
7 | SRPOL | INDIC21mr-en | 2021/04/21 19:32:36 | 5330 | 0.791555 | NMT | No | Base Transformer on all WAT21 data. |
8 | coastal | INDIC21mr-en | 2021/05/04 05:43:03 | 6167 | 0.791157 | NMT | No | mT5 trained only on PMI. |
9 | CFILT | INDIC21mr-en | 2021/05/04 01:15:09 | 6057 | 0.789075 | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer with shared encoder and decoder. |
10 | mcairt | INDIC21mr-en | 2021/05/04 19:18:35 | 6335 | 0.780231 | NMT | No | Multilingual (many-to-one) model trained on all WAT 2021 data using the base Transformer. |
11 | NICT-5 | INDIC21mr-en | 2021/04/22 11:53:04 | 5359 | 0.779746 | NMT | No | mBART+MNMT. Beam 4. |
12 | CFILT-IITB | INDIC21mr-en | 2021/05/04 01:58:06 | 6127 | 0.767347 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained on all Indo-Aryan language data converted to the same script. |
13 | NICT-5 | INDIC21mr-en | 2021/04/21 15:44:10 | 5284 | 0.764852 | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
14 | coastal | INDIC21mr-en | 2021/05/04 01:45:39 | 6106 | 0.764519 | NMT | No | Seq2seq model trained on all WAT2021 data. |
15 | CFILT-IITB | INDIC21mr-en | 2021/05/04 01:53:52 | 6118 | 0.751917 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained on all Indic-language data converted to the same script. |
16 | NLPHut | INDIC21mr-en | 2021/05/03 00:10:34 | 5983 | 0.696839 | NMT | No | Transformer with source-language tag trained on all languages' PMI data, then fine-tuned on all mr-en data. |
17 | gaurvar | INDIC21mr-en | 2021/04/25 19:00:44 | 5570 | 0.693109 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
18 | gaurvar | INDIC21mr-en | 2021/04/25 18:47:55 | 5560 | 0.692897 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
19 | NLPHut | INDIC21mr-en | 2021/03/19 16:27:22 | 4595 | 0.692555 | NMT | No | Transformer trained on all languages' PMI data, then fine-tuned on all mr-en data. |
20 | gaurvar | INDIC21mr-en | 2021/04/25 18:34:13 | 5549 | 0.689768 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
21 | gaurvar | INDIC21mr-en | 2021/04/25 18:14:15 | 5536 | 0.682305 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
22 | ORGANIZER | INDIC21mr-en | 2021/04/08 17:24:07 | 4799 | 0.658130 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR = 1e-3. |
23 | NICT-5 | INDIC21mr-en | 2021/06/21 12:01:20 | 6477 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. |
24 | NICT-5 | INDIC21mr-en | 2021/06/25 11:49:30 | 6498 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |
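Several of the entries above (e.g. sakura and NICT-5) describe the same basic recipe: start from a pretrained mBART checkpoint and fine-tune it on the PMI parallel data for translation into English. The sketch below illustrates roughly what such a fine-tuning setup could look like with the Hugging Face Transformers and Datasets libraries; the checkpoint name, hyperparameters, and the toy PMI dataset are illustrative assumptions, not taken from any submission.

```python
# Minimal sketch of mBART fine-tuning for mr-en, assuming Hugging Face
# Transformers + Datasets. Checkpoint, hyperparameters, and the toy data
# are hypothetical, not any team's actual configuration.
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "facebook/mbart-large-50"  # assumed pretrained mBART checkpoint
tokenizer = MBart50TokenizerFast.from_pretrained(
    checkpoint, src_lang="mr_IN", tgt_lang="en_XX"
)
model = MBartForConditionalGeneration.from_pretrained(checkpoint)

# Placeholder for the PMI parallel corpus: Marathi/English sentence pairs.
pmi_parallel = Dataset.from_dict({
    "mr": ["<Marathi source sentence>"],
    "en": ["<English reference translation>"],
})

def preprocess(batch):
    # Tokenize source and target; the mBART-50 tokenizer adds the language
    # codes configured above.
    return tokenizer(batch["mr"], text_target=batch["en"],
                     truncation=True, max_length=128)

train_dataset = pmi_parallel.map(preprocess, batched=True,
                                 remove_columns=["mr", "en"])

args = Seq2SeqTrainingArguments(
    output_dir="mbart-mr-en",        # illustrative values only
    per_device_train_batch_size=16,
    learning_rate=3e-5,
    num_train_epochs=5,
)
collator = DataCollatorForSeq2Seq(tokenizer, model=model)  # pads inputs and labels
trainer = Seq2SeqTrainer(model=model, args=args, train_dataset=train_dataset,
                         data_collator=collator, tokenizer=tokenizer)
# trainer.train()  # decoding with beam search (e.g. beam size 4) would follow
```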