# | Team | Task | Date/Time | DataID | unuse | unuse | unuse | AMFM | unuse | unuse | unuse | unuse | unuse | unuse | Method | Other Resources | System Description |
1 | SRPOL | INDIC21ta-en | 2021/05/04 15:28:46 | 6250 | - | - | - | 0.806540 | - | - | - | - | - | - | NMT | No | Ensemble of many-to-one on all data. Pretrained on BT, finetuned on PMI |
2 | SRPOL | INDIC21ta-en | 2021/05/04 16:33:25 | 6276 | - | - | - | 0.803595 | - | - | - | - | - | - | NMT | No | Many-to-one on all data. Pretrained on BT, finetuned on PMI |
3 | sakura | INDIC21ta-en | 2021/04/30 23:00:52 | 5878 | - | - | - | 0.796074 | - | - | - | - | - | - | NMT | No | Fine-tuning of multilingual mBART many2many model with training corpus. |
4 | sakura | INDIC21ta-en | 2021/05/04 13:21:24 | 6210 | - | - | - | 0.790353 | - | - | - | - | - | - | NMT | No | Pre-training multilingual mBART many2many model with training corpus followed by finetuning on PMI Parallel. |
5 | mcairt | INDIC21ta-en | 2021/05/04 19:35:58 | 6346 | - | - | - | 0.790184 | - | - | - | - | - | - | NMT | No | Multilingual model (many-to-one) trained on all WAT 2021 data using a base Transformer. |
6 | SRPOL | INDIC21ta-en | 2021/04/21 19:33:54 | 5333 | - | - | - | 0.788439 | - | - | - | - | - | - | NMT | No | Base transformer on all WAT21 data |
7 | IITP-MT | INDIC21ta-en | 2021/05/04 18:16:18 | 6304 | - | - | - | 0.786587 | - | - | - | - | - | - | NMT | No | Many-to-one model trained on all training data with a base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
8 | coastal | INDIC21ta-en | 2021/05/04 05:43:54 | 6169 | - | - | - | 0.786098 | - | - | - | - | - | - | NMT | No | mT5 trained only on PMI |
9 | CFILT | INDIC21ta-en | 2021/05/04 01:18:09 | 6060 | - | - | - | 0.785098 | - | - | - | - | - | - | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer with a shared encoder and decoder. |
10 | NICT-5 | INDIC21ta-en | 2021/04/22 11:54:11 | 5365 | - | - | - | 0.772249 | - | - | - | - | - | - | NMT | No | MBART+MNMT. Beam 4. |
11 | coastal | INDIC21ta-en | 2021/05/04 01:47:06 | 6110 | - | - | - | 0.764269 | - | - | - | - | - | - | NMT | No | seq2seq model trained on all WAT2021 data |
12 | NICT-5 | INDIC21ta-en | 2021/04/21 15:45:59 | 5290 | - | - | - | 0.758282 | - | - | - | - | - | - | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
13 | IIIT-H | INDIC21ta-en | 2021/05/03 18:16:01 | 6024 | - | - | - | 0.750297 | - | - | - | - | - | - | NMT | No | MNMT system (XX-En) trained via exploiting lexical similarity on PMI+CVIT parallel corpus, then improved using back translation on PMI monolingual data followed by fine tuning. |
14 | CFILT-IITB | INDIC21ta-en | 2021/05/04 02:01:23 | 6132 | - | - | - | 0.745090 | - | - | - | - | - | - | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained on all Dravidian-language data converted to the same script. |
15 | CFILT-IITB | INDIC21ta-en | 2021/05/04 01:55:45 | 6122 | - | - | - | 0.742311 | - | - | - | - | - | - | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained on all Indic-language data converted to the same script. |
16 | gaurvar | INDIC21ta-en | 2021/04/25 19:03:52 | 5573 | - | - | - | 0.688325 | - | - | - | - | - | - | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages |
17 | gaurvar | INDIC21ta-en | 2021/04/25 18:49:58 | 5563 | - | - | - | 0.687892 | - | - | - | - | - | - | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages |
18 | ORGANIZER | INDIC21ta-en | 2021/04/08 17:25:52 | 4805 | - | - | - | 0.675969 | - | - | - | - | - | - | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=1e-3. |
19 | gaurvar | INDIC21ta-en | 2021/04/25 18:36:45 | 5552 | - | - | - | 0.674214 | - | - | - | - | - | - | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages |
20 | NLPHut | INDIC21ta-en | 2021/05/03 00:13:56 | 5984 | - | - | - | 0.669984 | - | - | - | - | - | - | NMT | No | Transformer with source and target language tags trained on all languages' PMI data, then fine-tuned on all ta-en data. |
21 | NLPHut | INDIC21ta-en | 2021/03/20 00:18:36 | 4617 | - | - | - | 0.663932 | - | - | - | - | - | - | NMT | No | Transformer with source and target language tags trained on all languages' PMI data, then fine-tuned on ta-en PMI data. |
22 | gaurvar | INDIC21ta-en | 2021/04/25 18:17:09 | 5539 | - | - | - | 0.663642 | - | - | - | - | - | - | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages |
23 | NICT-5 | INDIC21ta-en | 2021/06/21 12:05:26 | 6480 | - | - | - | 0.000000 | - | - | - | - | - | - | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. |
24 | NICT-5 | INDIC21ta-en | 2021/06/25 11:50:44 | 6501 | - | - | - | 0.000000 | - | - | - | - | - | - | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |
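
Several of the systems above describe the same recipe: start from a pretrained mBART checkpoint and fine-tune it on the PMI parallel corpus (e.g. the sakura and NICT-5 entries). The sketch below illustrates that fine-tuning step with Hugging Face Transformers; it is not any team's actual pipeline, and the checkpoint name, data path, column names ("ta"/"en"), and hyper-parameters are illustrative assumptions.

```python
# Hypothetical sketch: fine-tune a pretrained mBART checkpoint on Tamil-English
# parallel data (PMI-style). All names and hyper-parameters are assumptions.
from datasets import load_dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "facebook/mbart-large-50"  # assumed pretrained many-to-many model
tokenizer = MBart50TokenizerFast.from_pretrained(
    checkpoint, src_lang="ta_IN", tgt_lang="en_XX"
)
model = MBartForConditionalGeneration.from_pretrained(checkpoint)

# Placeholder parallel corpus: one JSON object per line with "ta" and "en" fields.
raw = load_dataset("json", data_files={"train": "pmi_ta_en.jsonl"})

def preprocess(batch):
    # Tokenize the Tamil source side and the English target side.
    features = tokenizer(batch["ta"], max_length=128, truncation=True)
    labels = tokenizer(text_target=batch["en"], max_length=128, truncation=True)
    features["labels"] = labels["input_ids"]
    return features

train = raw["train"].map(
    preprocess, batched=True, remove_columns=raw["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="mbart50-ta-en-pmi",
    per_device_train_batch_size=8,
    learning_rate=3e-5,
    num_train_epochs=5,
    save_total_limit=1,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```

A many-to-one variant of this setup would pool all XX-en pairs into one training set and set the source language code per example; the back-translation-based entries would instead swap the placeholder corpus for synthetic parallel data before this fine-tuning step.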