| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|--------------------|
| 1 | NLPHut | INDIC21kn-en | 2021/03/19 16:22:46 | 4593 | 0.679617 | NMT | No | Transformer with source- and target-language tags, trained on the PMI data of all languages, then fine-tuned on kn-en PMI data. |
| 2 | ORGANIZER | INDIC21kn-en | 2021/04/08 17:22:33 | 4795 | 0.692019 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3. |
| 3 | NICT-5 | INDIC21kn-en | 2021/04/21 15:43:00 | 5280 | 0.782087 | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
| 4 | SRPOL | INDIC21kn-en | 2021/04/21 19:31:42 | 5328 | 0.792687 | NMT | No | Base Transformer on all WAT21 data. |
| 5 | NICT-5 | INDIC21kn-en | 2021/04/22 11:51:57 | 5355 | 0.792622 | NMT | No | mBART+MNMT. Beam 4. |
| 6 | gaurvar | INDIC21kn-en | 2021/04/25 18:15:23 | 5537 | 0.675607 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 7 | gaurvar | INDIC21kn-en | 2021/04/25 18:32:33 | 5547 | 0.686757 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 8 | gaurvar | INDIC21kn-en | 2021/04/25 18:45:48 | 5558 | 0.687726 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 9 | gaurvar | INDIC21kn-en | 2021/04/25 18:59:05 | 5568 | 0.687810 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 10 | sakura | INDIC21kn-en | 2021/04/30 22:45:49 | 5873 | 0.805112 | NMT | No | Fine-tuning of a multilingual mBART many-to-many model on the training corpus. |
| 11 | IIIT-H | INDIC21kn-en | 2021/05/03 18:13:30 | 6018 | 0.790977 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning. |
| 12 | CFILT | INDIC21kn-en | 2021/05/04 01:13:12 | 6055 | 0.778602 | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer with a shared encoder and decoder. |
| 13 | coastal | INDIC21kn-en | 2021/05/04 01:44:39 | 6101 | 0.773141 | NMT | No | seq2seq model trained on all WAT2021 data. |
| 14 | CFILT-IITB | INDIC21kn-en | 2021/05/04 01:55:24 | 6121 | 0.751223 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with a shared encoder-decoder and shared BPE vocabulary, trained on all Indic-language data converted to a common script. |
| 15 | CFILT-IITB | INDIC21kn-en | 2021/05/04 02:00:49 | 6131 | 0.744802 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with a shared encoder-decoder and shared BPE vocabulary, trained on all Dravidian-language data converted to a common script. |
| 16 | coastal | INDIC21kn-en | 2021/05/04 05:41:59 | 6165 | 0.806951 | NMT | No | mT5 trained only on PMI. |
| 17 | sakura | INDIC21kn-en | 2021/05/04 13:14:16 | 6205 | 0.809702 | NMT | No | Pre-training of a multilingual mBART many-to-many model on the training corpus, followed by fine-tuning on PMI parallel data. |
| 18 | SRPOL | INDIC21kn-en | 2021/05/04 15:26:08 | 6245 | 0.823730 | NMT | No | Ensemble of many-to-one models on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 19 | SRPOL | INDIC21kn-en | 2021/05/04 16:31:24 | 6271 | 0.820355 | NMT | No | Many-to-one model on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 20 | IITP-MT | INDIC21kn-en | 2021/05/04 17:53:28 | 6286 | 0.798540 | NMT | No | Many-to-one model trained on all training data with the base Transformer. All Indic-language data is romanized. Model fine-tuned on the back-translated PMI monolingual corpus. |
| 21 | mcairt | INDIC21kn-en | 2021/05/04 20:02:54 | 6374 | 0.799216 | NMT | No | Multilingual (many-to-one) model trained on all WAT 2021 data using the base Transformer. |
| 22 | NICT-5 | INDIC21kn-en | 2021/06/21 14:32:56 | 6482 | 0.000000 | NMT | No | PMI and PIB data used for fine-tuning an mBART model trained for over 5 epochs. |
| 23 | NICT-5 | INDIC21kn-en | 2021/06/25 11:48:36 | 6496 | 0.000000 | NMT | No | PMI and PIB data used for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |
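Several of the submissions above (the NLPHut tagged Transformer and the various many-to-one systems) share the standard multilingual NMT preprocessing step of prepending language tags to each source sentence so that a single shared encoder-decoder can serve all Indic-to-English directions. The sketch below illustrates only that preprocessing step under stated assumptions: the file layout, tag format (`<2en>`-style tokens), and language list are illustrative, not details taken from any of the listed submissions.

```python
from pathlib import Path

# Assumed layout: pmi_parallel/<lang>-en.<lang> and pmi_parallel/<lang>-en.en
LANGS = ["kn", "hi", "ta", "te"]      # illustrative subset of source Indic languages
DATA_DIR = Path("pmi_parallel")
OUT_DIR = Path("many2one_corpus")
OUT_DIR.mkdir(exist_ok=True)

def tag_line(line: str, src_lang: str, tgt_lang: str = "en") -> str:
    """Prepend source- and target-language tags so one model handles all directions."""
    return f"<2{tgt_lang}> <{src_lang}> {line.strip()}"

with open(OUT_DIR / "train.src", "w", encoding="utf-8") as src_out, \
     open(OUT_DIR / "train.tgt", "w", encoding="utf-8") as tgt_out:
    for lang in LANGS:
        src_path = DATA_DIR / f"{lang}-en.{lang}"
        tgt_path = DATA_DIR / f"{lang}-en.en"
        # Write tagged source lines and untouched English targets, one pair per line.
        with open(src_path, encoding="utf-8") as src_in, \
             open(tgt_path, encoding="utf-8") as tgt_in:
            for src_line, tgt_line in zip(src_in, tgt_in):
                src_out.write(tag_line(src_line, lang) + "\n")
                tgt_out.write(tgt_line.strip() + "\n")
```

The tagged files can then be passed to any standard Transformer training pipeline after joint subword segmentation; fine-tuning on a single direction such as kn-en, as described for submission 1, amounts to repeating the same procedure on the kn-en subset alone.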