# | Team | Task | Date/Time | DataID | unuse | unuse | unuse | AMFM | unuse | unuse | unuse | unuse | unuse | unuse | Method | Other Resources | System Description |
1 | NLPHut | INDIC21te-en | 2021/03/20 00:22:00 | 4619 | - | - | - | 0.674821 | - | - | - | - | - | - | NMT | No | Transformer with source language and target language tags trained using all languages PMI data. Then fine-tuned using te-en PMI data. |
2 | ORGANIZER | INDIC21te-en | 2021/04/08 17:26:31 | 4807 | - | - | - | 0.636031 | - | - | - | - | - | - | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3 |
3 | NICT-5 | INDIC21te-en | 2021/04/21 15:46:34 | 5292 | - | - | - | 0.771109 | - | - | - | - | - | - | NMT | No | Pretrain MBART on IndicCorp and FT on bilingual PMI data. Beam search. Model is bilingual. |
4 | SRPOL | INDIC21te-en | 2021/04/21 19:34:19 | 5335 | - | - | - | 0.787243 | - | - | - | - | - | - | NMT | No | Base transformer on all WAT21 data |
5 | NICT-5 | INDIC21te-en | 2021/04/22 11:55:37 | 5367 | - | - | - | 0.779053 | - | - | - | - | - | - | NMT | No | MBART+MNMT. Beam 4. |
6 | gaurvar | INDIC21te-en | 2021/04/25 18:17:54 | 5540 | - | - | - | 0.651247 | - | - | - | - | - | - | NMT | No | Multi Task Multi Lingual T5 trained for Multiple Indic Languages |
7 | gaurvar | INDIC21te-en | 2021/04/25 18:27:02 | 5543 | - | - | - | 0.652473 | - | - | - | - | - | - | NMT | No | Multi Task Multi Lingual T5 trained for Multiple Indic Languages |
8 | gaurvar | INDIC21te-en | 2021/04/25 18:30:09 | 5545 | - | - | - | 0.661749 | - | - | - | - | - | - | NMT | No | Multi Task Multi Lingual T5 trained for Multiple Indic Languages |
9 | gaurvar | INDIC21te-en | 2021/04/25 18:37:32 | 5553 | - | - | - | 0.661749 | - | - | - | - | - | - | NMT | No | Multi Task Multi Lingual T5 trained for Multiple Indic Languages |
10 | gaurvar | INDIC21te-en | 2021/04/25 18:51:20 | 5564 | - | - | - | 0.668328 | - | - | - | - | - | - | NMT | No | Multi Task Multi Lingual T5 trained for Multiple Indic Languages |
11 | gaurvar | INDIC21te-en | 2021/04/25 19:04:37 | 5574 | - | - | - | 0.666143 | - | - | - | - | - | - | NMT | No | Multi Task Multi Lingual T5 trained for Multiple Indic Languages |
12 | sakura | INDIC21te-en | 2021/04/30 23:05:00 | 5879 | - | - | - | 0.802030 | - | - | - | - | - | - | NMT | No | Fine-tuning of multilingual mBART many2many model with training corpus. |
13 | IIIT-H | INDIC21te-en | 2021/05/03 18:16:21 | 6025 | - | - | - | 0.754690 | - | - | - | - | - | - | NMT | No | MNMT system (XX-En) trained via exploiting lexical similarity on PMI+CVIT parallel corpus, then improved using back translation on PMI monolingual data followed by fine tuning. |
14 | CFILT | INDIC21te-en | 2021/05/04 01:18:45 | 6061 | - | - | - | 0.783349 | - | - | - | - | - | - | NMT | No | Multilingual(Many-to-One(XX-En)) NMT model based on Transformer with shared encoder and decoder. |
15 | coastal | INDIC21te-en | 2021/05/04 01:47:32 | 6111 | - | - | - | 0.773075 | - | - | - | - | - | - | NMT | No | seq2seq model trained on all WAT2021 data |
16 | CFILT-IITB | INDIC21te-en | 2021/05/04 01:54:47 | 6120 | - | - | - | 0.743435 | - | - | - | - | - | - | NMT | No | Multilingual NMT (Many to One): Transformer based model with shared encoder-decoder and shared BPE vocabulary trained using all indic language data converted to same script |
17 | CFILT-IITB | INDIC21te-en | 2021/05/04 02:04:18 | 6133 | - | - | - | 0.745885 | - | - | - | - | - | - | NMT | No | Multilingual NMT (Many to One): Transformer based model with shared encoder-decoder and shared BPE vocabulary trained using all Dravidian languages data converted to same script |
18 | coastal | INDIC21te-en | 2021/05/04 05:44:19 | 6170 | - | - | - | 0.799696 | - | - | - | - | - | - | NMT | No | mT5 trained only on PMI |
19 | sakura | INDIC21te-en | 2021/05/04 13:22:13 | 6211 | - | - | - | 0.803983 | - | - | - | - | - | - | NMT | No | Pre-training multilingual mBART many2many model with training corpus, followed by fine-tuning on PMI parallel data. |
20 | SRPOL | INDIC21te-en | 2021/05/04 15:29:15 | 6251 | - | - | - | 0.820889 | - | - | - | - | - | - | NMT | No | Ensemble of many-to-one on all data. Pretrained on BT, finetuned on PMI |
21 | SRPOL | INDIC21te-en | 2021/05/04 16:33:50 | 6277 | - | - | - | 0.820360 | - | - | - | - | - | - | NMT | No | Many-to-one on all data. Pretrained on BT, finetuned on PMI |
22 | IITP-MT | INDIC21te-en | 2021/05/04 18:20:14 | 6306 | - | - | - | 0.776964 | - | - | - | - | - | - | NMT | No | Many-to-One model trained on all training data with base Transformer. All indic language data is romanized. Model fine-tuned on BT PMI monolingual corpus. |
23 | mcairt | INDIC21te-en | 2021/05/04 19:41:45 | 6348 | - | - | - | 0.786396 | - | - | - | - | - | - | NMT | No | Multilingual model (many to one) trained on all WAT 2021 data using a base Transformer. |
24 | NICT-5 | INDIC21te-en | 2021/06/21 12:06:37 | 6481 | - | - | - | 0.000000 | - | - | - | - | - | - | NMT | No | Using PMI and PIB data for fine-tuning on an mBART model trained for over 5 epochs. |
25 | NICT-5 | INDIC21te-en | 2021/06/25 11:51:05 | 6502 | - | - | - | 0.000000 | - | - | - | - | - | - | NMT | No | Using PMI and PIB data for fine-tuning on an mBART model trained for over 5 epochs. MNMT model. |
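
Most of the strongest entries above (NICT-5, sakura, SRPOL) describe variations of the same recipe: take a pre-trained multilingual model such as mBART, fine-tune it on the PMI parallel data, and decode with beam search. The sketch below illustrates that recipe using the publicly available Hugging Face mBART-50 checkpoint; the file names pmi.te / pmi.en and all hyperparameters are illustrative assumptions, not the settings any of these teams reported.

```python
# Minimal sketch: fine-tune an mBART-50 checkpoint for Telugu -> English,
# loosely following the "pre-trained mBART + fine-tune on PMI" descriptions above.
# Paths, hyperparameters, and the example sentence are placeholders.
from datasets import Dataset
from transformers import (
    MBartForConditionalGeneration,
    MBart50TokenizerFast,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang="te_IN", tgt_lang="en_XX"
)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Hypothetical PMI parallel files: one Telugu / English sentence pair per line.
src_lines = open("pmi.te", encoding="utf-8").read().splitlines()
tgt_lines = open("pmi.en", encoding="utf-8").read().splitlines()
raw = Dataset.from_dict({"te": src_lines, "en": tgt_lines})

def preprocess(batch):
    # Tokenize the Telugu source and the English target in one call.
    return tokenizer(
        batch["te"], text_target=batch["en"], truncation=True, max_length=128
    )

train_ds = raw.map(preprocess, batched=True, remove_columns=["te", "en"])

args = Seq2SeqTrainingArguments(
    output_dir="mbart-te-en",
    per_device_train_batch_size=8,
    learning_rate=3e-5,
    num_train_epochs=5,
    save_strategy="epoch",
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

# Decode with beam search (several entries report beam sizes around 4).
batch = tokenizer("ఇది ఒక ఉదాహరణ వాక్యం.", return_tensors="pt").to(model.device)
out = model.generate(
    **batch,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
    num_beams=4,
    max_length=128,
)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```

The same skeleton covers the multilingual many-to-one variants in the table: instead of a single Telugu file, concatenate all XX-En pairs (optionally transliterated to a common script, as the CFILT-IITB and IITP-MT descriptions mention) and set src_lang per example before tokenizing.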