| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | NLPHut | INDIC21hi-en | 2021/03/19 16:11:51 | 4589 | 0.714917 | NMT | No | Transformer trained using the PMI data of all languages, then fine-tuned using all hi-en data. |
| 2 | ORGANIZER | INDIC21hi-en | 2021/04/08 17:21:24 | 4793 | 0.736131 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=1e-3. |
| 3 | NICT-5 | INDIC21hi-en | 2021/04/21 15:41:58 | 5278 | 0.808180 | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
| 4 | SRPOL | INDIC21hi-en | 2021/04/21 19:31:15 | 5327 | 0.825005 | NMT | No | Base Transformer on all WAT21 data. |
| 5 | NICT-5 | INDIC21hi-en | 2021/04/22 11:51:35 | 5353 | 0.805716 | NMT | No | mBART+MNMT. Beam 4. |
| 6 | gaurvar | INDIC21hi-en | 2021/04/25 17:35:48 | 5531 | 0.714649 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 7 | gaurvar | INDIC21hi-en | 2021/04/25 18:08:20 | 5532 | 0.714649 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 8 | gaurvar | INDIC21hi-en | 2021/04/25 18:38:48 | 5554 | 0.722822 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 9 | gaurvar | INDIC21hi-en | 2021/04/25 18:41:08 | 5555 | 0.724694 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 10 | gaurvar | INDIC21hi-en | 2021/04/25 18:58:16 | 5567 | 0.722822 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 11 | sakura | INDIC21hi-en | 2021/04/30 22:40:56 | 5872 | 0.834172 | NMT | No | Fine-tuning of a multilingual mBART many-to-many model with the training corpus. |
| 12 | NLPHut | INDIC21hi-en | 2021/05/03 00:15:59 | 5985 | 0.721805 | NMT | No | Transformer with source- and target-language tags, trained using the PMI data of all languages, then fine-tuned using all hi-en data. |
| 13 | IIIT-H | INDIC21hi-en | 2021/05/03 18:12:58 | 6017 | 0.823007 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation on PMI monolingual data, followed by fine-tuning. |
| 14 | CFILT | INDIC21hi-en | 2021/05/04 01:11:55 | 6054 | 0.822034 | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer, with shared encoder and decoder. |
| 15 | coastal | INDIC21hi-en | 2021/05/04 01:44:10 | 6099 | 0.786466 | NMT | No | seq2seq model trained on all WAT2021 data. |
| 16 | CFILT-IITB | INDIC21hi-en | 2021/05/04 01:52:55 | 6115 | 0.775032 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained using the data of all Indo-Aryan languages converted to the same script. |
| 17 | CFILT-IITB | INDIC21hi-en | 2021/05/04 01:57:40 | 6126 | 0.791408 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained using all Indo-Aryan languages with all Indic-language data converted to the same script. |
| 18 | coastal | INDIC21hi-en | 2021/05/04 05:41:33 | 6164 | 0.824040 | NMT | No | mT5 trained only on PMI. |
| 19 | sakura | INDIC21hi-en | 2021/05/04 13:13:06 | 6204 | 0.834538 | NMT | No | Pre-training of a multilingual mBART many-to-many model with the training corpus, followed by fine-tuning on PMI parallel data. |
| 20 | SRPOL | INDIC21hi-en | 2021/05/04 15:25:38 | 6244 | 0.847064 | NMT | No | Ensemble of many-to-one models on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 21 | SRPOL | INDIC21hi-en | 2021/05/04 16:30:25 | 6270 | 0.843456 | NMT | No | Many-to-one model on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 22 | IITP-MT | INDIC21hi-en | 2021/05/04 17:49:10 | 6284 | 0.831265 | NMT | No | Many-to-one model trained on all training data with the base Transformer. All Indic-language data is romanized. Model fine-tuned on the back-translated PMI monolingual corpus. |
| 23 | mcairt | INDIC21hi-en | 2021/05/04 19:12:14 | 6333 | 0.832119 | NMT | No | Multilingual model (many-to-one) trained on all WAT 2021 data using the base Transformer. |
| 24 | NICT-5 | INDIC21hi-en | 2021/06/21 11:55:03 | 6475 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. |
| 25 | NICT-5 | INDIC21hi-en | 2021/06/25 11:48:12 | 6495 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |
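Several of the stronger entries above (rows 3, 5, 11, 19, 24, 25) describe the same recipe: start from a pretrained multilingual mBART checkpoint and fine-tune it on hi-en parallel data. The sketch below illustrates that recipe with the Hugging Face `transformers` and `datasets` libraries; it is not any team's actual code. The checkpoint name is a real public hub model, while the data file name, maximum lengths, and training hyperparameters are illustrative assumptions.

```python
# Minimal sketch: fine-tune a pretrained multilingual mBART model on
# Hindi-English parallel data. Hyperparameters and file paths are
# illustrative assumptions, not a reported configuration.
from datasets import load_dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "facebook/mbart-large-50-many-to-one-mmt"  # public hub checkpoint
tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang="hi_IN", tgt_lang="en_XX"  # mBART-50 language codes
)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Hypothetical layout: a JSON-lines file with {"hi": ..., "en": ...} pairs.
raw = load_dataset("json", data_files={"train": "pmi_hi_en.jsonl"})

def preprocess(batch):
    # Tokenize Hindi source and English target; mBART prefixes language codes.
    model_inputs = tokenizer(batch["hi"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["en"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train = raw["train"].map(preprocess, batched=True, remove_columns=["hi", "en"])

args = Seq2SeqTrainingArguments(
    output_dir="mbart-hi-en",
    per_device_train_batch_size=8,
    learning_rate=3e-5,   # assumed; not reported in the table
    num_train_epochs=5,   # cf. "trained for over 5 epochs" (rows 24-25)
    predict_with_generate=True,
)

# Seq2seq collator pads labels with -100 so they are ignored by the loss.
collator = DataCollatorForSeq2Seq(tokenizer, model=model)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train,
    data_collator=collator,
    tokenizer=tokenizer,
)
trainer.train()
```

The many-to-one variants in the table differ mainly in the data fed to this loop: they concatenate parallel data from several Indic source languages (optionally script-converted or romanized, as in rows 16, 17, and 22) rather than hi-en alone.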