| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | NLPHut | INDIC21en-te | 2021/03/20 00:20:23 | 4618 | 0.749881 | NMT | No | Transformer with a target-language tag, trained on PMI data for all languages, then fine-tuned on en-te PMI data. |
| 2 | ORGANIZER | INDIC21en-te | 2021/04/08 17:26:14 | 4806 | 0.708086 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3. |
| 3 | NICT-5 | INDIC21en-te | 2021/04/21 15:46:17 | 5291 | 0.754015 | NMT | No | mBART pretrained on IndicCorp, fine-tuned on bilingual PMI data. Beam search. The model is bilingual. |
| 4 | SRPOL | INDIC21en-te | 2021/04/21 19:34:10 | 5334 | 0.763271 | NMT | No | Base Transformer on all WAT21 data. |
| 5 | NICT-5 | INDIC21en-te | 2021/04/22 11:54:22 | 5366 | 0.752068 | NMT | No | mBART + MNMT. Beam size 4. |
| 6 | gaurvar | INDIC21en-te | 2021/04/25 20:03:27 | 5587 | 0.634376 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 7 | sakura | INDIC21en-te | 2021/05/01 11:39:52 | 5891 | 0.772064 | NMT | No | Fine-tuning of a multilingual mBART one-to-many model on the training corpus. |
| 8 | gaurvar | INDIC21en-te | 2021/05/01 19:34:57 | 5935 | 0.642502 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 9 | NLPHut | INDIC21en-te | 2021/05/03 00:19:16 | 5986 | 0.713960 | NMT | No | Transformer with source- and target-language tags, trained on PMI data for all languages, then fine-tuned on all en-te data. |
| 10 | mcairt | INDIC21en-te | 2021/05/03 17:25:18 | 5997 | 0.783647 | NMT | No | Multilingual (one-to-many) model trained on all WAT 2021 data using the base Transformer. |
| 11 | IIIT-H | INDIC21en-te | 2021/05/03 18:11:37 | 6014 | 0.780218 | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning. |
| 12 | CFILT | INDIC21en-te | 2021/05/04 01:07:21 | 6051 | 0.789820 | NMT | No | Multilingual one-to-many (En-XX) NMT model based on the Transformer with a shared encoder and decoder. |
| 13 | coastal | INDIC21en-te | 2021/05/04 01:40:50 | 6088 | 0.778251 | NMT | No | Seq2seq model trained on all WAT2021 data. |
| 14 | sakura | INDIC21en-te | 2021/05/04 04:19:16 | 6160 | 0.785055 | NMT | No | Pretraining of a multilingual mBART one-to-many model on the training corpus, followed by fine-tuning on PMI parallel data. |
| 15 | SRPOL | INDIC21en-te | 2021/05/04 15:23:24 | 6241 | 0.791085 | NMT | No | Ensemble of one-to-many models on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 16 | SRPOL | INDIC21en-te | 2021/05/04 16:28:56 | 6267 | 0.792970 | NMT | No | One-to-many model on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 17 | IITP-MT | INDIC21en-te | 2021/05/04 18:18:33 | 6305 | 0.764977 | NMT | No | One-to-many model trained on all training data with the base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
| 18 | NICT-5 | INDIC21en-te | 2021/06/25 11:39:47 | 6492 | 0.000000 | NMT | No | PMI and PIB data used to fine-tune an mBART model trained for over 5 epochs. MNMT model. |
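Most of the one-to-many (En-XX) entries above (NLPHut, mcairt, CFILT, SRPOL, IITP-MT) share one mechanism: a language tag prepended to each source sentence lets a single shared encoder-decoder serve every target language, after which the model can be fine-tuned on one pair such as en-te. The sketch below shows only that data-preparation step; the `<2xx>` tag format and the toy sentence pairs are illustrative assumptions, not any team's actual code.

```python
# Minimal sketch of the target-language-tag trick used by the
# one-to-many systems in the table. Tag syntax and example data are
# illustrative assumptions.

from typing import Iterable, Iterator, Tuple


def tag_source(pairs: Iterable[Tuple[str, str, str]]) -> Iterator[Tuple[str, str]]:
    """Yield (tagged_source, target) pairs.

    `pairs` holds (target_lang, source_sentence, target_sentence)
    triples drawn from all language directions of the training corpus.
    """
    for tgt_lang, src, tgt in pairs:
        # The tag is an ordinary token; the shared model learns to
        # condition its decoder output on it.
        yield f"<2{tgt_lang}> {src}", tgt


# Toy parallel data covering two target languages.
corpus = [
    ("te", "The weather is nice today.", "ఈ రోజు వాతావరణం బాగుంది."),
    ("hi", "The weather is nice today.", "आज मौसम अच्छा है।"),
]

for src, tgt in tag_source(corpus):
    print(src, "=>", tgt)
```

Because the tag is just another vocabulary token, the same tagged corpus can first train the multilingual model and later be filtered to a single pair (e.g. only `<2te>` lines) for the fine-tuning stage several teams report.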
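The mBART-based entries (NICT-5, sakura) instead select the output language at decoding time by forcing a language-code token as the first decoder token. A minimal sketch, assuming the public `facebook/mbart-large-50-many-to-many-mmt` checkpoint as a stand-in for the teams' custom mBART models (which were pretrained on IndicCorp rather than downloaded):

```python
# Hedged sketch of one-to-many mBART decoding. The public mBART-50
# checkpoint stands in for the teams' own pretrained models; it already
# covers en_XX -> te_IN.

from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Mark the input as English and tokenize it.
tokenizer.src_lang = "en_XX"
inputs = tokenizer("The weather is nice today.", return_tensors="pt")

# Forcing the Telugu language code as the first decoder token plays the
# same role as the <2te> source tag above; num_beams=4 mirrors the
# "beam size 4" setting reported by NICT-5.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["te_IN"],
    num_beams=4,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```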