# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
--- | --- | --- | --- | --- | --- | --- | --- | ---
1 | SRPOL | INDIC21en-ml | 2021/05/04 16:26:16 | 6262 | 0.808089 | NMT | No | One-to-many model on all data. Pretrained on BT, fine-tuned on PMI.
2 | SRPOL | INDIC21en-ml | 2021/05/04 15:20:32 | 6236 | 0.807998 | NMT | No | Ensemble of one-to-many models on all data. Pretrained on BT, fine-tuned on PMI.
3 | CFILT | INDIC21en-ml | 2021/05/04 01:01:47 | 6046 | 0.805291 | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on Transformer with shared encoder and decoder.
4 | sakura | INDIC21en-ml | 2021/05/04 04:10:37 | 6154 | 0.801941 | NMT | No | Pre-trained multilingual mBART one-to-many model on the training corpus, followed by fine-tuning on PMI parallel data.
5 | SRPOL | INDIC21en-ml | 2021/04/21 19:20:16 | 5319 | 0.798649 | NMT | No | Base Transformer on all WAT21 data.
6 | sakura | INDIC21en-ml | 2021/05/01 11:30:09 | 5886 | 0.794481 | NMT | No | Fine-tuning of a multilingual mBART one-to-many model on the training corpus.
7 | mcairt | INDIC21en-ml | 2021/05/03 17:52:33 | 6002 | 0.793308 | NMT | No | Multilingual (one-to-many) model trained on all WAT 2021 data using a base Transformer.
8 | NICT-5 | INDIC21en-ml | 2021/04/22 11:52:07 | 5356 | 0.789337 | NMT | No | MBART+MNMT. Beam 4.
9 | coastal | INDIC21en-ml | 2021/05/04 01:38:17 | 6081 | 0.784292 | NMT | No | Seq2seq model trained on all WAT2021 data.
10 | NICT-5 | INDIC21en-ml | 2021/04/21 15:43:18 | 5281 | 0.764924 | NMT | No | Pretrained MBART on IndicCorp and fine-tuned on bilingual PMI data. Beam search. Model is bilingual.
11 | IITP-MT | INDIC21en-ml | 2021/05/04 17:56:02 | 6287 | 0.758960 | NMT | No | One-to-many model trained on all training data with a base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus.
12 | IIIT-H | INDIC21en-ml | 2021/05/03 18:08:56 | 6009 | 0.745043 | NMT | No | MNMT system (En-XX) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation on PMI monolingual data, followed by fine-tuning.
13 | NLPHut | INDIC21en-ml | 2021/03/19 16:16:46 | 4590 | 0.740136 | NMT | No | Transformer with target-language tag trained on all languages' PMI data, then fine-tuned on en-ml PMI data.
14 | ORGANIZER | INDIC21en-ml | 2021/04/08 17:22:55 | 4796 | 0.706782 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=1e-3.
15 | gaurvar | INDIC21en-ml | 2021/04/25 19:58:41 | 5582 | 0.666547 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
16 | gaurvar | INDIC21en-ml | 2021/05/01 19:30:52 | 5930 | 0.656847 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
17 | NICT-5 | INDIC21en-ml | 2021/06/25 11:38:07 | 6487 | 0.000000 | NMT | No | mBART model trained for over 5 epochs, fine-tuned on PMI and PIB data. MNMT model.