# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
--- | --- | --- | --- | --- | --- | --- | --- | ---
1 | NLPHut | INDIC21en-hi | 2021/03/21 12:39:57 | 4632 | 0.756560 | NMT | No | Transformer with target language tag trained using all languages PMI data. Then finetuned using en-hi PMI data.
2 | ORGANIZER | INDIC21en-hi | 2021/04/08 17:21:01 | 4792 | 0.759679 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3.
3 | NICT-5 | INDIC21en-hi | 2021/04/21 15:41:40 | 5277 | 0.800234 | NMT | No | Pretrain MBART on IndicCorp and FT on bilingual PMI data. Beam search. Model is bilingual.
4 | SRPOL | INDIC21en-hi | 2021/04/21 17:48:49 | 5307 | 0.822402 | NMT | No | Base transformer on all WAT21 data.
5 | NICT-5 | INDIC21en-hi | 2021/04/22 11:51:24 | 5352 | 0.801680 | NMT | No | MBART+MNMT. Beam 4.
6 | gaurvar | INDIC21en-hi | 2021/04/25 19:45:07 | 5575 | 0.667688 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
7 | gaurvar | INDIC21en-hi | 2021/04/25 19:47:11 | 5576 | 0.650164 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
8 | gaurvar | INDIC21en-hi | 2021/04/25 19:50:32 | 5577 | 0.675613 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
9 | gaurvar | INDIC21en-hi | 2021/04/25 19:53:06 | 5578 | 0.676601 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
10 | sakura | INDIC21en-hi | 2021/05/01 11:26:34 | 5884 | 0.810856 | NMT | No | Fine-tuning of multilingual mBART one2many model with training corpus.
11 | gaurvar | INDIC21en-hi | 2021/05/01 19:29:03 | 5928 | 0.681511 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages.
12 | gaurvar | INDIC21en-hi | 2021/05/01 19:38:48 | 5936 | 0.681514 | NMT | No |
13 | NLPHut | INDIC21en-hi | 2021/05/03 00:21:26 | 5987 | 0.747598 | NMT | No | Transformer with target language tag trained using all languages PMI data. Then fine-tuned using all en-hi data.
14 | mcairt | INDIC21en-hi | 2021/05/03 18:00:48 | 6004 | 0.822626 | NMT | No | Multilingual (one-to-many) model trained on all WAT 2021 data using a base Transformer.
15 | IIIT-H | INDIC21en-hi | 2021/05/03 18:07:52 | 6007 | 0.822836 | NMT | No | MNMT system (En-XX) trained via exploiting lexical similarity on PMI+CVIT parallel corpus, then improved using back translation on PMI monolingual data followed by fine-tuning.
16 | CFILT | INDIC21en-hi | 2021/05/04 00:59:40 | 6043 | 0.821713 | NMT | No | Multilingual (one-to-many, En-XX) NMT model based on Transformer with shared encoder and decoder.
17 | coastal | INDIC21en-hi | 2021/05/04 01:37:06 | 6079 | 0.801179 | NMT | No | seq2seq model trained on all WAT2021 data.
18 | sakura | INDIC21en-hi | 2021/05/04 04:07:55 | 6152 | 0.816999 | NMT | No | Pre-training multilingual mBART one2many model with training corpus followed by fine-tuning on PMI parallel data.
19 | SRPOL | INDIC21en-hi | 2021/05/04 15:46:51 | 6254 | 0.824649 | NMT | No | Ensemble of one-to-many on all data. Pretrained on BT, finetuned on PMI.
20 | SRPOL | INDIC21en-hi | 2021/05/04 16:25:17 | 6260 | 0.822371 | NMT | No | One-to-many on all data. Pretrained on BT, finetuned on PMI.
21 | IITP-MT | INDIC21en-hi | 2021/05/04 17:47:15 | 6283 | 0.820543 | NMT | No | One-to-Many model trained on all training data with base Transformer. All Indic language data is romanized. Model fine-tuned on BT PMI monolingual corpus.
22 | NICT-5 | INDIC21en-hi | 2021/06/25 11:36:13 | 6485 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning on an mBART model trained for over 5 epochs. MNMT model.
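
Several of the stronger entries above (NICT-5, sakura, IITP-MT) describe the same broad recipe: start from a pretrained multilingual mBART-style model, fine-tune it on the PMI en-hi parallel data, and decode with beam search. The sketch below is a minimal illustration of that recipe using the Hugging Face transformers library; it is not any team's actual pipeline, and the checkpoint name, data file, language codes, and hyperparameters are assumptions made for the example.

```python
# Illustrative sketch only: fine-tune a pretrained mBART checkpoint on
# English->Hindi parallel data, loosely following the "pretrain mBART,
# then fine-tune on PMI parallel data" recipe several entries describe.
# Checkpoint, data file, and hyperparameters are assumptions, not any
# team's actual configuration.
from transformers import (
    MBartForConditionalGeneration,
    MBart50TokenizerFast,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    DataCollatorForSeq2Seq,
)
from datasets import load_dataset

model_name = "facebook/mbart-large-50"  # assumed stand-in for a custom-pretrained mBART
tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang="en_XX", tgt_lang="hi_IN"
)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Hypothetical parallel corpus in JSON-lines form: {"en": ..., "hi": ...}
raw = load_dataset("json", data_files={"train": "pmi_en_hi_train.jsonl"})

def preprocess(batch):
    # Tokenize source and target sides; text_target handles the target-language tag.
    return tokenizer(
        batch["en"], text_target=batch["hi"], truncation=True, max_length=128
    )

train_data = raw["train"].map(preprocess, batched=True, remove_columns=["en", "hi"])

args = Seq2SeqTrainingArguments(
    output_dir="mbart-en-hi-ft",
    per_device_train_batch_size=16,
    learning_rate=3e-5,
    num_train_epochs=3,
    save_strategy="epoch",
    logging_steps=100,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

# Decode with beam search (several entries report beam sizes of 4-5).
sample = tokenizer("A sentence to translate.", return_tensors="pt").to(model.device)
out = model.generate(
    **sample,
    forced_bos_token_id=tokenizer.lang_code_to_id["hi_IN"],
    num_beams=4,
    max_length=128,
)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```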