| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|---------------------|
| 1 | NLPHut | INDIC21or-en | 2021/03/19 16:31:18 | 4597 | 0.740606 | NMT | No | Transformer trained using all or-en data and all hi-en data, then fine-tuned using all or-en data. |
| 2 | ORGANIZER | INDIC21or-en | 2021/04/08 17:24:42 | 4801 | 0.730819 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3. |
| 3 | NICT-5 | INDIC21or-en | 2021/04/21 15:44:47 | 5286 | 0.780431 | NMT | No | Pretrain mBART on IndicCorp and fine-tune on bilingual PMI data. Beam search. Model is bilingual. |
| 4 | SRPOL | INDIC21or-en | 2021/04/21 19:33:01 | 5331 | 0.794017 | NMT | No | Base Transformer on all WAT21 data. |
| 5 | NICT-5 | INDIC21or-en | 2021/04/22 11:53:28 | 5361 | 0.782917 | NMT | No | mBART + MNMT. Beam 4. |
| 6 | gaurvar | INDIC21or-en | 2021/04/25 18:18:26 | 5541 | 0.726829 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 7 | gaurvar | INDIC21or-en | 2021/04/25 18:34:48 | 5550 | 0.725121 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 8 | gaurvar | INDIC21or-en | 2021/04/25 18:48:40 | 5561 | 0.718668 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 9 | gaurvar | INDIC21or-en | 2021/04/25 19:01:23 | 5571 | 0.721531 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 10 | sakura | INDIC21or-en | 2021/04/30 22:55:20 | 5876 | 0.808239 | NMT | No | Fine-tuning of multilingual mBART many-to-many model with the training corpus. |
| 11 | IIIT-H | INDIC21or-en | 2021/05/03 18:15:02 | 6022 | 0.804930 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved using back-translation on PMI monolingual data, followed by fine-tuning. |
| 12 | CFILT | INDIC21or-en | 2021/05/04 01:16:29 | 6058 | 0.793769 | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer with shared encoder and decoder. |
| 13 | coastal | INDIC21or-en | 2021/05/04 01:46:05 | 6107 | 0.727657 | NMT | No | seq2seq model trained on all WAT2021 data. |
| 14 | CFILT-IITB | INDIC21or-en | 2021/05/04 01:54:16 | 6119 | 0.770941 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained using all Indic language data converted to the same script. |
| 15 | CFILT-IITB | INDIC21or-en | 2021/05/04 01:59:27 | 6128 | 0.780009 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with shared encoder-decoder and shared BPE vocabulary, trained using all Indo-Aryan language data converted to the same script. |
| 16 | sakura | INDIC21or-en | 2021/05/04 13:19:36 | 6208 | 0.805953 | NMT | No | Pre-training of a multilingual mBART many-to-many model with the training corpus, followed by fine-tuning on PMI parallel data. |
| 17 | SRPOL | INDIC21or-en | 2021/05/04 15:27:42 | 6248 | 0.817318 | NMT | No | Ensemble of many-to-one models on all data. Pretrained on BT, fine-tuned on PMI. |
| 18 | SRPOL | INDIC21or-en | 2021/05/04 16:32:35 | 6274 | 0.814871 | NMT | No | Many-to-one on all data. Pretrained on BT, fine-tuned on PMI. |
| 19 | IITP-MT | INDIC21or-en | 2021/05/04 18:07:20 | 6294 | 0.803226 | NMT | No | Many-to-one model trained on all training data with a base Transformer. All Indic language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
| 20 | mcairt | INDIC21or-en | 2021/05/04 19:21:48 | 6338 | 0.795586 | NMT | No | Multilingual model (many-to-one) trained on all WAT 2021 data using a base Transformer. |
| 21 | NICT-5 | INDIC21or-en | 2021/06/21 12:04:38 | 6478 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. |
| 22 | NICT-5 | INDIC21or-en | 2021/06/25 11:49:54 | 6499 | 0.000000 | NMT | No | Using PMI and PIB data for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |
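
Most of the higher-scoring entries above follow the same broad recipe: start from a pretrained multilingual mBART-style model and fine-tune it on the PMI parallel data, sometimes after back-translation. The snippet below is a minimal sketch of that fine-tuning step with Hugging Face transformers, not any team's actual system; the checkpoint name, language pair, toy data, output path, and hyperparameters are illustrative assumptions (Hindi→English is shown because Odia is not among the public mBART-50 language codes, which is why teams here pretrained their own models on IndicCorp).

```python
# Minimal sketch: fine-tune a pretrained mBART-50 checkpoint on a tiny
# parallel corpus. All names and values below are assumptions for
# illustration only.
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "facebook/mbart-large-50-many-to-many-mmt"  # assumed base model
tokenizer = MBart50TokenizerFast.from_pretrained(
    checkpoint, src_lang="hi_IN", tgt_lang="en_XX"  # hi-en shown for illustration
)
model = MBartForConditionalGeneration.from_pretrained(checkpoint)

# Toy parallel data standing in for the PMI training corpus.
pairs = Dataset.from_dict({
    "src": ["नमस्ते दुनिया"],
    "tgt": ["Hello world"],
})

def preprocess(batch):
    # Tokenize source and target; text_target routes the targets through
    # the tokenizer with the tgt_lang code configured above.
    return tokenizer(batch["src"], text_target=batch["tgt"],
                     max_length=128, truncation=True)

tokenized = pairs.map(preprocess, batched=True, remove_columns=["src", "tgt"])

args = Seq2SeqTrainingArguments(
    output_dir="mbart-ft",              # assumed output path
    per_device_train_batch_size=8,
    num_train_epochs=5,
    learning_rate=3e-5,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```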