| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|------------------|--------------------|
| 1 | ORGANIZER | INDIC21bn-en | 2021/04/08 17:18:57 | 4789 | 0.613093 | NMT | No | Bilingual baseline trained on PMI data. Transformer base. LR=10^-3. |
| 2 | NICT-5 | INDIC21bn-en | 2021/04/21 15:40:18 | 5274 | 0.744400 | NMT | No | mBART pretrained on IndicCorp and fine-tuned on bilingual PMI data. Beam search. Model is bilingual. |
| 3 | SRPOL | INDIC21bn-en | 2021/04/21 19:30:14 | 5325 | 0.769864 | NMT | No | Base Transformer on all WAT21 data. |
| 4 | NICT-5 | INDIC21bn-en | 2021/04/22 11:50:48 | 5349 | 0.758921 | NMT | No | mBART + MNMT. Beam size 4. |
| 5 | gaurvar | INDIC21bn-en | 2021/04/25 18:09:37 | 5533 | 0.660262 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 6 | gaurvar | INDIC21bn-en | 2021/04/25 18:26:00 | 5542 | 0.660707 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 7 | gaurvar | INDIC21bn-en | 2021/04/25 18:28:50 | 5544 | 0.669440 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 8 | gaurvar | INDIC21bn-en | 2021/04/25 18:42:42 | 5556 | 0.673457 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 9 | gaurvar | INDIC21bn-en | 2021/04/25 18:55:51 | 5565 | 0.674034 | NMT | No | Multi-task multilingual T5 trained for multiple Indic languages. |
| 10 | sakura | INDIC21bn-en | 2021/04/30 22:31:37 | 5870 | 0.772365 | NMT | No | Fine-tuning of a multilingual mBART many-to-many model with the training corpus. |
| 11 | IIIT-H | INDIC21bn-en | 2021/05/03 18:12:09 | 6015 | 0.773292 | NMT | No | MNMT system (XX-En) trained by exploiting lexical similarity on the PMI+CVIT parallel corpus, then improved with back-translation on PMI monolingual data, followed by fine-tuning. |
| 12 | mcairt | INDIC21bn-en | 2021/05/03 20:10:04 | 6026 | 0.778620 | NMT | No | Multilingual (many-to-one) model trained on all WAT 2021 data using a base Transformer. |
| 13 | CFILT | INDIC21bn-en | 2021/05/04 01:09:38 | 6052 | 0.766461 | NMT | No | Multilingual (many-to-one, XX-En) NMT model based on the Transformer with a shared encoder and decoder. |
| 14 | coastal | INDIC21bn-en | 2021/05/04 01:43:08 | 6094 | 0.752395 | NMT | No | Seq2seq model trained on all WAT 2021 data. |
| 15 | CFILT-IITB | INDIC21bn-en | 2021/05/04 01:52:00 | 6112 | 0.730379 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with a shared encoder-decoder and shared BPE vocabulary, trained on all Indic-language data converted to the same script. |
| 16 | CFILT-IITB | INDIC21bn-en | 2021/05/04 01:56:40 | 6124 | 0.734491 | NMT | No | Multilingual NMT (many-to-one): Transformer-based model with a shared encoder-decoder and shared BPE vocabulary, trained on all Indo-Aryan language data converted to the same script. |
| 17 | coastal | INDIC21bn-en | 2021/05/04 05:40:41 | 6162 | 0.778356 | NMT | No | mT5 trained only on PMI. |
| 18 | sakura | INDIC21bn-en | 2021/05/04 13:08:42 | 6202 | 0.772925 | NMT | No | Pre-training of a multilingual mBART many-to-many model with the training corpus, followed by fine-tuning on PMI parallel data. |
| 19 | SRPOL | INDIC21bn-en | 2021/05/04 15:23:57 | 6242 | 0.789735 | NMT | No | Ensemble of many-to-one models on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 20 | SRPOL | INDIC21bn-en | 2021/05/04 16:29:22 | 6268 | 0.792364 | NMT | No | Many-to-one model on all data. Pretrained on back-translated data, fine-tuned on PMI. |
| 21 | IITP-MT | INDIC21bn-en | 2021/05/04 17:37:47 | 6280 | 0.777377 | NMT | No | Many-to-one model trained on all training data with a base Transformer. All Indic-language data is romanized. Model fine-tuned on back-translated PMI monolingual corpus. |
| 22 | mcairt | INDIC21bn-en | 2021/05/04 19:06:49 | 6332 | 0.786717 | NMT | No | Multilingual (many-to-one) model trained on all WAT 2021 data using a base Transformer. |
| 23 | NICT-5 | INDIC21bn-en | 2021/06/25 11:46:56 | 6493 | 0.000000 | NMT | No | PMI and PIB data used for fine-tuning an mBART model trained for over 5 epochs. MNMT model. |
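
Several of the entries above (NICT-5, sakura, NICT-5's MNMT runs) describe the same general recipe: start from a pretrained multilingual mBART-style model and fine-tune it on the PMI parallel data, decoding Bengali into English with beam search. The snippet below is a minimal sketch of that kind of setup, assuming the Hugging Face `transformers` library and the public `facebook/mbart-large-50-many-to-many-mmt` checkpoint; the teams' actual toolkits, checkpoints, fine-tuning pipelines, and hyperparameters are not specified in the table, so this is illustrative only.

```python
# Minimal sketch of an mBART-style Bengali->English translation step,
# as used (after fine-tuning on PMI parallel data) by several systems above.
# Assumes the public facebook/mbart-large-50-many-to-many-mmt checkpoint;
# the actual team systems use their own fine-tuned models.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)
model = MBartForConditionalGeneration.from_pretrained(model_name)

# Source side: Bengali; target side: English (the INDIC21bn-en direction).
tokenizer.src_lang = "bn_IN"
sentence = "আমি বই পড়তে ভালোবাসি।"  # example Bengali input
inputs = tokenizer(sentence, return_tensors="pt")

# Force English as the first generated token; beam search as in several entries.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
    num_beams=4,
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```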