# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | CNLP-NITS | MMEVMM24en-hi | 2020/09/18 15:59:02 | 3896 | 0.820980 | NMT | Yes | GloVe embeddings pretrained on monolingual data (IITB), fine-tuned with parallel data and visual features in training, using a BRNN encoder and doubly-attentive RNN decoder.
2 | ORGANIZER | MMEVMM24en-hi | 2020/11/07 01:49:40 | 4182 | 0.691950 | NMT | No |
3 | ORGANIZER | MMEVMM24en-hi | 2020/11/07 01:54:46 | 4183 | 0.772870 | NMT | No |
4 | CNLP-NITS-PP | MMEVMM24en-hi | 2021/04/28 00:05:19 | 5731 | 0.641430 | NMT | Yes | GloVe embeddings pretrained on monolingual data (IITB), fine-tuned with parallel data (WAT21 train data + extracted phrase pairs from WAT21 train data + IITB train data) and visual features in training, using a BRNN encoder and doubly-attentive RNN decoder.
5 | iitp | MMEVMM24en-hi | 2021/05/01 23:19:14 | 5941 | 0.629444 | NMT | No | Removed special characters at the start and end of each sentence. 1. Pre-trained with HindEnCorp, trained with Visual Genome. 2. Trained with Visual Genome. Selected the best of the two for each sentence according to translation quality.
6 | Volta | MMEVMM24en-hi | 2021/05/25 13:55:52 | 6428 | 0.839100 | NMT | Yes | Finetuned mBART (used IITB for data augmentation) and added object tags to the input using Mask RCNN.
7 | CNLP-NITS-PP | MMEVMM24en-hi | 2022/07/11 12:31:32 | 6740 | 0.000000 | NMT | No | Transliteration-based phrase pairs augmentation and visual features in training using BRNN encoder and doubly-attentive-rnn decoder.
8 | SILO_NLP | MMEVMM24en-hi | 2022/07/18 17:31:41 | 6958 | 0.000000 | NMT | Yes | Object tags (image) + Flickr8 dataset as additional resource + finetuned mBART.
9 | ODIAGEN | MMEVMM24en-hi | 2023/07/06 03:45:52 | 7105 | 0.000000 | NMT | No | Image features extracted as object tags appended to the text, with mBART fine-tuning.
10 | BITS-P | MMEVMM24en-hi | 2023/07/08 13:46:53 | 7125 | 0.000000 | NMT | Yes | NLLB model finetuned on captions + object tags of original & synthetic images using the DETR model.
11 | 00-7 | MMEVMM24en-hi | 2024/08/11 13:50:49 | 7325 | 0.000000 | NMT | No |
12 | v036 | MMEVMM24en-hi | 2024/08/11 21:39:07 | 7345 | 0.000000 | NMT | No | NMT-based system using both image descriptors and text descriptions. A multistage LLM pipeline was used for extracting image data descriptions and for translation. Fine-tuning was done in a few cases.
13 | DCU_NMT | MMEVMM24en-hi | 2024/08/11 23:03:33 | 7351 | 0.000000 | NMT | No | Context-aware model that uses image caption data extracted from images as context.
14 | DCU_NMT | MMEVMM24en-hi | 2024/08/13 03:24:09 | 7371 | 0.000000 | NMT | No | NMT system trained on constrained resources using BERT-encoded context extracted from the visual representation of the training data. The context is used only on the source side.
15 | 239233 | MMEVMM24en-hi | 2024/08/13 07:30:13 | 7379 | 0.000000 | NMT | Yes | One-shot prompt for synthetic QA description from captions; translate QA using IndicTrans2; generate caption from QA as context.
16 | UNLP | MMEVMM24en-hi | 2024/08/13 17:57:39 | 7392 | 0.000000 | NMT | No | Using the Transformer-based Gated Fusion model to integrate both text and visual data.
17 | v036 | MMEVMM24en-hi | 2024/08/14 20:34:40 | 7398 | 0.000000 | NMT | No |
18 | v036 | MMEVMM24en-hi | 2024/08/15 18:52:02 | 7408 | 0.000000 | NMT | No |
19 | v036 | MMEVMM24en-hi | 2024/08/15 19:27:23 | 7411 | 0.000000 | NMT | No |
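Several of the submissions above fuse visual features with the text representation rather than appending image information as extra tokens (e.g., row 16's Transformer-based Gated Fusion model). The sketch below is a minimal illustration of the general gated-fusion idea in PyTorch, assuming hypothetical names and dimensions (`GatedFusion`, `text_dim`, `image_dim`); it is not any team's actual implementation.

```python
# Minimal gated-fusion sketch (illustrative only, not a submission's code).
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Fuse text encoder states with a global image feature via a learned gate."""
    def __init__(self, text_dim: int, image_dim: int):
        super().__init__()
        # Project the image feature into the text representation space.
        self.img_proj = nn.Linear(image_dim, text_dim)
        # The gate looks at both modalities and decides, per dimension,
        # how much visual information to mix into each text state.
        self.gate = nn.Linear(2 * text_dim, text_dim)

    def forward(self, h_text: torch.Tensor, h_image: torch.Tensor) -> torch.Tensor:
        # h_text:  (batch, seq_len, text_dim) encoder states
        # h_image: (batch, image_dim) global image feature, broadcast over the sequence
        h_img = self.img_proj(h_image).unsqueeze(1).expand_as(h_text)
        g = torch.sigmoid(self.gate(torch.cat([h_text, h_img], dim=-1)))
        return (1 - g) * h_text + g * h_img

# Toy usage: 2 sentences of length 5, 512-dim text states, 2048-dim image features.
fusion = GatedFusion(text_dim=512, image_dim=2048)
fused = fusion(torch.randn(2, 5, 512), torch.randn(2, 2048))
print(fused.shape)  # torch.Size([2, 5, 512])
```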