| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|---------------------|
| 1 | ORGANIZER | MMCHTEXT24en-bn | 2022/07/06 17:12:40 | 6704 | 0.000000 | NMT | No | Transformer base |
| 2 | CNLP-NITS-PP | MMCHTEXT24en-bn | 2022/07/11 12:58:35 | 6745 | 0.000000 | NMT | No | Transliteration-based phrase-pair augmentation in training using BRNN-based NMT |
| 3 | SILO_NLP | MMCHTEXT24en-bn | 2022/07/12 17:28:09 | 6843 | 0.000000 | NMT | No | Fine-tuning with the pre-trained mBART-50 model |
| 4 | nlp_novices | MMCHTEXT24en-bn | 2022/07/18 20:48:02 | 6970 | 0.000000 | NMT | Yes | Fine-tuned Transformers |
| 5 | ODIAGEN | MMCHTEXT24en-bn | 2023/07/03 13:42:15 | 7090 | 0.000000 | NMT | No | Fine-tuning Transformer using NLLB-200 from Facebook |
| 6 | 00-7 | MMCHTEXT24en-bn | 2024/08/11 13:12:20 | 7321 | 0.000000 | NMT | Yes | |
| 7 | ODIAGEN | MMCHTEXT24en-bn | 2024/08/11 19:49:54 | 7336 | 0.000000 | Other | No | LLM-based (Mistral-7B fine-tuning) |