| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | ORGANIZER | MMEVTEXT23en-bn | 2022/07/06 17:11:37 | 6703 | 0.000000 | NMT | No | Transformer base model |
| 2 | CNLP-NITS-PP | MMEVTEXT23en-bn | 2022/07/11 13:01:14 | 6746 | 0.000000 | NMT | No | Transliteration-based phrase-pair augmentation in training, using BRNN-based NMT |
| 3 | SILO_NLP | MMEVTEXT23en-bn | 2022/07/18 13:20:11 | 6954 | 0.000000 | NMT | No | Transformer base + additional data (BNLIT) |
| 4 | nlp_novices | MMEVTEXT23en-bn | 2022/07/18 21:09:29 | 6971 | 0.000000 | NMT | Yes | Fine-tuned Transformers |
| 5 | ODIAGEN | MMEVTEXT23en-bn | 2023/07/03 13:39:53 | 7089 | 0.000000 | NMT | No | Fine-tuning a Transformer using NLLB-200 from Facebook |