# | Team | Task | Date/Time | DataID | RIBES (juman) | RIBES (kytea) | RIBES (mecab) | RIBES (moses-tokenizer) | RIBES (stanford-segmenter-ctb) | RIBES (stanford-segmenter-pku) | RIBES (indic-tokenizer) | RIBES (unuse) | RIBES (myseg) | RIBES (kmseg) | Method | Other Resources | System Description
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | ---
1 | WT | HINDENhi-en | 2020/09/03 18:12:32 | 3638 | - | - | - | 0.792065 | - | - | - | - | - | - | NMT | No | Used 5M back-translated News Crawl data for training. Method: Transformer NMT. Preprocessing: 1. removed mixed-language sentences; 2. Moses tokeniser for English; IndicNLP normaliser and tokeniser for Hindi |
2 | cvit | HINDENhi-en | 2020/08/18 05:27:08 | 3446 | - | - | - | 0.777445 | - | - | - | - | - | - | NMT | Yes | |
3 | cvit | HINDENhi-en | 2020/07/10 04:28:17 | 3434 | - | - | - | 0.775515 | - | - | - | - | - | - | NMT | Yes | xx-to-en model uses PIB-v2 data |
4 | cvit | HINDENhi-en | 2020/07/20 20:38:09 | 3441 | - | - | - | 0.774830 | - | - | - | - | - | - | NMT | Yes | xx-en model, uses PIB-v2 data |
5 | cvit | HINDENhi-en | 2020/07/06 02:29:11 | 3419 | - | - | - | 0.774354 | - | - | - | - | - | - | NMT | Yes | XX-to-EN Model, uses PIB-V1 Data |
6 | cvit | HINDENhi-en | 2020/06/10 15:37:12 | 3418 | - | - | - | 0.770450 | - | - | - | - | - | - | NMT | Yes | XX-to-EN model, uses PIB-V0 dataset |
7 | cvit | HINDENhi-en | 2019/05/27 16:04:36 | 2681 | - | - | - | 0.768324 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | massive-multi + bt |
8 | cvit | HINDENhi-en | 2020/07/06 06:38:14 | 3423 | - | - | - | 0.766637 | - | - | - | - | - | - | NMT | Yes | Multilingual model, uses PIB-V0 data (mm-all-iter1) |
9 | cvit | HINDENhi-en | 2019/03/15 01:23:17 | 2643 | - | - | - | 0.766180 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | massive-multi |
10 | cvit | HINDENhi-en | 2020/07/06 06:22:02 | 3422 | - | - | - | 0.763418 | - | - | - | - | - | - | NMT | Yes | Multilingual model, mm-all-iter0 |
11 | cvit | HINDENhi-en | 2019/03/15 01:33:22 | 2645 | - | - | - | 0.758910 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | massive-multi + ft |
12 | cvit | HINDENhi-en | 2018/11/06 15:51:54 | 2563 | - | - | - | 0.755941 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | |
13 | cvit | HINDENhi-en | 2018/09/14 13:21:46 | 2331 | - | - | - | 0.751883 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | ConvS2S model; uses external data |
14 | XMUNLP | HINDENhi-en | 2017/07/27 23:00:46 | 1511 | - | - | - | 0.750921 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | ensemble of 4 nmt models + monolingual data |
15 | XMUNLP | HINDENhi-en | 2017/07/26 22:54:46 | 1488 | - | - | - | 0.743656 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | single nmt model + monolingual data |
16 | NICT-5 | HINDENhi-en | 2019/07/23 17:36:36 | 2865 | - | - | - | 0.741197 | - | - | - | - | - | - | NMT | No | Hi-En and Ta-En mixed-training NMT model; Transformer on t2t (Hi-En is external data) |
17 | LTRC-MT | HINDENhi-en | 2019/07/27 04:49:36 | 3119 | - | - | - | 0.735358 | - | - | - | - | - | - | NMT | No | Transformer Model with Backtranslation |
18 | LTRC-MT | HINDENhi-en | 2019/07/27 05:34:14 | 3121 | - | - | - | 0.735357 | - | - | - | - | - | - | NMT | No | LSTM with attention, Backtranslation, Reinforcement Learning for 1 epoch |
19 | CUNI | HINDENhi-en | 2018/09/15 03:10:30 | 2381 | - | - | - | 0.731727 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | Transformer big, transfer learning from CS-EN 1M steps, only original HI-EN, beam=8; alpha=0.8; averaging of last 8 models after 230k steps |
20 | LTRC-MT | HINDENhi-en | 2019/07/27 04:04:09 | 3117 | - | - | - | 0.729072 | - | - | - | - | - | - | NMT | No | Transformer Baseline, Only IIT-B data |
21 | LTRC-MT | HINDENhi-en | 2019/07/27 05:58:33 | 3124 | - | - | - | 0.729059 | - | - | - | - | - | - | NMT | No | LSTM with global attention & Backtranslation |
22 | ORGANIZER | HINDENhi-en | 2018/11/13 14:57:12 | 2567 | - | - | - | 0.718751 | - | - | - | - | 0.000000 | 0.000000 | NMT | No | NMT with Attention |
23 | ORGANIZER | HINDENhi-en | 2016/07/26 10:04:53 | 1031 | - | - | - | 0.714537 | - | - | - | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online A (2016) |
24 | XMUNLP | HINDENhi-en | 2017/07/24 08:47:29 | 1427 | - | - | - | 0.697707 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | single nmt model |
25 | ORGANIZER | HINDENhi-en | 2016/07/26 13:25:18 | 1048 | - | - | - | 0.683214 | - | - | - | 0.000000 | 0.000000 | 0.000000 | Other | Yes | Online B (2016) |
26 | IITB-MTG | HINDENhi-en | 2017/08/01 15:10:09 | 1726 | - | - | - | 0.682902 | - | - | - | 0.000000 | 0.000000 | 0.000000 | NMT | No | NMT with ensemble (last 3 + best validation) |
27 | cvit | HINDENhi-en | 2019/03/22 05:52:47 | 2658 | - | - | - | 0.667353 | - | - | - | - | 0.000000 | 0.000000 | NMT | Yes | many to en (Transformer model) trained on WAT2018 data. Detokenized! |
28 | ORGANIZER | HINDENhi-en | 2016/07/26 15:44:20 | 1054 | - | - | - | 0.638090 | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Phrase-based SMT |
29 | IITP-MT | HINDENhi-en | 2016/08/29 15:10:41 | 1289 | - | - | - | 0.628666 | - | - | - | 0.000000 | 0.000000 | 0.000000 | SMT | No | Hierarchical SMT |
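The scores above are RIBES (Rank-based Intuitive Bilingual Evaluation Score), a word-order-sensitive MT metric built on normalized Kendall's tau between hypothesis and reference word positions. The sketch below is a simplified illustration, not the official RIBES tool: it only aligns words that occur exactly once in both sentences, and the `alpha`/`beta` exponents on unigram precision and the brevity penalty are assumed defaults from the published metric.

```python
import math
from itertools import combinations

def simple_ribes(reference: str, hypothesis: str,
                 alpha: float = 0.25, beta: float = 0.10) -> float:
    """Simplified RIBES: normalized Kendall's tau over word-order ranks,
    scaled by unigram precision**alpha and brevity penalty**beta."""
    ref, hyp = reference.split(), hypothesis.split()
    # Align each hypothesis word to its reference position, keeping only
    # words unique in both sentences (the real metric disambiguates
    # repeated words using surrounding context).
    ranks = [ref.index(w) for w in hyp
             if ref.count(w) == 1 and hyp.count(w) == 1]
    if len(ranks) < 2:
        return 0.0
    # Normalized Kendall's tau: fraction of word pairs kept in order.
    pairs = list(combinations(range(len(ranks)), 2))
    nkt = sum(1 for i, j in pairs if ranks[i] < ranks[j]) / len(pairs)
    precision = len(ranks) / len(hyp)              # unigram precision
    bp = min(1.0, math.exp(1.0 - len(ref) / len(hyp)))  # brevity penalty
    return nkt * (precision ** alpha) * (bp ** beta)
```

An identical hypothesis scores 1.0, a fully reversed one scores 0.0, and partial reorderings fall in between, which is what makes RIBES sensitive to the long-distance reordering common in Hindi-English translation.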