| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | NICT-5 | ALTmy-en | 2018/08/22 18:57:56 | 2056 | 0.579520 | NMT | No | Simple mixed fine-tuning model using the Transformer. |
| 2 | NICT-4 | ALTmy-en | 2018/08/23 10:28:04 | 2068 | 0.589310 | NMT | No | NMT baseline: single system |
| 3 | NICT-4 | ALTmy-en | 2018/08/23 10:29:46 | 2069 | 0.586770 | NMT | No | NMT baseline: ensemble |
| 4 | NICT-4 | ALTmy-en | 2018/08/23 10:39:02 | 2071 | 0.569370 | SMT | Yes | MSLR, with a language model trained on Common Crawl data. |
| 5 | ORGANIZER | ALTmy-en | 2018/08/24 15:29:41 | 2141 | 0.576780 | Other | Yes | Online A |
| 6 | ORGANIZER | ALTmy-en | 2018/09/04 18:38:58 | 2228 | 0.525950 | NMT | No | NMT with attention |
| 7 | NICT | ALTmy-en | 2018/09/12 15:33:34 | 2281 | 0.589020 | NMT | No | Single model |
| 8 | NICT-4 | ALTmy-en | 2018/09/13 14:31:16 | 2290 | 0.582230 | Other | No | Many PBSMT and NMT n-best lists combined and reranked |
| 9 | NICT-4 | ALTmy-en | 2018/09/13 14:33:29 | 2291 | 0.584040 | SMT | No | MSLR models; language models trained on the target side of the parallel data |
| 10 | NICT-4 | ALTmy-en | 2018/09/13 15:36:40 | 2303 | 0.655910 | Other | Yes | Many PBSMT and NMT n-best lists combined and reranked. Monolingual data used for back-translation and language model training. |
| 11 | NICT | ALTmy-en | 2018/09/14 10:13:17 | 2329 | 0.580690 | NMT | No | Ensemble of 4 models |
| 12 | UCSYNLP | ALTmy-en | 2018/09/14 13:22:24 | 2332 | 0.518990 | NMT | No | NMT with attention |
| 13 | UCSMNLP | ALTmy-en | 2018/09/14 15:32:10 | 2338 | 0.354550 | SMT | No | PBSMT |
| 14 | UCSYNLP | ALTmy-en | 2018/09/15 15:25:27 | 2391 | 0.594800 | SMT | No | OSM |
| 15 | UCSYNLP | ALTmy-en | 2018/09/15 15:44:56 | 2393 | 0.560800 | SMT | No | HPBSMT |
| 16 | XMUNLP | ALTmy-en | 2018/09/15 16:42:12 | 2399 | 0.500210 | NMT | No | Single RNNsearch model |
| 17 | Osaka-U | ALTmy-en | 2018/09/15 22:59:06 | 2438 | 0.510900 | NMT | Yes | Rewarding model |
| 18 | XMUNLP | ALTmy-en | 2018/09/16 08:46:40 | 2456 | 0.543700 | NMT | No | Single Transformer model |
| 19 | Osaka-U | ALTmy-en | 2018/09/16 11:56:52 | 2463 | 0.552040 | NMT | No | Mixed fine-tuning |
| 20 | UCSMNLP | ALTmy-en | 2018/10/29 15:28:57 | 2549 | 0.552430 | SMT | No | Batch MIRA tuning |