| # | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description |
|---|------|------|-----------|--------|------|--------|-----------------|--------------------|
| 1 | NICT-4 | ALTen-my | 2018/09/13 14:51:40 | 2294 | 0.816480 | Other | Yes | Many PBSMT and NMT n-best lists combined and reranked; noisy Wikipedia data used for back-translation and language-model training. |
| 2 | NICT-4 | ALTen-my | 2018/09/13 14:20:19 | 2287 | 0.809750 | Other | No | Many PBSMT and NMT n-best lists combined and reranked. |
| 3 | NICT-4 | ALTen-my | 2018/08/24 11:07:36 | 2087 | 0.803810 | NMT | No | Ensemble of 4 models. |
| 4 | NICT | ALTen-my | 2018/09/14 18:07:14 | 2345 | 0.800230 | NMT | No | 4-model ensemble. |
| 5 | NICT-4 | ALTen-my | 2018/08/24 10:36:46 | 2084 | 0.799000 | NMT | No | Single model. |
| 6 | NICT | ALTen-my | 2018/09/12 15:45:56 | 2282 | 0.785920 | NMT | No | Single model. |
| 7 | Osaka-U | ALTen-my | 2018/09/16 13:08:36 | 2471 | 0.774750 | SMT | No | Preordering with a neural network. |
| 8 | XMUNLP | ALTen-my | 2018/09/16 08:45:01 | 2455 | 0.772120 | NMT | No | Single Transformer model. |
| 9 | UCSYNLP | ALTen-my | 2018/09/14 15:54:51 | 2339 | 0.756710 | NMT | No | Transformer. |
| 10 | UCSYNLP | ALTen-my | 2018/09/14 17:18:59 | 2341 | 0.751180 | SMT | No | OSM. |
| 11 | UCSYNLP | ALTen-my | 2018/09/14 17:52:12 | 2344 | 0.749080 | SMT | No | PBSMT. |
| 12 | XMUNLP | ALTen-my | 2018/09/15 16:40:17 | 2398 | 0.748940 | NMT | No | Single RNNsearch model. |
| 13 | ORGANIZER | ALTen-my | 2018/09/04 18:37:12 | 2227 | 0.745550 | NMT | No | NMT with attention. |
| 14 | Osaka-U | ALTen-my | 2018/09/15 22:57:16 | 2437 | 0.740760 | NMT | Yes | Rewarding model. |
| 15 | kmust88 | ALTen-my | 2018/09/15 00:12:28 | 2360 | 0.721280 | NMT | No | Training the model based on Transformer and do some |
| 16 | UCSYNLP | ALTen-my | 2018/09/14 15:56:00 | 2340 | 0.717480 | NMT | No | NMT with attention. |
| 17 | Osaka-U | ALTen-my | 2018/09/16 11:55:07 | 2462 | 0.665360 | NMT | No | Mixed fine-tuning. |
| 18 | NICT-4 | ALTen-my | 2018/09/13 14:24:47 | 2288 | 0.618380 | SMT | No | With MSLR models; language models were trained on the target side of the parallel data. |
| 19 | UCSMNLP | ALTen-my | 2018/10/29 15:32:37 | 2550 | 0.615240 | SMT | No | Batch MIRA tuning. |
| 20 | ORGANIZER | ALTen-my | 2018/08/24 15:32:01 | 2143 | 0.594230 | Other | Yes | Online A (comma -> 0x104A). |
| 21 | ORGANIZER | ALTen-my | 2018/08/24 15:31:02 | 2142 | 0.587120 | Other | Yes | Online A. |
| 22 | UCSMNLP | ALTen-my | 2018/09/14 15:27:26 | 2337 | 0.222510 | SMT | No | With PBSMT. |