
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


# | Team | Task | Date/Time | DataID | BLEU (unuse / myseg / kmseg) | Method | Other Resources | System Description
(The juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku and indic-tokenizer columns contain no scores for this task and are omitted.)

1 | NICT-4 | ALTen-my | 2018/09/13 14:51:40 | 2294 | 32.30 / 0.00 / 0.00 | Other | Yes | Many PBSMT and NMT n-best lists combined and reranked. Use noisy Wikipedia data for back-translation and language model trainings.
2 | NICT-4 | ALTen-my | 2018/09/13 14:20:19 | 2287 | 30.52 / 0.00 / 0.00 | Other | No | Many PBSMT and NMT n-best lists combined and reranked
3 | NICT | ALTen-my | 2018/09/14 18:07:14 | 2345 | 29.89 / 0.00 / 0.00 | NMT | No | 4 model ensemble
4 | NICT-4 | ALTen-my | 2018/08/24 11:07:36 | 2087 | 29.57 / 0.00 / 0.00 | NMT | No | Ensemble of 4 models
5 | NICT-4 | ALTen-my | 2018/08/24 10:36:46 | 2084 | 27.61 / 0.00 / 0.00 | NMT | No | single model
6 | NICT | ALTen-my | 2018/09/12 15:45:56 | 2282 | 26.02 / 0.00 / 0.00 | NMT | No | Single model
7 | NICT-4 | ALTen-my | 2018/09/13 14:24:47 | 2288 | 23.14 / 0.00 / 0.00 | SMT | No | with MSLR models, language models were trained on the target side of the parallel data
8 | UCSYNLP | ALTen-my | 2018/09/14 17:18:59 | 2341 | 22.78 / 0.00 / 0.00 | SMT | No | OSM
9 | XMUNLP | ALTen-my | 2018/09/15 16:40:17 | 2398 | 22.76 / 0.00 / 0.00 | NMT | No | single rnnsearch model
10 | ORGANIZER | ALTen-my | 2018/09/04 18:37:12 | 2227 | 22.42 / 0.00 / 0.00 | NMT | No | NMT with Attention
11 | UCSYNLP | ALTen-my | 2018/09/14 17:52:12 | 2344 | 22.40 / 0.00 / 0.00 | SMT | No | PBSMT
12 | Osaka-U | ALTen-my | 2018/09/15 22:57:16 | 2437 | 22.33 / 0.00 / 0.00 | NMT | Yes | rewarding model
13 | XMUNLP | ALTen-my | 2018/09/16 08:45:01 | 2455 | 21.57 / 0.00 / 0.00 | NMT | No | single transformer model
14 | UCSYNLP | ALTen-my | 2018/09/14 15:54:51 | 2339 | 21.19 / 0.00 / 0.00 | NMT | No | Transformer
15 | Osaka-U | ALTen-my | 2018/09/16 13:08:36 | 2471 | 20.88 / 0.00 / 0.00 | SMT | No | preordering with neural network
16 | ORGANIZER | ALTen-my | 2018/08/24 15:32:01 | 2143 | 20.83 / 0.00 / 0.00 | Other | Yes | Online A (comma -> 0x104a)
17 | ORGANIZER | ALTen-my | 2018/08/24 15:31:02 | 2142 | 20.31 / 0.00 / 0.00 | Other | Yes | Online A
18 | kmust88 | ALTen-my | 2018/09/15 00:12:28 | 2360 | 19.34 / 0.00 / 0.00 | NMT | No | training the model base on transformer and do some
19 | UCSYNLP | ALTen-my | 2018/09/14 15:56:00 | 2340 | 19.19 / 0.00 / 0.00 | NMT | No | NMT with Attention
20 | UCSMNLP | ALTen-my | 2018/10/29 15:32:37 | 2550 | 19.17 / 0.00 / 0.00 | SMT | No | Batch MIRA tuning
21 | Osaka-U | ALTen-my | 2018/09/16 11:55:07 | 2462 | 9.45 / 0.00 / 0.00 | NMT | No | mixed fine tuning
22 | UCSMNLP | ALTen-my | 2018/09/14 15:27:26 | 2337 | 8.16 / 0.00 / 0.00 | SMT | No | with PBSMT
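The BLEU figures above are corpus-level scores on the 0-100 scale. As a rough illustration of what such a score measures (modified n-gram precision combined with a brevity penalty), here is a minimal self-contained sketch. It is not WAT's official scorer, which differs in tokenization and smoothing details, and the example sentences are invented, not taken from the ALT data.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(reference, hypothesis, max_n=4):
    """Illustrative BLEU for a single sentence pair (single reference)."""
    log_prec = 0.0
    for n in range(1, max_n + 1):
        hyp = ngrams(hypothesis, n)
        ref = ngrams(reference, n)
        # Modified precision: hypothesis n-gram counts are clipped
        # against the reference counts.
        overlap = sum(min(c, ref[g]) for g, c in hyp.items())
        total = max(sum(hyp.values()), 1)
        # Simple +1 smoothing so one missing n-gram order does not zero the score.
        log_prec += math.log((overlap + 1) / (total + 1)) / max_n
    # Brevity penalty: penalize hypotheses shorter than the reference.
    bp = min(1.0, math.exp(1 - len(reference) / len(hypothesis)))
    return bp * math.exp(log_prec)

ref = "the cat sat on the mat".split()
hyp = "the cat is on the mat".split()
print(round(100 * bleu(ref, hyp), 2))  # roughly 48.89 for this toy pair
```

Note the 0.00 entries in the myseg and kmseg columns: BLEU depends heavily on how the target text is segmented, which is why the leaderboard reports separate columns per segmentation scheme.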


RIBES


# | Team | Task | Date/Time | DataID | RIBES (unuse / myseg / kmseg) | Method | Other Resources | System Description
(The juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku and indic-tokenizer columns contain no scores for this task and are omitted.)

1 | NICT-4 | ALTen-my | 2018/09/13 14:51:40 | 2294 | 0.746480 / 0.000000 / 0.000000 | Other | Yes | Many PBSMT and NMT n-best lists combined and reranked. Use noisy Wikipedia data for back-translation and language model trainings.
2 | NICT-4 | ALTen-my | 2018/08/24 11:07:36 | 2087 | 0.738538 / 0.000000 / 0.000000 | NMT | No | Ensemble of 4 models
3 | NICT-4 | ALTen-my | 2018/09/13 14:20:19 | 2287 | 0.733501 / 0.000000 / 0.000000 | Other | No | Many PBSMT and NMT n-best lists combined and reranked
4 | NICT | ALTen-my | 2018/09/14 18:07:14 | 2345 | 0.726922 / 0.000000 / 0.000000 | NMT | No | 4 model ensemble
5 | NICT-4 | ALTen-my | 2018/08/24 10:36:46 | 2084 | 0.725103 / 0.000000 / 0.000000 | NMT | No | single model
6 | NICT | ALTen-my | 2018/09/12 15:45:56 | 2282 | 0.694652 / 0.000000 / 0.000000 | NMT | No | Single model
7 | ORGANIZER | ALTen-my | 2018/08/24 15:32:01 | 2143 | 0.679968 / 0.000000 / 0.000000 | Other | Yes | Online A (comma -> 0x104a)
8 | UCSYNLP | ALTen-my | 2018/09/14 15:54:51 | 2339 | 0.679800 / 0.000000 / 0.000000 | NMT | No | Transformer
9 | ORGANIZER | ALTen-my | 2018/08/24 15:31:02 | 2142 | 0.678360 / 0.000000 / 0.000000 | Other | Yes | Online A
10 | XMUNLP | ALTen-my | 2018/09/15 16:40:17 | 2398 | 0.674455 / 0.000000 / 0.000000 | NMT | No | single rnnsearch model
11 | UCSYNLP | ALTen-my | 2018/09/14 15:56:00 | 2340 | 0.671461 / 0.000000 / 0.000000 | NMT | No | NMT with Attention
12 | Osaka-U | ALTen-my | 2018/09/15 22:57:16 | 2437 | 0.668596 / 0.000000 / 0.000000 | NMT | Yes | rewarding model
13 | XMUNLP | ALTen-my | 2018/09/16 08:45:01 | 2455 | 0.668210 / 0.000000 / 0.000000 | NMT | No | single transformer model
14 | ORGANIZER | ALTen-my | 2018/09/04 18:37:12 | 2227 | 0.667437 / 0.000000 / 0.000000 | NMT | No | NMT with Attention
15 | kmust88 | ALTen-my | 2018/09/15 00:12:28 | 2360 | 0.650796 / 0.000000 / 0.000000 | NMT | No | training the model base on transformer and do some
16 | Osaka-U | ALTen-my | 2018/09/16 13:08:36 | 2471 | 0.639517 / 0.000000 / 0.000000 | SMT | No | preordering with neural network
17 | Osaka-U | ALTen-my | 2018/09/16 11:55:07 | 2462 | 0.581903 / 0.000000 / 0.000000 | NMT | No | mixed fine tuning
18 | NICT-4 | ALTen-my | 2018/09/13 14:24:47 | 2288 | 0.565243 / 0.000000 / 0.000000 | SMT | No | with MSLR models, language models were trained on the target side of the parallel data
19 | UCSMNLP | ALTen-my | 2018/10/29 15:32:37 | 2550 | 0.554023 / 0.000000 / 0.000000 | SMT | No | Batch MIRA tuning
20 | UCSYNLP | ALTen-my | 2018/09/14 17:18:59 | 2341 | 0.549883 / 0.000000 / 0.000000 | SMT | No | OSM
21 | UCSYNLP | ALTen-my | 2018/09/14 17:52:12 | 2344 | 0.544395 / 0.000000 / 0.000000 | SMT | No | PBSMT
22 | UCSMNLP | ALTen-my | 2018/09/14 15:27:26 | 2337 | 0.470758 / 0.000000 / 0.000000 | SMT | No | with PBSMT


AMFM


# | Team | Task | Date/Time | DataID | AMFM (unuse / myseg / kmseg) | Method | Other Resources | System Description
(The juman, kytea, mecab, moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku and indic-tokenizer columns contain no scores for this task and are omitted.)

1 | NICT-4 | ALTen-my | 2018/09/13 14:51:40 | 2294 | 0.816480 / 0.000000 / 0.000000 | Other | Yes | Many PBSMT and NMT n-best lists combined and reranked. Use noisy Wikipedia data for back-translation and language model trainings.
2 | NICT-4 | ALTen-my | 2018/09/13 14:20:19 | 2287 | 0.809750 / 0.000000 / 0.000000 | Other | No | Many PBSMT and NMT n-best lists combined and reranked
3 | NICT-4 | ALTen-my | 2018/08/24 11:07:36 | 2087 | 0.803810 / 0.000000 / 0.000000 | NMT | No | Ensemble of 4 models
4 | NICT | ALTen-my | 2018/09/14 18:07:14 | 2345 | 0.800230 / 0.000000 / 0.000000 | NMT | No | 4 model ensemble
5 | NICT-4 | ALTen-my | 2018/08/24 10:36:46 | 2084 | 0.799000 / 0.000000 / 0.000000 | NMT | No | single model
6 | NICT | ALTen-my | 2018/09/12 15:45:56 | 2282 | 0.785920 / 0.000000 / 0.000000 | NMT | No | Single model
7 | Osaka-U | ALTen-my | 2018/09/16 13:08:36 | 2471 | 0.774750 / 0.000000 / 0.000000 | SMT | No | preordering with neural network
8 | XMUNLP | ALTen-my | 2018/09/16 08:45:01 | 2455 | 0.772120 / 0.000000 / 0.000000 | NMT | No | single transformer model
9 | UCSYNLP | ALTen-my | 2018/09/14 15:54:51 | 2339 | 0.756710 / 0.000000 / 0.000000 | NMT | No | Transformer
10 | UCSYNLP | ALTen-my | 2018/09/14 17:18:59 | 2341 | 0.751180 / 0.000000 / 0.000000 | SMT | No | OSM
11 | UCSYNLP | ALTen-my | 2018/09/14 17:52:12 | 2344 | 0.749080 / 0.000000 / 0.000000 | SMT | No | PBSMT
12 | XMUNLP | ALTen-my | 2018/09/15 16:40:17 | 2398 | 0.748940 / 0.000000 / 0.000000 | NMT | No | single rnnsearch model
13 | ORGANIZER | ALTen-my | 2018/09/04 18:37:12 | 2227 | 0.745550 / 0.000000 / 0.000000 | NMT | No | NMT with Attention
14 | Osaka-U | ALTen-my | 2018/09/15 22:57:16 | 2437 | 0.740760 / 0.000000 / 0.000000 | NMT | Yes | rewarding model
15 | kmust88 | ALTen-my | 2018/09/15 00:12:28 | 2360 | 0.721280 / 0.000000 / 0.000000 | NMT | No | training the model base on transformer and do some
16 | UCSYNLP | ALTen-my | 2018/09/14 15:56:00 | 2340 | 0.717480 / 0.000000 / 0.000000 | NMT | No | NMT with Attention
17 | Osaka-U | ALTen-my | 2018/09/16 11:55:07 | 2462 | 0.665360 / 0.000000 / 0.000000 | NMT | No | mixed fine tuning
18 | NICT-4 | ALTen-my | 2018/09/13 14:24:47 | 2288 | 0.618380 / 0.000000 / 0.000000 | SMT | No | with MSLR models, language models were trained on the target side of the parallel data
19 | UCSMNLP | ALTen-my | 2018/10/29 15:32:37 | 2550 | 0.615240 / 0.000000 / 0.000000 | SMT | No | Batch MIRA tuning
20 | ORGANIZER | ALTen-my | 2018/08/24 15:32:01 | 2143 | 0.594230 / 0.000000 / 0.000000 | Other | Yes | Online A (comma -> 0x104a)
21 | ORGANIZER | ALTen-my | 2018/08/24 15:31:02 | 2142 | 0.587120 / 0.000000 / 0.000000 | Other | Yes | Online A
22 | UCSMNLP | ALTen-my | 2018/09/14 15:27:26 | 2337 | 0.222510 / 0.000000 / 0.000000 | SMT | No | with PBSMT


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions.)


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

1 | NICT | ALTen-my | 2018/09/14 18:07:14 | 2345 | 61.000 | NMT | No | 4 model ensemble
2 | NICT-4 | ALTen-my | 2018/08/24 11:07:36 | 2087 | 53.000 | NMT | No | Ensemble of 4 models
3 | NICT | ALTen-my | 2018/09/12 15:45:56 | 2282 | 42.500 | NMT | No | Single model
4 | NICT-4 | ALTen-my | 2018/09/13 14:20:19 | 2287 | 39.750 | Other | No | Many PBSMT and NMT n-best lists combined and reranked
5 | UCSYNLP | ALTen-my | 2018/09/14 15:54:51 | 2339 | 10.500 | NMT | No | Transformer
6 | kmust88 | ALTen-my | 2018/09/15 00:12:28 | 2360 | 9.750 | NMT | No | training the model base on transformer and do some
7 | Osaka-U | ALTen-my | 2018/09/15 22:57:16 | 2437 | 3.000 | NMT | Yes | rewarding model
8 | UCSYNLP | ALTen-my | 2018/09/14 15:56:00 | 2340 | 0.750 | NMT | No | NMT with Attention
9 | Osaka-U | ALTen-my | 2018/09/16 13:08:36 | 2471 | -23.500 | SMT | No | preordering with neural network
10 | UCSMNLP | ALTen-my | 2018/09/14 15:27:26 | 2337 | -96.750 | SMT | No | with PBSMT
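The HUMAN column holds WAT's pairwise crowdsourced evaluation score, which is why values range from large negatives to positives. As described in the WAT overview papers, each system output is compared against a baseline translation by several judges, and per-sentence wins (W), losses (L) and ties (T) are aggregated roughly as 100 x (W - L) / (W + L + T). The helper below is a hedged illustration of that formula, not the official implementation, which also involves per-sentence majority voting.

```python
def pairwise_score(wins, losses, ties):
    """WAT-style pairwise score in [-100, 100]; 0 means on par with the baseline.

    Hedged reconstruction from the WAT overview papers, for illustration only.
    """
    total = wins + losses + ties
    if total == 0:
        raise ValueError("no judgements to aggregate")
    return 100 * (wins - losses) / total

# A system winning 3 of 4 judgements and losing 1 scores +50.
print(pairwise_score(3, 1, 0))  # 50.0
```

On this reading, a score near -96.750 means the system lost almost every comparison against the baseline, while +61.000 means it won substantially more often than it lost.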


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions.)


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions.)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions.)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description

(No submissions.)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize the other systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02