
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


BLEU is reported separately for each Japanese segmenter applied before evaluation (juman, kytea, mecab). The moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, and kmseg columns are unused for the zh-ja task.

# | Team | Task | Date/Time | DataID | BLEU (juman) | BLEU (kytea) | BLEU (mecab) | Method | Other Resources | System Description
1 | ORGANIZER | zh-ja | 2014/07/11 19:47:27 | 4 | 35.43 | 35.91 | 35.64 | SMT | No | Hierarchical Phrase-based SMT (2014)
2 | ORGANIZER | zh-ja | 2014/07/11 19:54:58 | 8 | 34.65 | 35.16 | 34.77 | SMT | No | Phrase-based SMT
3 | ORGANIZER | zh-ja | 2014/07/11 20:04:10 | 13 | 36.52 | 37.07 | 36.64 | SMT | No | Tree-to-String SMT (2014)
4 | ORGANIZER | zh-ja | 2014/07/18 11:09:12 | 36 | 11.63 | 13.21 | 11.87 | Other | Yes | Online A (2014)
5 | NAIST | zh-ja | 2014/07/31 11:42:31 | 120 | 40.11 | 41.29 | 40.30 | SMT | No | Travatar-based Forest-to-String SMT System
6 | NAIST | zh-ja | 2014/08/01 17:33:01 | 124 | 40.21 | 40.82 | 40.15 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
7 | Kyoto-U | zh-ja | 2014/08/19 09:31:08 | 133 | 33.26 | 35.09 | 33.62 | EBMT | No | Using n-best parses and RNNLM.
8 | Kyoto-U | zh-ja | 2014/08/19 10:21:37 | 135 | 32.68 | 33.30 | 32.45 | EBMT | No | Our baseline system.
9 | EIWA | zh-ja | 2014/08/20 11:52:45 | 137 | 18.69 | 18.33 | 18.32 | RBMT | Yes | RBMT plus user dictionary
10 | EIWA | zh-ja | 2014/08/20 11:56:00 | 138 | 33.53 | 33.74 | 33.87 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE (statistical post editing)
11 | Sense | zh-ja | 2014/08/26 15:17:49 | 200 | 33.66 | 33.86 | 33.46 | SMT | No | Character based SMT
12 | ORGANIZER | zh-ja | 2014/08/28 12:10:13 | 215 | 10.48 | 11.26 | 10.47 | Other | Yes | Online B (2014)
13 | SAS_MT | zh-ja | 2014/08/29 15:33:07 | 232 | 36.58 | 36.22 | 36.10 | SMT | No | Syntactic reordering phrase-based SMT (SAS token tool)
14 | ORGANIZER | zh-ja | 2014/08/29 18:45:03 | 239 | 9.37 | 9.87 | 9.35 | RBMT | No | RBMT A (2014)
15 | ORGANIZER | zh-ja | 2014/08/29 18:48:29 | 242 | 8.39 | 8.70 | 8.30 | RBMT | No | RBMT D
16 | Kyoto-U | zh-ja | 2014/08/31 23:42:41 | 258 | 33.57 | 34.43 | 33.45 | EBMT | No | Our new baseline system after several modifications.
17 | SAS_MT | zh-ja | 2014/09/01 10:38:13 | 263 | 37.42 | 37.65 | 37.07 | SMT | No | Syntactic reordering Hierarchical SMT (using SAS token tool)
18 | Kyoto-U | zh-ja | 2014/09/01 21:33:23 | 268 | 34.75 | 35.89 | 34.83 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
19 | WASUIPS | zh-ja | 2014/09/17 00:43:38 | 369 | 27.66 | 28.09 | 28.20 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0).
20 | WASUIPS | zh-ja | 2014/09/17 00:46:07 | 370 | 30.44 | 30.92 | 30.86 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0).
21 | WASUIPS | zh-ja | 2014/09/17 01:03:57 | 374 | 31.87 | 32.26 | 32.26 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1).
22 | WASUIPS | zh-ja | 2014/09/17 01:05:38 | 375 | 32.19 | 32.55 | 32.54 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1).
23 | WASUIPS | zh-ja | 2014/09/17 10:07:44 | 379 | 27.37 | 28.28 | 27.43 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0).
24 | WASUIPS | zh-ja | 2014/09/17 10:10:47 | 380 | 27.86 | 28.89 | 28.00 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0).
25 | WASUIPS | zh-ja | 2014/09/17 10:24:50 | 383 | 32.08 | 33.09 | 32.18 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1).
26 | WASUIPS | zh-ja | 2014/09/17 10:26:43 | 384 | 32.43 | 33.36 | 32.48 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1).
27 | WASUIPS | zh-ja | 2014/09/17 11:03:46 | 387 | 32.52 | 32.69 | 32.47 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
28 | WASUIPS | zh-ja | 2014/09/17 12:00:46 | 388 | 32.65 | 32.81 | 32.59 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
29 | Kyoto-U | zh-ja | 2015/07/17 09:01:42 | 490 | 35.66 | 36.71 | 35.81 | EBMT | No | WAT2015 baseline
30 | Kyoto-U | zh-ja | 2015/07/17 09:04:22 | 491 | 36.76 | 37.82 | 36.94 | EBMT | No | WAT2015 baseline with reranking
31 | TOSHIBA | zh-ja | 2015/07/23 15:14:53 | 508 | 37.47 | 37.44 | 37.34 | SMT and RBMT | Yes | System combination SMT and RBMT (SPE) with RNNLM language model
32 | TOSHIBA | zh-ja | 2015/07/28 16:27:32 | 525 | 35.85 | 36.02 | 35.73 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
33 | Sense | zh-ja | 2015/07/29 07:20:20 | 533 | 29.29 | 30.52 | 29.45 | SMT | No | Baseline-2015
34 | Kyoto-U | zh-ja | 2015/08/07 13:24:55 | 597 | 37.30 | 38.26 | 37.45 | EBMT | No | Updated JUMAN and added one reordering feature, w/ reranking
35 | TOSHIBA | zh-ja | 2015/08/17 12:11:52 | 669 | 19.24 | 19.48 | 19.12 | RBMT | Yes | RBMT
36 | EHR | zh-ja | 2015/08/19 11:23:36 | 720 | 37.90 | 38.68 | 37.98 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase based SMT with preordering. Candidate selection by language model score.
37 | BJTUNLP | zh-ja | 2015/08/25 14:55:20 | 769 | 34.72 | 34.87 | 34.79 | SMT | No |
38 | ORGANIZER | zh-ja | 2015/08/25 18:58:08 | 776 | 11.53 | 12.82 | 11.68 | Other | Yes | Online A (2015)
39 | NAIST | zh-ja | 2015/08/31 08:23:30 | 834 | 41.75 | 42.95 | 41.93 | SMT | No | Travatar System with NeuralMT Reranking
40 | NAIST | zh-ja | 2015/08/31 08:26:31 | 835 | 39.36 | 40.51 | 39.47 | SMT | No | Travatar System Baseline
41 | Kyoto-U | zh-ja | 2015/08/31 22:38:22 | 844 | 36.30 | 37.22 | 36.44 | EBMT | No | KyotoEBMT system without reranking
42 | Kyoto-U | zh-ja | 2015/08/31 22:39:36 | 845 | 38.53 | 39.41 | 38.66 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
43 | BJTUNLP | zh-ja | 2015/09/01 21:08:10 | 862 | 34.72 | 34.87 | 34.79 | SMT | No | a dependency-to-string model for SMT
44 | EHR | zh-ja | 2015/09/02 17:00:16 | 867 | 39.43 | 39.98 | 39.58 | SMT | No | Phrase based SMT with preordering.
45 | EHR | zh-ja | 2015/09/04 11:44:26 | 868 | 35.59 | 35.56 | 35.37 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE.
46 | ORGANIZER | zh-ja | 2015/09/10 14:00:33 | 879 | 36.52 | 37.07 | 36.64 | SMT | No | Tree-to-String SMT (2015)
47 | ORGANIZER | zh-ja | 2015/09/10 14:30:56 | 885 | 9.37 | 9.87 | 9.35 | Other | Yes | RBMT A (2015)
48 | ORGANIZER | zh-ja | 2015/09/11 10:09:23 | 890 | 10.41 | 11.03 | 10.36 | Other | Yes | Online B (2015)
49 | ORGANIZER | zh-ja | 2016/07/26 11:54:14 | 1043 | 11.56 | 12.87 | 11.69 | Other | Yes | Online A (2016)
50 | EHR | zh-ja | 2016/07/31 17:06:57 | 1063 | 39.75 | 39.85 | 39.40 | SMT | Yes | LM-based merging of outputs of preordered word-based PBSMT (DL=6) and preordered character-based PBSMT (DL=6).
51 | NICT-2 | zh-ja | 2016/08/05 18:05:03 | 1099 | 40.02 | 40.45 | 40.29 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
52 | Kyoto-U | zh-ja | 2016/08/07 18:28:23 | 1110 | 36.63 | 37.54 | 36.70 | EBMT | No | KyotoEBMT 2016 w/o reranking
53 | bjtu_nlp | zh-ja | 2016/08/12 12:50:38 | 1138 | 38.83 | 39.25 | 38.68 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
54 | JAPIO | zh-ja | 2016/08/19 16:44:49 | 1208 | 26.24 | 27.87 | 26.37 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus + rule-based posteditor
55 | UT-KAY | zh-ja | 2016/08/20 07:09:54 | 1220 | 37.63 | 39.07 | 37.82 | NMT | No | An end-to-end NMT with 512 dimensional single-layer LSTMs, UNK replacement, and domain adaptation
56 | UT-KAY | zh-ja | 2016/08/20 07:12:52 | 1221 | 40.50 | 41.81 | 40.67 | NMT | No | Ensemble of our NMT models with and without domain adaptation
57 | Kyoto-U | zh-ja | 2016/08/20 22:48:16 | 1255 | 44.29 | 45.05 | 44.32 | NMT | No | src: 200k tgt: 50k 2-layers self-ensembling
58 | Kyoto-U | zh-ja | 2016/08/20 22:50:33 | 1256 | 46.04 | 46.70 | 46.05 | NMT | No | voc: 30k ensemble of 3 independent models + reverse rescoring
59 | Kyoto-U | zh-ja | 2016/10/11 10:46:03 | 1324 | 46.36 | 47.02 | 46.50 | NMT | No | voc: 32k ensemble of 4 independent models + Chinese short unit
60 | ORGANIZER | zh-ja | 2016/11/16 11:28:00 | 1342 | 18.75 | 20.64 | 19.04 | NMT | Yes | Online A (2016/11/14)
61 | NICT-2 | zh-ja | 2017/07/26 13:58:44 | 1477 | 44.26 | 44.90 | 44.50 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
62 | NICT-2 | zh-ja | 2017/07/26 14:08:45 | 1481 | 46.84 | 47.51 | 47.27 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
63 | Kyoto-U | zh-ja | 2017/07/29 08:02:07 | 1577 | 48.43 | 48.84 | 48.51 | NMT | No | Ensemble of 5, shared BPE 40k
64 | Kyoto-U | zh-ja | 2017/07/31 15:27:21 | 1643 | 46.74 | 47.79 | 46.67 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
65 | Kyoto-U | zh-ja | 2017/08/01 14:14:49 | 1720 | 48.34 | 48.76 | 48.40 | NMT | No | Ensemble of 7, shared BPE, averaged
66 | ORGANIZER | zh-ja | 2017/08/02 09:59:33 | 1740 | 46.87 | 47.30 | 47.00 | NMT | No | Google's "Attention Is All You Need"
67 | ORGANIZER | zh-ja | 2018/08/14 11:33:03 | 1902 | 43.31 | 43.53 | 43.34 | NMT | No | NMT with Attention
68 | NICT-5 | zh-ja | 2018/08/22 18:51:44 | 2052 | 48.43 | 48.78 | 48.52 | NMT | No | Mixed fine tuning by first pretraining on En-Ja ASPEC data and then continuing on the En-Ja + Zh-Ja data. Transformer.
69 | NICT-5 | zh-ja | 2018/08/27 14:40:35 | 2169 | 49.67 | 50.46 | 49.79 | NMT | No | Combining En-Ja corpus with Zh-Ja as a multilingual model. *ADDITIONAL ASPEC CORPUS USED*
70 | NICT-5 | zh-ja | 2018/09/10 14:14:05 | 2267 | 49.79 | 50.66 | 49.89 | NMT | No | MLNMT
71 | TMU | zh-ja | 2018/09/14 17:30:33 | 2343 | 6.21 | 7.02 | 6.27 | NMT | Yes | Unsupervised NMT with sub-character information. Both ASPEC and JPC 4.0 data (zh-ja) were also used as monolingual data in the training.
72 | srcb | zh-ja | 2019/07/25 11:37:44 | 2917 | 49.83 | 50.89 | 50.00 | NMT | No | Transformer (Big) with relative position, layer attention, sentence-wise smoothing.
73 | KNU_Hyundai | zh-ja | 2019/07/27 10:30:04 | 3179 | 50.02 | 50.84 | 50.23 | NMT | No | Transformer (base) with relative position, back-translation, multi-source, R2L reranking, 6-model ensemble. *Used ASPEC ja-en corpus*
74 | srcb | zh-ja | 2019/07/27 15:48:24 | 3210 | 52.37 | 53.58 | 52.57 | NMT | No | Transformer (Big) with relative position, sentence-wise smoothing, deep transformer, back-translation, ensemble of 7 models.
75 | Kyoto-U+ECNU | zh-ja | 2020/09/10 23:58:13 | 3677 | 50.37 | 51.27 | 50.60 | NMT | No | Back-translation using ja monolingual data from ASPEC-JE; LightConv ("Pay Less Attention") single model without ensemble
76 | Kyoto-U+ECNU | zh-ja | 2020/09/17 18:41:34 | 3813 | 52.80 | 53.64 | 52.92 | NMT | Yes | Ensemble of 9 models: structures (LSTM, Transformer, ConvS2S, LightConv), training data (BT, out-of-domain parallel), S2S settings (deeper transformer, deep encoder shallow decoder)
77 | Kyoto-U+ECNU | zh-ja | 2020/09/18 17:38:55 | 3933 | 52.65 | 53.48 | 52.80 | NMT | No | Without out-of-domain parallel data; otherwise same as DataID 3813
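The BLEU column measures n-gram precision against the reference after both system output and reference are segmented with the tool named in the column header. As a rough illustration only (the official scores come from the WAT evaluation server, not this snippet), corpus-level BLEU over pre-segmented text can be computed with sacrebleu; the file names below are placeholders:

```python
# Minimal sketch: corpus-level BLEU on pre-segmented Japanese text.
# Assumption: hypothesis and reference files are one sentence per line and
# were already segmented by the same tool (juman, kytea, or mecab), so
# sacrebleu's internal tokenization is disabled. Not the official WAT pipeline.
import sacrebleu

def corpus_bleu_pretokenized(hyp_path: str, ref_path: str) -> float:
    with open(hyp_path, encoding="utf-8") as f:
        hyps = [line.strip() for line in f]
    with open(ref_path, encoding="utf-8") as f:
        refs = [line.strip() for line in f]
    # tokenize="none": trust the external segmenter's whitespace tokens.
    bleu = sacrebleu.corpus_bleu(hyps, [refs], tokenize="none")
    return bleu.score  # 0-100 scale, as in the table above

if __name__ == "__main__":
    print(corpus_bleu_pretokenized("system.ja.seg", "reference.ja.seg"))
```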


RIBES


RIBES is reported separately for each Japanese segmenter applied before evaluation (juman, kytea, mecab). The moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, and kmseg columns are unused for the zh-ja task.

# | Team | Task | Date/Time | DataID | RIBES (juman) | RIBES (kytea) | RIBES (mecab) | Method | Other Resources | System Description
1 | ORGANIZER | zh-ja | 2014/07/11 19:47:27 | 4 | 0.810406 | 0.798726 | 0.807665 | SMT | No | Hierarchical Phrase-based SMT (2014)
2 | ORGANIZER | zh-ja | 2014/07/11 19:54:58 | 8 | 0.772498 | 0.766384 | 0.771005 | SMT | No | Phrase-based SMT
3 | ORGANIZER | zh-ja | 2014/07/11 20:04:10 | 13 | 0.825292 | 0.820490 | 0.825025 | SMT | No | Tree-to-String SMT (2014)
4 | ORGANIZER | zh-ja | 2014/07/18 11:09:12 | 36 | 0.595925 | 0.598172 | 0.598573 | Other | Yes | Online A (2014)
5 | NAIST | zh-ja | 2014/07/31 11:42:31 | 120 | 0.842477 | 0.834824 | 0.842235 | SMT | No | Travatar-based Forest-to-String SMT System
6 | NAIST | zh-ja | 2014/08/01 17:33:01 | 124 | 0.845486 | 0.838092 | 0.845625 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
7 | Kyoto-U | zh-ja | 2014/08/19 09:31:08 | 133 | 0.791680 | 0.787105 | 0.791269 | EBMT | No | Using n-best parses and RNNLM.
8 | Kyoto-U | zh-ja | 2014/08/19 10:21:37 | 135 | 0.786229 | 0.783016 | 0.786352 | EBMT | No | Our baseline system.
9 | EIWA | zh-ja | 2014/08/20 11:52:45 | 137 | 0.740183 | 0.720281 | 0.732466 | RBMT | Yes | RBMT plus user dictionary
10 | EIWA | zh-ja | 2014/08/20 11:56:00 | 138 | 0.811350 | 0.800506 | 0.808504 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE (statistical post editing)
11 | Sense | zh-ja | 2014/08/26 15:17:49 | 200 | 0.789495 | 0.774338 | 0.784012 | SMT | No | Character based SMT
12 | ORGANIZER | zh-ja | 2014/08/28 12:10:13 | 215 | 0.600733 | 0.596006 | 0.600706 | Other | Yes | Online B (2014)
13 | SAS_MT | zh-ja | 2014/08/29 15:33:07 | 232 | 0.822180 | 0.807535 | 0.817368 | SMT | No | Syntactic reordering phrase-based SMT (SAS token tool)
14 | ORGANIZER | zh-ja | 2014/08/29 18:45:03 | 239 | 0.666277 | 0.652402 | 0.661730 | RBMT | No | RBMT A (2014)
15 | ORGANIZER | zh-ja | 2014/08/29 18:48:29 | 242 | 0.641189 | 0.626400 | 0.633319 | RBMT | No | RBMT D
16 | Kyoto-U | zh-ja | 2014/08/31 23:42:41 | 258 | 0.800949 | 0.795390 | 0.800986 | EBMT | No | Our new baseline system after several modifications.
17 | SAS_MT | zh-ja | 2014/09/01 10:38:13 | 263 | 0.834170 | 0.825551 | 0.833048 | SMT | No | Syntactic reordering Hierarchical SMT (using SAS token tool)
18 | Kyoto-U | zh-ja | 2014/09/01 21:33:23 | 268 | 0.802629 | 0.798631 | 0.802930 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
19 | WASUIPS | zh-ja | 2014/09/17 00:43:38 | 369 | 0.779183 | 0.762949 | 0.770846 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0).
20 | WASUIPS | zh-ja | 2014/09/17 00:46:07 | 370 | 0.789824 | 0.773142 | 0.781475 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0).
21 | WASUIPS | zh-ja | 2014/09/17 01:03:57 | 374 | 0.794303 | 0.777876 | 0.786422 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1).
22 | WASUIPS | zh-ja | 2014/09/17 01:05:38 | 375 | 0.795838 | 0.780027 | 0.787591 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1).
23 | WASUIPS | zh-ja | 2014/09/17 10:07:44 | 379 | 0.774423 | 0.753749 | 0.767073 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0).
24 | WASUIPS | zh-ja | 2014/09/17 10:10:47 | 380 | 0.776550 | 0.756721 | 0.769409 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0).
25 | WASUIPS | zh-ja | 2014/09/17 10:24:50 | 383 | 0.793230 | 0.775168 | 0.787665 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1).
26 | WASUIPS | zh-ja | 2014/09/17 10:26:43 | 384 | 0.796220 | 0.778075 | 0.789657 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1).
27 | WASUIPS | zh-ja | 2014/09/17 11:03:46 | 387 | 0.796059 | 0.780402 | 0.790107 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
28 | WASUIPS | zh-ja | 2014/09/17 12:00:46 | 388 | 0.796777 | 0.781733 | 0.791219 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
29 | Kyoto-U | zh-ja | 2015/07/17 09:01:42 | 490 | 0.809395 | 0.803780 | 0.808692 | EBMT | No | WAT2015 baseline
30 | Kyoto-U | zh-ja | 2015/07/17 09:04:22 | 491 | 0.818445 | 0.812910 | 0.817522 | EBMT | No | WAT2015 baseline with reranking
31 | TOSHIBA | zh-ja | 2015/07/23 15:14:53 | 508 | 0.827291 | 0.817395 | 0.825472 | SMT and RBMT | Yes | System combination SMT and RBMT (SPE) with RNNLM language model
32 | TOSHIBA | zh-ja | 2015/07/28 16:27:32 | 525 | 0.824740 | 0.815388 | 0.822423 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
33 | Sense | zh-ja | 2015/07/29 07:20:20 | 533 | 0.774692 | 0.764847 | 0.772410 | SMT | No | Baseline-2015
34 | Kyoto-U | zh-ja | 2015/08/07 13:24:55 | 597 | 0.822672 | 0.817037 | 0.822340 | EBMT | No | Updated JUMAN and added one reordering feature, w/ reranking
35 | TOSHIBA | zh-ja | 2015/08/17 12:11:52 | 669 | 0.741665 | 0.727155 | 0.738298 | RBMT | Yes | RBMT
36 | EHR | zh-ja | 2015/08/19 11:23:36 | 720 | 0.826003 | 0.818620 | 0.824806 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase based SMT with preordering. Candidate selection by language model score.
37 | BJTUNLP | zh-ja | 2015/08/25 14:55:20 | 769 | 0.807012 | 0.792488 | 0.802430 | SMT | No |
38 | ORGANIZER | zh-ja | 2015/08/25 18:58:08 | 776 | 0.588285 | 0.590393 | 0.592887 | Other | Yes | Online A (2015)
39 | NAIST | zh-ja | 2015/08/31 08:23:30 | 834 | 0.855089 | 0.847746 | 0.854587 | SMT | No | Travatar System with NeuralMT Reranking
40 | NAIST | zh-ja | 2015/08/31 08:26:31 | 835 | 0.834388 | 0.827148 | 0.834130 | SMT | No | Travatar System Baseline
41 | Kyoto-U | zh-ja | 2015/08/31 22:38:22 | 844 | 0.819743 | 0.814581 | 0.818794 | EBMT | No | KyotoEBMT system without reranking
42 | Kyoto-U | zh-ja | 2015/08/31 22:39:36 | 845 | 0.840681 | 0.834451 | 0.839063 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
43 | BJTUNLP | zh-ja | 2015/09/01 21:08:10 | 862 | 0.807012 | 0.792488 | 0.802430 | SMT | No | a dependency-to-string model for SMT
44 | EHR | zh-ja | 2015/09/02 17:00:16 | 867 | 0.837678 | 0.831682 | 0.837227 | SMT | No | Phrase based SMT with preordering.
45 | EHR | zh-ja | 2015/09/04 11:44:26 | 868 | 0.815842 | 0.806726 | 0.813996 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE.
46 | ORGANIZER | zh-ja | 2015/09/10 14:00:33 | 879 | 0.825292 | 0.820490 | 0.825025 | SMT | No | Tree-to-String SMT (2015)
47 | ORGANIZER | zh-ja | 2015/09/10 14:30:56 | 885 | 0.666277 | 0.652402 | 0.661730 | Other | Yes | RBMT A (2015)
48 | ORGANIZER | zh-ja | 2015/09/11 10:09:23 | 890 | 0.597355 | 0.592841 | 0.597298 | Other | Yes | Online B (2015)
49 | ORGANIZER | zh-ja | 2016/07/26 11:54:14 | 1043 | 0.589802 | 0.589397 | 0.593361 | Other | Yes | Online A (2016)
50 | EHR | zh-ja | 2016/07/31 17:06:57 | 1063 | 0.843723 | 0.836156 | 0.841952 | SMT | Yes | LM-based merging of outputs of preordered word-based PBSMT (DL=6) and preordered character-based PBSMT (DL=6).
51 | NICT-2 | zh-ja | 2016/08/05 18:05:03 | 1099 | 0.843941 | 0.837707 | 0.842513 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
52 | Kyoto-U | zh-ja | 2016/08/07 18:28:23 | 1110 | 0.820259 | 0.814661 | 0.819963 | EBMT | No | KyotoEBMT 2016 w/o reranking
53 | bjtu_nlp | zh-ja | 2016/08/12 12:50:38 | 1138 | 0.852818 | 0.846301 | 0.852298 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
54 | JAPIO | zh-ja | 2016/08/19 16:44:49 | 1208 | 0.790553 | 0.780637 | 0.785917 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus + rule-based posteditor
55 | UT-KAY | zh-ja | 2016/08/20 07:09:54 | 1220 | 0.847407 | 0.842055 | 0.848040 | NMT | No | An end-to-end NMT with 512 dimensional single-layer LSTMs, UNK replacement, and domain adaptation
56 | UT-KAY | zh-ja | 2016/08/20 07:12:52 | 1221 | 0.860214 | 0.854690 | 0.860449 | NMT | No | Ensemble of our NMT models with and without domain adaptation
57 | Kyoto-U | zh-ja | 2016/08/20 22:48:16 | 1255 | 0.869360 | 0.864748 | 0.869913 | NMT | No | src: 200k tgt: 50k 2-layers self-ensembling
58 | Kyoto-U | zh-ja | 2016/08/20 22:50:33 | 1256 | 0.876531 | 0.872904 | 0.876946 | NMT | No | voc: 30k ensemble of 3 independent models + reverse rescoring
59 | Kyoto-U | zh-ja | 2016/10/11 10:46:03 | 1324 | 0.875279 | 0.870175 | 0.875564 | NMT | No | voc: 32k ensemble of 4 independent models + Chinese short unit
60 | ORGANIZER | zh-ja | 2016/11/16 11:28:00 | 1342 | 0.719022 | 0.717173 | 0.720095 | NMT | Yes | Online A (2016/11/14)
61 | NICT-2 | zh-ja | 2017/07/26 13:58:44 | 1477 | 0.871438 | 0.868359 | 0.871736 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
62 | NICT-2 | zh-ja | 2017/07/26 14:08:45 | 1481 | 0.882356 | 0.878580 | 0.882195 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
63 | Kyoto-U | zh-ja | 2017/07/29 08:02:07 | 1577 | 0.883457 | 0.878964 | 0.884137 | NMT | No | Ensemble of 5, shared BPE 40k
64 | Kyoto-U | zh-ja | 2017/07/31 15:27:21 | 1643 | 0.878008 | 0.872944 | 0.878627 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
65 | Kyoto-U | zh-ja | 2017/08/01 14:14:49 | 1720 | 0.884210 | 0.880069 | 0.884745 | NMT | No | Ensemble of 7, shared BPE, averaged
66 | ORGANIZER | zh-ja | 2017/08/02 09:59:33 | 1740 | 0.880815 | 0.875511 | 0.880368 | NMT | No | Google's "Attention Is All You Need"
67 | ORGANIZER | zh-ja | 2018/08/14 11:33:03 | 1902 | 0.870734 | 0.866281 | 0.870886 | NMT | No | NMT with Attention
68 | NICT-5 | zh-ja | 2018/08/22 18:51:44 | 2052 | 0.884426 | 0.879456 | 0.884782 | NMT | No | Mixed fine tuning by first pretraining on En-Ja ASPEC data and then continuing on the En-Ja + Zh-Ja data. Transformer.
69 | NICT-5 | zh-ja | 2018/08/27 14:40:35 | 2169 | 0.886161 | 0.882989 | 0.886367 | NMT | No | Combining En-Ja corpus with Zh-Ja as a multilingual model. *ADDITIONAL ASPEC CORPUS USED*
70 | NICT-5 | zh-ja | 2018/09/10 14:14:05 | 2267 | 0.889674 | 0.886490 | 0.889853 | NMT | No | MLNMT
71 | TMU | zh-ja | 2018/09/14 17:30:33 | 2343 | 0.570040 | 0.562331 | 0.565795 | NMT | Yes | Unsupervised NMT with sub-character information. Both ASPEC and JPC 4.0 data (zh-ja) were also used as monolingual data in the training.
72 | srcb | zh-ja | 2019/07/25 11:37:44 | 2917 | 0.886069 | 0.882712 | 0.886854 | NMT | No | Transformer (Big) with relative position, layer attention, sentence-wise smoothing.
73 | KNU_Hyundai | zh-ja | 2019/07/27 10:30:04 | 3179 | 0.888530 | 0.886248 | 0.888706 | NMT | No | Transformer (base) with relative position, back-translation, multi-source, R2L reranking, 6-model ensemble. *Used ASPEC ja-en corpus*
74 | srcb | zh-ja | 2019/07/27 15:48:24 | 3210 | 0.895231 | 0.891872 | 0.895663 | NMT | No | Transformer (Big) with relative position, sentence-wise smoothing, deep transformer, back-translation, ensemble of 7 models.
75 | Kyoto-U+ECNU | zh-ja | 2020/09/10 23:58:13 | 3677 | 0.889269 | 0.885500 | 0.889503 | NMT | No | Back-translation using ja monolingual data from ASPEC-JE; LightConv ("Pay Less Attention") single model without ensemble
76 | Kyoto-U+ECNU | zh-ja | 2020/09/17 18:41:34 | 3813 | 0.897053 | 0.894441 | 0.897199 | NMT | Yes | Ensemble of 9 models: structures (LSTM, Transformer, ConvS2S, LightConv), training data (BT, out-of-domain parallel), S2S settings (deeper transformer, deep encoder shallow decoder)
77 | Kyoto-U+ECNU | zh-ja | 2020/09/18 17:38:55 | 3933 | 0.896551 | 0.894073 | 0.896743 | NMT | No | Without out-of-domain parallel data; otherwise same as DataID 3813
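RIBES (Rank-based Intuitive Bilingual Evaluation Score, Isozaki et al., 2010) rewards correct word order: it combines a normalized Kendall's tau over the reference positions of matched words with unigram-precision and brevity-penalty corrections. The sketch below is a deliberately simplified sentence-level version that aligns only words occurring exactly once in both sentences; the official RIBES tool also disambiguates repeated words via n-gram context, so its scores will differ:

```python
# Simplified sentence-level RIBES sketch:
#   score = NKT * precision^alpha * BP^beta
# alpha=0.25, beta=0.10 follow the defaults of the official RIBES tool.
# Illustration only; not the official implementation.
import math

def ribes_sketch(hyp: list[str], ref: list[str],
                 alpha: float = 0.25, beta: float = 0.10) -> float:
    # One-to-one alignment of words unique to both sides (a simplification).
    ref_pos = {w: i for i, w in enumerate(ref) if ref.count(w) == 1}
    ranks = [ref_pos[w] for w in hyp if hyp.count(w) == 1 and w in ref_pos]
    n = len(ranks)
    if n < 2:
        return 0.0
    # Kendall's tau over the matched reference positions (no ties possible).
    concordant = sum(1 for i in range(n) for j in range(i + 1, n)
                     if ranks[i] < ranks[j])
    tau = 2.0 * concordant / (n * (n - 1) / 2) - 1.0
    nkt = (tau + 1.0) / 2.0                              # normalized Kendall's tau
    precision = n / len(hyp)                             # unigram precision proxy
    bp = min(1.0, math.exp(1.0 - len(ref) / len(hyp)))   # brevity penalty
    return nkt * (precision ** alpha) * (bp ** beta)
```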


AMFM


The AMFM score does not depend on the target-side segmenter, so a single value is reported per entry.

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | ORGANIZER | zh-ja | 2016/07/26 11:54:14 | 1043 | 0.659540 | Other | Yes | Online A (2016)
2 | EHR | zh-ja | 2016/07/31 17:06:57 | 1063 | 0.769490 | SMT | Yes | LM-based merging of outputs of preordered word-based PBSMT (DL=6) and preordered character-based PBSMT (DL=6).
3 | NICT-2 | zh-ja | 2016/08/05 18:05:03 | 1099 | 0.768580 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
4 | Kyoto-U | zh-ja | 2016/08/07 18:28:23 | 1110 | 0.767120 | EBMT | No | KyotoEBMT 2016 w/o reranking
5 | bjtu_nlp | zh-ja | 2016/08/12 12:50:38 | 1138 | 0.760840 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
6 | JAPIO | zh-ja | 2016/08/19 16:44:49 | 1208 | 0.696770 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus + rule-based posteditor
7 | UT-KAY | zh-ja | 2016/08/20 07:09:54 | 1220 | 0.753820 | NMT | No | An end-to-end NMT with 512 dimensional single-layer LSTMs, UNK replacement, and domain adaptation
8 | UT-KAY | zh-ja | 2016/08/20 07:12:52 | 1221 | 0.765530 | NMT | No | Ensemble of our NMT models with and without domain adaptation
9 | Kyoto-U | zh-ja | 2016/08/20 22:48:16 | 1255 | 0.784380 | NMT | No | src: 200k tgt: 50k 2-layers self-ensembling
10 | Kyoto-U | zh-ja | 2016/08/20 22:50:33 | 1256 | 0.785910 | NMT | No | voc: 30k ensemble of 3 independent models + reverse rescoring
11 | Kyoto-U | zh-ja | 2016/10/11 10:46:03 | 1324 | 0.787930 | NMT | No | voc: 32k ensemble of 4 independent models + Chinese short unit
12 | ORGANIZER | zh-ja | 2016/11/16 11:28:00 | 1342 | 0.692820 | NMT | Yes | Online A (2016/11/14)
13 | NICT-2 | zh-ja | 2017/07/26 13:58:44 | 1477 | 0.788940 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder
14 | NICT-2 | zh-ja | 2017/07/26 14:08:45 | 1481 | 0.799680 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
15 | Kyoto-U | zh-ja | 2017/07/29 08:02:07 | 1577 | 0.799520 | NMT | No | Ensemble of 5, shared BPE 40k
16 | Kyoto-U | zh-ja | 2017/07/31 15:27:21 | 1643 | 0.793410 | NMT | No | KW replacement without KW in the test set, BPE, 6 ensemble
17 | Kyoto-U | zh-ja | 2017/08/01 14:14:49 | 1720 | 0.799840 | NMT | No | Ensemble of 7, shared BPE, averaged
18 | ORGANIZER | zh-ja | 2017/08/02 09:59:33 | 1740 | 0.798110 | NMT | No | Google's "Attention Is All You Need"
19 | ORGANIZER | zh-ja | 2018/08/14 11:33:03 | 1902 | 0.782100 | NMT | No | NMT with Attention
20 | NICT-5 | zh-ja | 2018/08/22 18:51:44 | 2052 | 0.800670 | NMT | No | Mixed fine tuning by first pretraining on En-Ja ASPEC data and then continuing on the En-Ja + Zh-Ja data. Transformer.
21 | NICT-5 | zh-ja | 2018/08/27 14:40:35 | 2169 | 0.805750 | NMT | No | Combining En-Ja corpus with Zh-Ja as a multilingual model. *ADDITIONAL ASPEC CORPUS USED*
22 | NICT-5 | zh-ja | 2018/09/10 14:14:05 | 2267 | 0.804920 | NMT | No | MLNMT
23 | TMU | zh-ja | 2018/09/14 17:30:33 | 2343 | 0.512430 | NMT | Yes | Unsupervised NMT with sub-character information. Both ASPEC and JPC 4.0 data (zh-ja) were also used as monolingual data in the training.
24 | srcb | zh-ja | 2019/07/25 11:37:44 | 2917 | 0.812450 | NMT | No | Transformer (Big) with relative position, layer attention, sentence-wise smoothing.
25 | KNU_Hyundai | zh-ja | 2019/07/27 10:30:04 | 3179 | 0.809990 | NMT | No | Transformer (base) with relative position, back-translation, multi-source, R2L reranking, 6-model ensemble. *Used ASPEC ja-en corpus*
26 | srcb | zh-ja | 2019/07/27 15:48:24 | 3210 | 0.819020 | NMT | No | Transformer (Big) with relative position, sentence-wise smoothing, deep transformer, back-translation, ensemble of 7 models.
27 | Kyoto-U+ECNU | zh-ja | 2020/09/10 23:58:13 | 3677 | 0.817610 | NMT | No | Back-translation using ja monolingual data from ASPEC-JE; LightConv ("Pay Less Attention") single model without ensemble
28 | Kyoto-U+ECNU | zh-ja | 2020/09/17 18:41:34 | 3813 | 0.823390 | NMT | Yes | Ensemble of 9 models: structures (LSTM, Transformer, ConvS2S, LightConv), training data (BT, out-of-domain parallel), S2S settings (deeper transformer, deep encoder shallow decoder)
29 | Kyoto-U+ECNU | zh-ja | 2020/09/18 17:38:55 | 3933 | 0.821660 | NMT | No | Without out-of-domain parallel data; otherwise same as DataID 3813
30 | ORGANIZER | zh-ja | 2014/07/11 19:47:27 | 4 | 0.750950 | SMT | No | Hierarchical Phrase-based SMT (2014)
31 | ORGANIZER | zh-ja | 2014/07/11 19:54:58 | 8 | 0.753010 | SMT | No | Phrase-based SMT
32 | ORGANIZER | zh-ja | 2014/07/11 20:04:10 | 13 | 0.754870 | SMT | No | Tree-to-String SMT (2014)
33 | ORGANIZER | zh-ja | 2014/07/18 11:09:12 | 36 | 0.658060 | Other | Yes | Online A (2014)
34 | NAIST | zh-ja | 2014/07/31 11:42:31 | 120 | 0.768190 | SMT | No | Travatar-based Forest-to-String SMT System
35 | NAIST | zh-ja | 2014/08/01 17:33:01 | 124 | 0.766270 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
36 | Kyoto-U | zh-ja | 2014/08/19 09:31:08 | 133 | 0.750310 | EBMT | No | Using n-best parses and RNNLM.
37 | Kyoto-U | zh-ja | 2014/08/19 10:21:37 | 135 | 0.748200 | EBMT | No | Our baseline system.
38 | EIWA | zh-ja | 2014/08/20 11:52:45 | 137 | 0.613730 | RBMT | Yes | RBMT plus user dictionary
39 | EIWA | zh-ja | 2014/08/20 11:56:00 | 138 | 0.693330 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE (statistical post editing)
40 | Sense | zh-ja | 2014/08/26 15:17:49 | 200 | 0.752890 | SMT | No | Character based SMT
41 | ORGANIZER | zh-ja | 2014/08/28 12:10:13 | 215 | 0.636930 | Other | Yes | Online B (2014)
42 | SAS_MT | zh-ja | 2014/08/29 15:33:07 | 232 | 0.752170 | SMT | No | Syntactic reordering phrase-based SMT (SAS token tool)
43 | ORGANIZER | zh-ja | 2014/08/29 18:45:03 | 239 | 0.626070 | RBMT | No | RBMT A (2014)
44 | ORGANIZER | zh-ja | 2014/08/29 18:48:29 | 242 | 0.586790 | RBMT | No | RBMT D
45 | Kyoto-U | zh-ja | 2014/08/31 23:42:41 | 258 | 0.750370 | EBMT | No | Our new baseline system after several modifications.
46 | SAS_MT | zh-ja | 2014/09/01 10:38:13 | 263 | 0.765730 | SMT | No | Syntactic reordering Hierarchical SMT (using SAS token tool)
47 | Kyoto-U | zh-ja | 2014/09/01 21:33:23 | 268 | 0.757610 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
48 | WASUIPS | zh-ja | 2014/09/17 00:43:38 | 369 | 0.711650 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 1.0).
49 | WASUIPS | zh-ja | 2014/09/17 00:46:07 | 370 | 0.734620 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 1.0).
50 | WASUIPS | zh-ja | 2014/09/17 01:03:57 | 374 | 0.740650 | SMT | No | Our baseline system (segmentation tools: urheen and mecab, moses: 2.1.1).
51 | WASUIPS | zh-ja | 2014/09/17 01:05:38 | 375 | 0.740640 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: urheen and mecab, moses: 2.1.1).
52 | WASUIPS | zh-ja | 2014/09/17 10:07:44 | 379 | 0.725360 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 1.0).
53 | WASUIPS | zh-ja | 2014/09/17 10:10:47 | 380 | 0.725250 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 1.0).
54 | WASUIPS | zh-ja | 2014/09/17 10:24:50 | 383 | 0.753750 | SMT | No | Our baseline system (segmentation tools: kytea, moses: 2.1.1).
55 | WASUIPS | zh-ja | 2014/09/17 10:26:43 | 384 | 0.753690 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: kytea, moses: 2.1.1).
56 | WASUIPS | zh-ja | 2014/09/17 11:03:46 | 387 | 0.743140 | SMT | No | Our baseline system (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
57 | WASUIPS | zh-ja | 2014/09/17 12:00:46 | 388 | 0.744040 | SMT | Yes | Our baseline system + additional quasi-parallel corpus (segmentation tools: stanford-ctb and juman, moses: 2.1.1).
58 | Kyoto-U | zh-ja | 2015/07/17 09:01:42 | 490 | 0.757070 | EBMT | No | WAT2015 baseline
59 | Kyoto-U | zh-ja | 2015/07/17 09:04:22 | 491 | 0.762180 | EBMT | No | WAT2015 baseline with reranking
60 | TOSHIBA | zh-ja | 2015/07/23 15:14:53 | 508 | 0.752830 | SMT and RBMT | Yes | System combination SMT and RBMT (SPE) with RNNLM language model
61 | TOSHIBA | zh-ja | 2015/07/28 16:27:32 | 525 | 0.758110 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
62 | Sense | zh-ja | 2015/07/29 07:20:20 | 533 | 0.733190 | SMT | No | Baseline-2015
63 | Kyoto-U | zh-ja | 2015/08/07 13:24:55 | 597 | 0.762430 | EBMT | No | Updated JUMAN and added one reordering feature, w/ reranking
64 | TOSHIBA | zh-ja | 2015/08/17 12:11:52 | 669 | 0.654080 | RBMT | Yes | RBMT
65 | EHR | zh-ja | 2015/08/19 11:23:36 | 720 | 0.765050 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase based SMT with preordering. Candidate selection by language model score.
66 | BJTUNLP | zh-ja | 2015/08/25 14:55:20 | 769 | 0.744130 | SMT | No |
67 | ORGANIZER | zh-ja | 2015/08/25 18:58:08 | 776 | 0.649860 | Other | Yes | Online A (2015)
68 | NAIST | zh-ja | 2015/08/31 08:23:30 | 834 | 0.771010 | SMT | No | Travatar System with NeuralMT Reranking
69 | NAIST | zh-ja | 2015/08/31 08:26:31 | 835 | 0.764830 | SMT | No | Travatar System Baseline
70 | Kyoto-U | zh-ja | 2015/08/31 22:38:22 | 844 | 0.761960 | EBMT | No | KyotoEBMT system without reranking
71 | Kyoto-U | zh-ja | 2015/08/31 22:39:36 | 845 | 0.769700 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
72 | BJTUNLP | zh-ja | 2015/09/01 21:08:10 | 862 | 0.744130 | SMT | No | a dependency-to-string model for SMT
73 | EHR | zh-ja | 2015/09/02 17:00:16 | 867 | 0.707310 | SMT | No | Phrase based SMT with preordering.
74 | EHR | zh-ja | 2015/09/04 11:44:26 | 868 | 0.754180 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE.
75 | ORGANIZER | zh-ja | 2015/09/10 14:00:33 | 879 | 0.754870 | SMT | No | Tree-to-String SMT (2015)
76 | ORGANIZER | zh-ja | 2015/09/10 14:30:56 | 885 | 0.626070 | Other | Yes | RBMT A (2015)
77 | ORGANIZER | zh-ja | 2015/09/11 10:09:23 | 890 | 0.628290 | Other | Yes | Online B (2015)
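AM-FM combines an adequacy model (AM), which compares hypothesis and reference in a latent semantic space, with a fluency model (FM) based on a target-side n-gram language model. The sketch below is a toy version of the AM half only, using an SVD latent space; the FM half and the interpolation weight are omitted, and the training corpus, dimensionality, and library choices are illustrative assumptions rather than the WAT configuration:

```python
# Toy sketch of the adequacy ("AM") component of AM-FM: project hypothesis
# and reference into a latent space learned by truncated SVD over monolingual
# text, then take their cosine similarity. The fluency ("FM") component and
# the combination lambda*AM + (1-lambda)*FM are not shown.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def train_am_space(corpus: list[str], dim: int = 300):
    """Fit a TF-IDF + SVD latent space on monolingual target-side text.
    dim must be smaller than the vocabulary size; 300 is an arbitrary choice."""
    vectorizer = TfidfVectorizer()
    svd = TruncatedSVD(n_components=dim)
    svd.fit(vectorizer.fit_transform(corpus))
    return vectorizer, svd

def am_score(vectorizer, svd, hyp: str, ref: str) -> float:
    """Cosine similarity between hypothesis and reference in the latent space."""
    h, r = svd.transform(vectorizer.transform([hyp, ref]))
    return float(cosine_similarity([h], [r])[0, 0])
```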


HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U+ECNU | zh-ja | 2020/09/17 18:41:34 | 3813 | 4.210 | NMT | Yes | Ensemble of 9 models: structures (LSTM, Transformer, ConvS2S, LightConv), training data (BT, out-of-domain parallel), S2S settings (deeper transformer, deep encoder shallow decoder)
2 | Kyoto-U+ECNU | zh-ja | 2020/09/18 17:38:55 | 3933 | 4.200 | NMT | No | Without out-of-domain parallel data; otherwise same as DataID 3813


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | KNU_Hyundai | zh-ja | 2019/07/27 10:30:04 | 3179 | Underway | NMT | No | Transformer (base) with relative position, back-translation, multi-source, R2L reranking, 6-model ensemble. *Used ASPEC ja-en corpus*
2 | srcb | zh-ja | 2019/07/27 15:48:24 | 3210 | Underway | NMT | No | Transformer (Big) with relative position, sentence-wise smoothing, deep transformer, back-translation, ensemble of 7 models.


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NICT-5 | zh-ja | 2018/09/10 14:14:05 | 2267 | 22.750 | NMT | No | MLNMT
2 | NICT-5 | zh-ja | 2018/08/22 18:51:44 | 2052 | 11.000 | NMT | No | Mixed fine tuning by first pretraining on En-Ja ASPEC data and then continuing on the En-Ja + Zh-Ja data. Transformer.


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | zh-ja | 2017/08/01 14:14:49 | 1720 | 82.750 | NMT | No | Ensemble of 7, shared BPE, averaged
2 | Kyoto-U | zh-ja | 2017/07/29 08:02:07 | 1577 | 79.500 | NMT | No | Ensemble of 5, shared BPE 40k
3 | NICT-2 | zh-ja | 2017/07/26 14:08:45 | 1481 | 79.000 | NMT | No | NMT 6 Ensembles * Bi-directional Reranking
4 | ORGANIZER | zh-ja | 2017/08/02 09:59:33 | 1740 | 78.500 | NMT | No | Google's "Attention Is All You Need"
5 | NICT-2 | zh-ja | 2017/07/26 13:58:44 | 1477 | 78.000 | NMT | No | NMT Single Model: BPE50k, Bi-LSTM(500*2) Encoder, LSTM(1000) Left-to-Right Decoder


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | Kyoto-U | zh-ja | 2016/08/20 22:50:33 | 1256 | 63.750 | NMT | No | voc: 30k ensemble of 3 independent models + reverse rescoring
2 | Kyoto-U | zh-ja | 2016/08/20 22:48:16 | 1255 | 56.000 | NMT | No | src: 200k tgt: 50k 2-layers self-ensembling
3 | bjtu_nlp | zh-ja | 2016/08/12 12:50:38 | 1138 | 49.000 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
4 | UT-KAY | zh-ja | 2016/08/20 07:12:52 | 1221 | 47.250 | NMT | No | Ensemble of our NMT models with and without domain adaptation
5 | UT-KAY | zh-ja | 2016/08/20 07:09:54 | 1220 | 41.000 | NMT | No | An end-to-end NMT with 512 dimensional single-layer LSTMs, UNK replacement, and domain adaptation
6 | NICT-2 | zh-ja | 2016/08/05 18:05:03 | 1099 | 36.500 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
7 | EHR | zh-ja | 2016/07/31 17:06:57 | 1063 | 32.500 | SMT | Yes | LM-based merging of outputs of preordered word-based PBSMT (DL=6) and preordered character-based PBSMT (DL=6).
8 | ORGANIZER | zh-ja | 2016/11/16 11:28:00 | 1342 | 22.500 | NMT | Yes | Online A (2016/11/14)
9 | JAPIO | zh-ja | 2016/08/19 16:44:49 | 1208 | 16.500 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus + rule-based posteditor
10 | ORGANIZER | zh-ja | 2016/07/26 11:54:14 | 1043 | -51.250 | Other | Yes | Online A (2016)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | zh-ja | 2015/08/31 08:23:30 | 834 | 35.750 | SMT | No | Travatar System with NeuralMT Reranking
2 | EHR | zh-ja | 2015/08/19 11:23:36 | 720 | 25.750 | SMT and RBMT | Yes | System combination of RBMT with user dictionary plus SPE and phrase based SMT with preordering. Candidate selection by language model score.
3 | NAIST | zh-ja | 2015/08/31 08:26:31 | 835 | 25.750 | SMT | No | Travatar System Baseline
4 | Kyoto-U | zh-ja | 2015/08/31 22:39:36 | 845 | 18.500 | EBMT | No | KyotoEBMT system with bilingual RNNLM reranking
5 | TOSHIBA | zh-ja | 2015/07/23 15:14:53 | 508 | 18.000 | SMT and RBMT | Yes | System combination SMT and RBMT (SPE) with RNNLM language model
6 | ORGANIZER | zh-ja | 2015/09/10 14:00:33 | 879 | 17.250 | SMT | No | Tree-to-String SMT (2015)
7 | Kyoto-U | zh-ja | 2015/08/31 22:38:22 | 844 | 16.750 | EBMT | No | KyotoEBMT system without reranking
8 | BJTUNLP | zh-ja | 2015/09/01 21:08:10 | 862 | 6.500 | SMT | No | a dependency-to-string model for SMT
9 | TOSHIBA | zh-ja | 2015/07/28 16:27:32 | 525 | -1.000 | SMT and RBMT | Yes | RBMT with SPE (Statistical Post Editing) system
10 | ORGANIZER | zh-ja | 2015/08/25 18:58:08 | 776 | -19.000 | Other | Yes | Online A (2015)
11 | ORGANIZER | zh-ja | 2015/09/10 14:30:56 | 885 | -28.000 | Other | Yes | RBMT A (2015)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | NAIST | zh-ja | 2014/07/31 11:42:31 | 120 | 50.750 | SMT | No | Travatar-based Forest-to-String SMT System
2 | NAIST | zh-ja | 2014/08/01 17:33:01 | 124 | 38.000 | SMT | No | Travatar-based Forest-to-String SMT System (Tuned BLEU+RIBES)
3 | SAS_MT | zh-ja | 2014/09/01 10:38:13 | 263 | 22.500 | SMT | No | Syntactic reordering Hierarchical SMT (using SAS token tool)
4 | ORGANIZER | zh-ja | 2014/07/11 20:04:10 | 13 | 16.000 | SMT | No | Tree-to-String SMT (2014)
5 | EIWA | zh-ja | 2014/08/20 11:56:00 | 138 | 15.000 | SMT and RBMT | Yes | RBMT with user dictionary plus SPE (statistical post editing)
6 | Kyoto-U | zh-ja | 2014/09/01 21:33:23 | 268 | 7.500 | EBMT | No | Our new baseline system after several modifications + 20-best parses, KN7, RNNLM reranking
7 | Kyoto-U | zh-ja | 2014/08/31 23:42:41 | 258 | 6.000 | EBMT | No | Our new baseline system after several modifications.
8 | ORGANIZER | zh-ja | 2014/07/11 19:47:27 | 4 | 4.750 | SMT | No | Hierarchical Phrase-based SMT (2014)
9 | Sense | zh-ja | 2014/08/26 15:17:49 | 200 | -1.000 | SMT | No | Character based SMT
10 | ORGANIZER | zh-ja | 2014/07/18 11:09:12 | 36 | -21.750 | Other | Yes | Online A (2014)
11 | ORGANIZER | zh-ja | 2014/08/29 18:45:03 | 239 | -37.750 | RBMT | No | RBMT A (2014)
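In the earlier campaigns above, the HUMAN column is a pairwise crowdsourcing score on a -100 to +100 scale, while later campaigns report other judgments (the WAT2019 entries are marked "Underway"). Assuming the pairwise formulation described in the WAT overview papers, where each system output is judged win/loss/tie against a baseline translation, the score reduces to simple arithmetic; the counts below are illustrative, not taken from any actual evaluation:

```python
# Sketch of the pairwise HUMAN score, assuming the formulation
#   score = 100 * (wins - losses) / (wins + losses + ties)
# which yields the -100..+100 range seen in the WAT2014-2016 tables.
def pairwise_score(wins: int, losses: int, ties: int) -> float:
    total = wins + losses + ties
    return 100.0 * (wins - losses) / total if total else 0.0

# Example: a system winning 55%, losing 25%, and tying 20% of judgments
# would score +30.0 on this scale.
assert pairwise_score(55, 25, 20) == 30.0
```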


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize the other systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02