
WAT

The Workshop on Asian Translation
Evaluation Results


BLEU


Scores are shown for the three Japanese segmenters used in evaluation (juman, kytea, mecab). The remaining tokenizer columns of the submission form (moses-tokenizer, stanford-segmenter-ctb, stanford-segmenter-pku, indic-tokenizer, unuse, myseg, kmseg) are unused for this task and contained only "-" or 0.00, so they are omitted below.

# | Team | Task | Date/Time | DataID | BLEU (juman) | BLEU (kytea) | BLEU (mecab) | Method | Other Resources | System Description
1 | JAPIO | JPCen-ja | 2017/07/25 12:13:33 | 1445 | 55.55 | 56.03 | 55.40 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus including some sentences in testset
2 | JAPIO | JPCen-ja | 2017/07/26 10:16:13 | 1462 | 51.79 | 52.23 | 51.75 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
3 | JAPIO | JPCen-ja | 2016/08/17 11:39:13 | 1157 | 50.28 | 51.08 | 50.53 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset
4 | JAPIO | JPCen-ja | 2017/07/25 18:14:57 | 1454 | 50.27 | 51.23 | 50.17 | NMT | Yes | Combination of 4 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
5 | tpt_wat | JPCen-ja | 2021/04/27 02:26:24 | 5706 | 49.37 | 50.64 | 49.42 | NMT | No | Base Transformer model with joint vocab, size 8k
6 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:06:18 | 3185 | 49.04 | 50.43 | 49.08 | NMT | No | Transformer Base, relative position, BT, r2l reranking, checkpoint ensemble
7 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:51:20 | 3193 | 48.99 | 50.17 | 48.96 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble
8 | Bering Lab | JPCen-ja | 2021/05/04 20:52:15 | 6387 | 48.83 | 50.07 | 48.77 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
9 | JAPIO | JPCen-ja | 2017/07/25 18:10:48 | 1453 | 48.39 | 49.34 | 48.29 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
10 | JAPIO | JPCen-ja | 2016/08/17 11:36:23 | 1156 | 47.79 | 48.57 | 47.92 | SMT | Yes | Phrase-based SMT with Preordering + JPC/JAPIO corpora
11 | EHR | JPCen-ja | 2018/09/16 15:09:19 | 2476 | 48.03 | 49.24 | 47.86 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 20)
12 | EHR | JPCen-ja | 2018/09/15 15:50:14 | 2395 | 48.01 | 49.14 | 47.83 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 18)
13 | sarah | JPCen-ja | 2019/07/26 11:26:05 | 2973 | 47.67 | 49.06 | 47.51 | NMT | No | Transformer, ensemble of 4 models
14 | EHR | JPCen-ja | 2018/09/13 12:50:53 | 2284 | 47.57 | 48.68 | 47.40 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 13)
15 | EHR | JPCen-ja | 2018/09/08 20:31:56 | 2248 | 47.36 | 48.66 | 47.22 | NMT | No | SMT reranked NMT
16 | goku20 | JPCen-ja | 2020/09/22 00:08:01 | 4107 | 46.31 | 48.01 | 46.21 | NMT | No | mBART pre-training transformer, ensemble of 3 models
17 | goku20 | JPCen-ja | 2020/09/21 12:14:57 | 4093 | 46.08 | 47.67 | 45.93 | NMT | No | mBART pre-training transformer, single model
18 | JAPIO | JPCen-ja | 2016/08/15 15:56:40 | 1141 | 45.57 | 46.40 | 45.74 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
19 | EHR | JPCen-ja | 2017/07/19 19:08:34 | 1407 | 44.63 | 45.94 | 44.53 | NMT | No | Simple NMT (sub word based, by OpenNMT)
20 | JAPIO | JPCen-ja | 2017/07/25 18:02:12 | 1451 | 44.69 | 46.01 | 44.53 | NMT | No | OpenNMT(dbrnn)
21 | EHR | JPCen-ja | 2017/07/19 18:48:50 | 1406 | 44.44 | 45.59 | 44.15 | NMT | No | SMT reranked NMT (sub word based, by Moses and OpenNMT)
22 | ORGANIZER | JPCen-ja | 2018/08/15 18:35:10 | 1964 | 43.84 | 45.28 | 43.70 | NMT | No | NMT with Attention
23 | NICT-2 | JPCen-ja | 2016/08/05 17:56:31 | 1098 | 40.90 | 42.51 | 40.66 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
24 | bjtu_nlp | JPCen-ja | 2016/08/08 12:26:18 | 1112 | 39.46 | 41.16 | 39.45 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
25 | u-tkb | JPCen-ja | 2017/07/26 12:46:57 | 1470 | 38.91 | 41.12 | 39.11 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
26 | NICT-2 | JPCen-ja | 2016/08/04 17:23:27 | 1078 | 39.03 | 40.74 | 38.98 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
27 | JAPIO | JPCen-ja | 2016/10/27 13:05:35 | 1330 | 38.59 | 40.00 | 38.65 | SMT | No | Phrase-based SMT with Preordering
28 | ORGANIZER | JPCen-ja | 2016/07/26 10:20:53 | 1036 | 36.88 | 37.89 | 36.83 | Other | Yes | Online A (2016)
29 | Kyoto-U | JPCen-ja | 2016/07/13 17:38:24 | 986 | 36.04 | 38.14 | 36.30 | EBMT | No | Baseline
30 | ORGANIZER | JPCen-ja | 2016/07/13 16:52:07 | 975 | 35.60 | 37.65 | 35.82 | SMT | No | Tree-to-String SMT
31 | ORGANIZER | JPCen-ja | 2016/07/13 16:50:03 | 974 | 34.57 | 36.61 | 34.79 | SMT | No | Hierarchical Phrase-based SMT
32 | ORGANIZER | JPCen-ja | 2016/07/13 16:47:25 | 973 | 32.36 | 34.26 | 32.52 | SMT | No | Phrase-based SMT
33 | ORGANIZER | JPCen-ja | 2016/08/05 14:38:28 | 1086 | 26.64 | 28.48 | 26.84 | Other | Yes | RBMT F (2016)
34 | ORGANIZER | JPCen-ja | 2016/08/05 14:36:45 | 1085 | 23.02 | 24.90 | 23.45 | Other | Yes | RBMT D (2016)
35 | ORGANIZER | JPCen-ja | 2016/08/02 10:26:14 | 1073 | 21.57 | 22.62 | 21.65 | Other | Yes | Online B (2016)
36 | ORGANIZER | JPCen-ja | 2016/08/05 14:41:32 | 1087 | 21.35 | 23.17 | 21.53 | Other | Yes | RBMT E (2016)
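
For Japanese output, WAT computes BLEU after segmenting both the system output and the reference with each of the three morphological analyzers named in the table header (juman, kytea, mecab), which is why every submission has three scores. The snippet below is a minimal sketch of that idea using the sacrebleu library on text that is assumed to be segmented already; the example sentences are illustrative, and this is not the official WAT evaluation script.

```python
# Minimal sketch: BLEU over pre-segmented Japanese text (hypothetical example data).
# hyp_seg / ref_seg are assumed to hold whitespace-separated tokens produced by one of
# the segmenters listed in the table header (JUMAN, KyTea, or MeCab).
import sacrebleu

hyp_seg = ["本 発明 は 半導体 装置 に 関する 。"]    # system output, segmented
ref_seg = ["本 発明 は 、 半導体 装置 に 関する 。"]  # reference, segmented the same way

# tokenize='none' makes sacrebleu trust the existing whitespace segmentation, so the
# score reflects the chosen Japanese segmenter rather than sacrebleu's own tokenizer.
bleu = sacrebleu.corpus_bleu(hyp_seg, [ref_seg], tokenize='none')
print(f"BLEU = {bleu.score:.2f}")
```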


RIBES


As with BLEU, only the juman, kytea, and mecab columns carry scores; the unused tokenizer columns are omitted.

# | Team | Task | Date/Time | DataID | RIBES (juman) | RIBES (kytea) | RIBES (mecab) | Method | Other Resources | System Description
1 | JAPIO | JPCen-ja | 2017/07/25 18:14:57 | 1454 | 0.886403 | 0.883481 | 0.885747 | NMT | Yes | Combination of 4 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
2 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:51:20 | 3193 | 0.881651 | 0.879163 | 0.881114 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble
3 | Bering Lab | JPCen-ja | 2021/05/04 20:52:15 | 6387 | 0.880505 | 0.878333 | 0.880066 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
4 | JAPIO | JPCen-ja | 2017/07/25 18:10:48 | 1453 | 0.880215 | 0.877980 | 0.879319 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
5 | tpt_wat | JPCen-ja | 2021/04/27 02:26:24 | 5706 | 0.879917 | 0.877040 | 0.878975 | NMT | No | Base Transformer model with joint vocab, size 8k
6 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:06:18 | 3185 | 0.878567 | 0.876831 | 0.878260 | NMT | No | Transformer Base, relative position, BT, r2l reranking, checkpoint ensemble
7 | JAPIO | JPCen-ja | 2017/07/25 12:13:33 | 1445 | 0.875667 | 0.873486 | 0.874423 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus including some sentences in testset
8 | EHR | JPCen-ja | 2018/09/15 15:50:14 | 2395 | 0.873426 | 0.870306 | 0.873007 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 18)
9 | EHR | JPCen-ja | 2018/09/16 15:09:19 | 2476 | 0.872828 | 0.870332 | 0.872442 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 20)
10 | sarah | JPCen-ja | 2019/07/26 11:26:05 | 2973 | 0.871185 | 0.870264 | 0.871307 | NMT | No | Transformer, ensemble of 4 models
11 | EHR | JPCen-ja | 2018/09/08 20:31:56 | 2248 | 0.871439 | 0.868538 | 0.871104 | NMT | No | SMT reranked NMT
12 | EHR | JPCen-ja | 2018/09/13 12:50:53 | 2284 | 0.870730 | 0.867947 | 0.870616 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 13)
13 | goku20 | JPCen-ja | 2020/09/22 00:08:01 | 4107 | 0.870328 | 0.868370 | 0.870081 | NMT | No | mBART pre-training transformer, ensemble of 3 models
14 | goku20 | JPCen-ja | 2020/09/21 12:14:57 | 4093 | 0.868699 | 0.866600 | 0.868532 | NMT | No | mBART pre-training transformer, single model
15 | EHR | JPCen-ja | 2017/07/19 19:08:34 | 1407 | 0.866722 | 0.864256 | 0.866205 | NMT | No | Simple NMT (sub word based, by OpenNMT)
16 | JAPIO | JPCen-ja | 2017/07/25 18:02:12 | 1451 | 0.864568 | 0.862781 | 0.864251 | NMT | No | OpenNMT(dbrnn)
17 | JAPIO | JPCen-ja | 2017/07/26 10:16:13 | 1462 | 0.864038 | 0.861596 | 0.862200 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
18 | EHR | JPCen-ja | 2017/07/19 18:48:50 | 1406 | 0.860998 | 0.858466 | 0.860659 | NMT | No | SMT reranked NMT (sub word based, by Moses and OpenNMT)
19 | ORGANIZER | JPCen-ja | 2018/08/15 18:35:10 | 1964 | 0.860702 | 0.857422 | 0.859818 | NMT | No | NMT with Attention
20 | JAPIO | JPCen-ja | 2016/08/17 11:39:13 | 1157 | 0.859957 | 0.857655 | 0.858750 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset
21 | JAPIO | JPCen-ja | 2016/08/17 11:36:23 | 1156 | 0.859139 | 0.856392 | 0.857422 | SMT | Yes | Phrase-based SMT with Preordering + JPC/JAPIO corpora
22 | JAPIO | JPCen-ja | 2016/08/15 15:56:40 | 1141 | 0.851376 | 0.848580 | 0.849513 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
23 | u-tkb | JPCen-ja | 2017/07/26 12:46:57 | 1470 | 0.845815 | 0.846888 | 0.845551 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
24 | bjtu_nlp | JPCen-ja | 2016/08/08 12:26:18 | 1112 | 0.842762 | 0.840148 | 0.842669 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
25 | JAPIO | JPCen-ja | 2016/10/27 13:05:35 | 1330 | 0.839141 | 0.835888 | 0.838096 | SMT | No | Phrase-based SMT with Preordering
26 | NICT-2 | JPCen-ja | 2016/08/05 17:56:31 | 1098 | 0.836556 | 0.832401 | 0.832622 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
27 | NICT-2 | JPCen-ja | 2016/08/04 17:23:27 | 1078 | 0.826228 | 0.823582 | 0.824428 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
28 | Kyoto-U | JPCen-ja | 2016/07/13 17:38:24 | 986 | 0.808999 | 0.807972 | 0.809610 | EBMT | No | Baseline
29 | ORGANIZER | JPCen-ja | 2016/07/13 16:52:07 | 975 | 0.797353 | 0.796783 | 0.798025 | SMT | No | Tree-to-String SMT
30 | ORGANIZER | JPCen-ja | 2016/07/26 10:20:53 | 1036 | 0.798168 | 0.792471 | 0.796308 | Other | Yes | Online A (2016)
31 | ORGANIZER | JPCen-ja | 2016/07/13 16:50:03 | 974 | 0.777759 | 0.778657 | 0.779049 | SMT | No | Hierarchical Phrase-based SMT
32 | ORGANIZER | JPCen-ja | 2016/08/05 14:38:28 | 1086 | 0.773673 | 0.769244 | 0.773344 | Other | Yes | RBMT F (2016)
33 | ORGANIZER | JPCen-ja | 2016/08/05 14:36:45 | 1085 | 0.761224 | 0.757341 | 0.760325 | Other | Yes | RBMT D (2016)
34 | ORGANIZER | JPCen-ja | 2016/08/05 14:41:32 | 1087 | 0.743484 | 0.741985 | 0.742300 | Other | Yes | RBMT E (2016)
35 | ORGANIZER | JPCen-ja | 2016/08/02 10:26:14 | 1073 | 0.743083 | 0.735203 | 0.740962 | Other | Yes | Online B (2016)
36 | ORGANIZER | JPCen-ja | 2016/07/13 16:47:25 | 973 | 0.728539 | 0.728281 | 0.729077 | SMT | No | Phrase-based SMT
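
RIBES (Rank-based Intuitive Bilingual Evaluation Score) rewards correct global word order, which matters for distant pairs such as English to Japanese. The sketch below shows the core computation under simplifying assumptions: it aligns only words that occur exactly once in both sentences (the official script also aligns repeated words using surrounding context), takes the normalized Kendall's tau of the aligned positions, and multiplies it by unigram precision raised to alpha and a brevity penalty raised to beta, with the default weights alpha=0.25 and beta=0.10. It is an illustration, not the official RIBES implementation.

```python
import math

def ribes_sketch(hyp_tokens, ref_tokens, alpha=0.25, beta=0.10):
    """Simplified RIBES: normalized Kendall's tau * precision**alpha * BP**beta."""
    # Align hypothesis words to reference positions (only unambiguous, once-only words here).
    ref_positions = [ref_tokens.index(w) for w in hyp_tokens
                     if hyp_tokens.count(w) == 1 and ref_tokens.count(w) == 1]
    if len(ref_positions) < 2:
        return 0.0
    # Kendall's tau over the aligned reference positions, taken in hypothesis order.
    pairs = [(i, j) for i in range(len(ref_positions)) for j in range(i + 1, len(ref_positions))]
    concordant = sum(1 for i, j in pairs if ref_positions[i] < ref_positions[j])
    nkt = concordant / len(pairs)                        # equals (tau + 1) / 2, in [0, 1]
    precision = len(ref_positions) / len(hyp_tokens)     # unigram precision of aligned words
    bp = min(1.0, math.exp(1.0 - len(ref_tokens) / len(hyp_tokens)))  # brevity penalty
    return nkt * (precision ** alpha) * (bp ** beta)

# Hypothetical sentences, segmented with the same tokenizer on both sides:
print(ribes_sketch("本 発明 は 半導体 装置 に 関する".split(),
                   "本 発明 は 半導体 記憶 装置 に 関する".split()))
```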


AMFM


The original page repeated each AMFM score in three identical columns; a single value is shown here, and the unused tokenizer columns are omitted.

# | Team | Task | Date/Time | DataID | AMFM | Method | Other Resources | System Description
1 | tpt_wat | JPCen-ja | 2021/04/27 02:26:24 | 5706 | 0.885762 | NMT | No | Base Transformer model with joint vocab, size 8k
2 | Bering Lab | JPCen-ja | 2021/05/04 20:52:15 | 6387 | 0.885435 | NMT | Yes | Transformer Ensemble with additional crawled parallel corpus
3 | JAPIO | JPCen-ja | 2017/07/25 12:13:33 | 1445 | 0.802260 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus including some sentences in testset
4 | JAPIO | JPCen-ja | 2017/07/26 10:16:13 | 1462 | 0.781150 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus
5 | JAPIO | JPCen-ja | 2017/07/25 18:14:57 | 1454 | 0.776790 | NMT | Yes | Combination of 4 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
6 | JAPIO | JPCen-ja | 2016/08/17 11:39:13 | 1157 | 0.768690 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus including some sentences in testset
7 | JAPIO | JPCen-ja | 2017/07/25 18:10:48 | 1453 | 0.767720 | NMT | Yes | OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor
8 | JAPIO | JPCen-ja | 2016/08/17 11:36:23 | 1156 | 0.762850 | SMT | Yes | Phrase-based SMT with Preordering + JPC/JAPIO corpora
9 | EHR | JPCen-ja | 2018/09/15 15:50:14 | 2395 | 0.761200 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 18)
10 | EHR | JPCen-ja | 2018/09/13 12:50:53 | 2284 | 0.759140 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 13)
11 | EHR | JPCen-ja | 2018/09/16 15:09:19 | 2476 | 0.759120 | NMT | Yes | SMT reranked NMT (4M training data from WAT and NTCIR, Epoch 20)
12 | EHR | JPCen-ja | 2018/09/08 20:31:56 | 2248 | 0.757510 | NMT | No | SMT reranked NMT
13 | JAPIO | JPCen-ja | 2016/08/15 15:56:40 | 1141 | 0.747910 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
14 | EHR | JPCen-ja | 2017/07/19 19:08:34 | 1407 | 0.747770 | NMT | No | Simple NMT (sub word based, by OpenNMT)
15 | EHR | JPCen-ja | 2017/07/19 18:48:50 | 1406 | 0.747050 | NMT | No | SMT reranked NMT (sub word based, by Moses and OpenNMT)
16 | JAPIO | JPCen-ja | 2017/07/25 18:02:12 | 1451 | 0.746720 | NMT | No | OpenNMT(dbrnn)
17 | ORGANIZER | JPCen-ja | 2018/08/15 18:35:10 | 1964 | 0.744270 | NMT | No | NMT with Attention
18 | NICT-2 | JPCen-ja | 2016/08/05 17:56:31 | 1098 | 0.738630 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
19 | u-tkb | JPCen-ja | 2017/07/26 12:46:57 | 1470 | 0.734010 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
20 | JAPIO | JPCen-ja | 2016/10/27 13:05:35 | 1330 | 0.733020 | SMT | No | Phrase-based SMT with Preordering
21 | NICT-2 | JPCen-ja | 2016/08/04 17:23:27 | 1078 | 0.725540 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
22 | bjtu_nlp | JPCen-ja | 2016/08/08 12:26:18 | 1112 | 0.722560 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
23 | ORGANIZER | JPCen-ja | 2016/07/26 10:20:53 | 1036 | 0.719110 | Other | Yes | Online A (2016)
24 | ORGANIZER | JPCen-ja | 2016/07/13 16:52:07 | 975 | 0.717030 | SMT | No | Tree-to-String SMT
25 | Kyoto-U | JPCen-ja | 2016/07/13 17:38:24 | 986 | 0.716990 | EBMT | No | Baseline
26 | ORGANIZER | JPCen-ja | 2016/07/13 16:50:03 | 974 | 0.715300 | SMT | No | Hierarchical Phrase-based SMT
27 | ORGANIZER | JPCen-ja | 2016/07/13 16:47:25 | 973 | 0.711900 | SMT | No | Phrase-based SMT
28 | ORGANIZER | JPCen-ja | 2016/08/05 14:38:28 | 1086 | 0.675470 | Other | Yes | RBMT F (2016)
29 | ORGANIZER | JPCen-ja | 2016/08/02 10:26:14 | 1073 | 0.659950 | Other | Yes | Online B (2016)
30 | ORGANIZER | JPCen-ja | 2016/08/05 14:36:45 | 1085 | 0.647730 | Other | Yes | RBMT D (2016)
31 | ORGANIZER | JPCen-ja | 2016/08/05 14:41:32 | 1087 | 0.646930 | Other | Yes | RBMT E (2016)
32 | sarah | JPCen-ja | 2019/07/26 11:26:05 | 2973 | 0.000000 | NMT | No | Transformer, ensemble of 4 models
33 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:06:18 | 3185 | 0.000000 | NMT | No | Transformer Base, relative position, BT, r2l reranking, checkpoint ensemble
34 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:51:20 | 3193 | 0.000000 | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble
35 | goku20 | JPCen-ja | 2020/09/21 12:14:57 | 4093 | 0.000000 | NMT | No | mBART pre-training transformer, single model
36 | goku20 | JPCen-ja | 2020/09/22 00:08:01 | 4107 | 0.000000 | NMT | No | mBART pre-training transformer, ensemble of 3 models
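
AMFM (the Adequacy-Fluency Metric) combines an adequacy component, which compares the candidate and the reference in a reduced latent semantic space, with a fluency component based on a language model, through a weighted combination. The sketch below only illustrates that interpolation under loose assumptions: the latent space is a toy TF-IDF plus truncated SVD projection fit on three made-up sentences, the fluency term is a constant placeholder, and the weight lam=0.5 is arbitrary. It does not reproduce the official WAT AM-FM implementation.

```python
# Illustrative AM-FM-style score: lam * adequacy + (1 - lam) * fluency.
# The corpus, the weight, and the fluency stub are hypothetical stand-ins.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the present invention relates to a semiconductor device",
    "a method for manufacturing a display panel is disclosed",
    "the control unit transmits a signal to the memory",
]
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(corpus)
svd = TruncatedSVD(n_components=2, random_state=0).fit(tfidf)

def adequacy(candidate: str, reference: str) -> float:
    # Cosine similarity between candidate and reference in the reduced space, clipped to [0, 1].
    cand, ref = svd.transform(vectorizer.transform([candidate, reference]))
    return max(0.0, float(cosine_similarity([cand], [ref])[0, 0]))

def fluency(candidate: str) -> float:
    # Placeholder: the real FM component scores the candidate with an n-gram language model.
    return 0.75

def am_fm(candidate: str, reference: str, lam: float = 0.5) -> float:
    return lam * adequacy(candidate, reference) + (1.0 - lam) * fluency(candidate)

print(am_fm("the invention relates to a semiconductor device",
            "the present invention relates to a semiconductor device"))
```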


HUMAN (WAT2022)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2021)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2020)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2019)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | sarah | JPCen-ja | 2019/07/26 11:26:05 | 2973 | Underway | NMT | No | Transformer, ensemble of 4 models
2 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:06:18 | 3185 | Underway | NMT | No | Transformer Base, relative position, BT, r2l reranking, checkpoint ensemble
3 | KNU_Hyundai | JPCen-ja | 2019/07/27 12:51:20 | 3193 | Underway | NMT | Yes | Transformer Base (+ASPEC data), relative position, BT, r2l reranking, checkpoint ensemble


HUMAN (WAT2018)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2017)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | EHR | JPCen-ja | 2017/07/19 19:08:34 | 1407 | 60.000 | NMT | No | Simple NMT (sub word based, by OpenNMT)
2 | EHR | JPCen-ja | 2017/07/19 18:48:50 | 1406 | 58.250 | NMT | No | SMT reranked NMT (sub word based, by Moses and OpenNMT)
3 | JAPIO | JPCen-ja | 2017/07/25 18:14:57 | 1454 | 56.250 | NMT | Yes | Combination of 4 NMT systems (OpenNMT(dbrnn) + JPC/Japio corpora + NMT/rule-based posteditor)
4 | u-tkb | JPCen-ja | 2017/07/26 12:46:57 | 1470 | 49.500 | NMT | No | NMT with SMT phrase translation (phrase extraction with branching entropy; attention over bidirectional LSTMs; by Harvard NMT)
5 | JAPIO | JPCen-ja | 2017/07/26 10:16:13 | 1462 | 41.000 | SMT | Yes | Phrase-based SMT with Preordering + Japio corpus


HUMAN (WAT2016)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
1 | bjtu_nlp | JPCen-ja | 2016/08/08 12:26:18 | 1112 | 39.500 | NMT | No | RNN Encoder-Decoder with attention mechanism, single model
2 | NICT-2 | JPCen-ja | 2016/08/05 17:56:31 | 1098 | 37.750 | SMT | Yes | Phrase-based SMT with Preordering + Domain Adaptation (JPC and ASPEC) + Google 5-gram LM
3 | ORGANIZER | JPCen-ja | 2016/07/13 16:52:07 | 975 | 30.750 | SMT | No | Tree-to-String SMT
4 | NICT-2 | JPCen-ja | 2016/08/04 17:23:27 | 1078 | 30.750 | SMT | No | Phrase-based SMT with Preordering + Domain Adaptation
5 | JAPIO | JPCen-ja | 2016/08/17 11:36:23 | 1156 | 26.750 | SMT | Yes | Phrase-based SMT with Preordering + JPC/JAPIO corpora
6 | ORGANIZER | JPCen-ja | 2016/07/13 16:50:03 | 974 | 21.000 | SMT | No | Hierarchical Phrase-based SMT
7 | ORGANIZER | JPCen-ja | 2016/07/26 10:20:53 | 1036 | 20.000 | Other | Yes | Online A (2016)
8 | JAPIO | JPCen-ja | 2016/08/15 15:56:40 | 1141 | 17.750 | SMT | Yes | Phrase-based SMT with Preordering + JAPIO corpus
9 | ORGANIZER | JPCen-ja | 2016/08/05 14:38:28 | 1086 | 12.750 | Other | Yes | RBMT F (2016)


HUMAN (WAT2015)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


HUMAN (WAT2014)


# | Team | Task | Date/Time | DataID | HUMAN | Method | Other Resources | System Description
(no entries)


EVALUATION RESULTS USAGE POLICY

When you use the WAT evaluation results for any purpose, such as:
- writing technical papers,
- making presentations about your system, or
- advertising your MT system to customers,
you may use the information about translation directions, the scores (both automatic and human evaluations), and the rank of your system among the others. You may also use the scores of the other systems, but you MUST anonymize those systems' names. In addition, you may show links (URLs) to the WAT evaluation result pages.

NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2018-08-02