
WAT 2017

Small-NMT Task

INTRODUCTION

The goal of this task is to build a small neural machine translation system while maintaining reasonable translation quality. There is high demand in industry to equip smart devices with translation capabilities. Although neural machine translation has reached the point where such capability is no longer a dream, it usually requires huge computing resources that are not available on everyday devices. The current solution is to run the translation engine on powerful servers and have the device talk to them over the Internet. However, reliable low-latency connectivity is not available in most parts of the world and will not be in the near future. If we can build a small system while keeping reasonable translation quality, it will have a huge impact on the application of machine translation.

Unfortunately, almost all research on neural machine translation is biased toward improving quality, with little consideration of the computing resources required at inference time. We hope this shared task provides a common language and shared assets for the NLP community to open a new research field, which will have a huge impact on cross-language communication in our society.

The participants are given pre-processed Japanese-English parallel data and requested to build a neural machine translation system. The participants are required to report the following in the system description paper:

The participants are strongly encouraged to make the system as small as possible. In particular, exploring various setups, including extreme ones, is highly recommended.

The participants may additionally use datasets other than the one provided by the organizers, although using the provided dataset is mandatory.
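As a rough illustration of how model size could be estimated and reported, the sketch below counts the parameters of a hypothetical RNN encoder-decoder with attention. The architecture, hyperparameter names, and example setups are illustrative assumptions only; they are not the organizers' baseline or a required reporting format.

# A minimal sketch, assuming a single-layer LSTM encoder-decoder with
# bilinear attention. All numbers and names are illustrative.

def lstm_params(input_size: int, hidden_size: int) -> int:
    # 4 gates, each with input weights, recurrent weights, and a bias.
    return 4 * (input_size * hidden_size + hidden_size * hidden_size + hidden_size)

def estimate_params(src_vocab: int, trg_vocab: int,
                    embed_size: int, hidden_size: int) -> int:
    encoder = src_vocab * embed_size + lstm_params(embed_size, hidden_size)
    decoder = (trg_vocab * embed_size
               + lstm_params(embed_size + hidden_size, hidden_size))
    attention = hidden_size * hidden_size          # bilinear attention score
    output = hidden_size * trg_vocab + trg_vocab   # softmax projection + bias
    return encoder + decoder + attention + output

if __name__ == "__main__":
    # Compare a "standard" setup with an aggressively shrunk one.
    for name, (v, e, h) in {"base":  (30000, 512, 512),
                            "small": (8000, 128, 128)}.items():
        n = estimate_params(v, v, e, h)
        print(f"{name}: {n / 1e6:.1f}M parameters "
              f"(~{n * 4 / 2**20:.0f} MiB as 32-bit floats)")

Reporting both the parameter count and the on-disk size of the final model makes it easier to compare submissions whose toolkits store checkpoints differently.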

DATA

Data for the Small-NMT Task can be downloaded from here (164MB, tar.bz2 file). This data is created from ASPEC.
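The distributed data is a bzip2-compressed tar archive, which can be unpacked with standard tools. The sketch below uses the Python standard library; the archive name "small_nmt_data.tar.bz2" and the output directory are placeholders, since the actual file name depends on the download.

# A minimal sketch, assuming the downloaded archive is named
# "small_nmt_data.tar.bz2" (placeholder name).
import tarfile

with tarfile.open("small_nmt_data.tar.bz2", "r:bz2") as tar:
    print(tar.getnames()[:5])   # peek at the first few files in the archive
    tar.extractall("data")      # unpack everything into ./data/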

Details of the data:

Please cite the following paper when you use this data.

@inproceedings{nakazawa-etal-2017-overview,
    title = "Overview of the 4th Workshop on {A}sian Translation",
    author = "Nakazawa, Toshiaki  and
      Higashiyama, Shohei  and
      Ding, Chenchen  and
      Mino, Hideya  and
      Goto, Isao  and
      Kazawa, Hideto  and
      Oda, Yusuke  and
      Neubig, Graham  and
      Kurohashi, Sadao",
    booktitle = "Proceedings of the 4th Workshop on {A}sian Translation ({WAT}2017)",
    month = nov,
    year = "2017",
    address = "Taipei, Taiwan",
    publisher = "Asian Federation of Natural Language Processing",
    url = "https://www.aclweb.org/anthology/W17-5701",
    pages = "1--54",
}
    

BASELINE SETTINGS

You can download the configuration file for nmtkit and the training logs of the En -> Ja baseline model here.
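Beyond shrinking the architecture itself, post-training weight quantization is one common way to reduce a trained model's on-disk and in-memory footprint. The sketch below is independent of nmtkit and of the baseline above; it assumes the trained weights are available as NumPy arrays exported from whatever toolkit was used, and is meant only to illustrate the idea, not a prescribed method.

# A minimal sketch of 8-bit post-training weight quantization.
# Assumes weights are exported as float32 NumPy arrays (illustrative only).
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float32 weights to int8 plus a per-matrix scale factor."""
    scale = float(np.abs(w).max()) / 127.0
    if scale == 0.0:
        scale = 1.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(512, 512).astype(np.float32)
    q, scale = quantize_int8(w)
    err = np.abs(w - dequantize(q, scale)).mean()
    print(f"size: {w.nbytes} -> {q.nbytes} bytes, mean abs error {err:.4f}")

This trades a small amount of numerical precision for roughly a 4x reduction in storage; whether the translation quality holds up is exactly the kind of trade-off this task asks participants to report.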

REGISTRATION

No registration is needed. Just download the data and try to minimize the NMT models. Please describe your method and report the results in the system description paper.

DEADLINE

The deadline of the Small-NMT task is September 5, 2017, which is the same as the deadline for the system description paper.

CONTACT

For questions, comments, etc., please email "wat -at- nlp -dot- ist -dot- i -dot- kyoto -hyphen- u -dot- ac -dot- jp".


JST (Japan Science and Technology Agency)
NICT (National Institute of Information and Communications Technology)
Kyoto University
Last Modified: 2017-08-02