Automatically Extracting Challenge Sets for Non-Local Phenomena in Neural Machine Translation

Leshem Choshen, Omri Abend


Abstract
We show that the state-of-the-art Transformer MT model is not biased towards monotonic reordering (unlike previous recurrent neural network models), but that nevertheless, long-distance dependencies remain a challenge for the model. Since most dependencies are short-distance, common evaluation metrics will be little influenced by how well systems perform on them. We therefore propose an automatic approach for extracting challenge sets rich with long-distance dependencies, and argue that evaluation using this methodology provides a complementary perspective on system performance. To support our claim, we compile challenge sets for English-German and German-English, which are much larger than any previously released challenge set for MT. The extracted sets are large enough to allow reliable automatic evaluation, which makes the proposed approach a scalable and practical solution for evaluating MT performance on the long-tail of syntactic phenomena.
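
The abstract does not spell out the extraction criteria, but the core idea of filtering a corpus for sentences rich in long-distance dependencies can be illustrated with a minimal Python sketch. The sketch below assumes spaCy and its small English model; the distance threshold, the parser, and the helper names are illustrative assumptions, not the authors' actual method.

# Minimal sketch (not the paper's exact procedure): keep sentences whose
# longest head-dependent arc exceeds a token-distance threshold, as a
# rough proxy for "rich in long-distance dependencies".
import spacy

nlp = spacy.load("en_core_web_sm")  # any dependency parser would do

def max_arc_length(sentence: str) -> int:
    """Longest distance (in tokens) between a word and its syntactic head."""
    doc = nlp(sentence)
    return max((abs(tok.i - tok.head.i) for tok in doc), default=0)

def extract_challenge_set(sentences, threshold=8):
    """Keep only sentences containing at least one long-distance dependency."""
    return [s for s in sentences if max_arc_length(s) >= threshold]

corpus = [
    "The cat sat on the mat.",
    "The report that the committee had commissioned last year finally appeared.",
]
# Under a typical parse, only the second sentence survives: the arc from
# "report" to its head "appeared" spans nine tokens.
print(extract_challenge_set(corpus))

Because the filter is fully automatic, it scales to whole parallel corpora, which is what makes the resulting challenge sets large enough for reliable automatic evaluation.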
Anthology ID:
K19-1028
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
291–303
URL:
https://aclanthology.org/K19-1028/
DOI:
10.18653/v1/K19-1028
Cite (ACL):
Leshem Choshen and Omri Abend. 2019. Automatically Extracting Challenge Sets for Non-Local Phenomena in Neural Machine Translation. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 291–303, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Automatically Extracting Challenge Sets for Non-Local Phenomena in Neural Machine Translation (Choshen & Abend, CoNLL 2019)
PDF:
https://aclanthology.org/K19-1028.pdf
Supplementary material:
K19-1028.Supplementary_Material.pdf
