Steps to Excellence: Simple Inference with Refined Scoring of Dependency Trees
Author(s)
Zhang, Yuan; Lei, Tao; Barzilay, Regina; Jaakkola, Tommi S.; Globerson, Amir
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Much of the recent work on dependency parsing has been focused on solving inherent combinatorial problems associated with rich scoring functions. In contrast, we demonstrate that highly expressive scoring functions can be used with substantially simpler inference procedures. Specifically, we introduce a sampling-based parser that can easily handle arbitrary global features. Inspired by SampleRank, we learn to take guided stochastic steps towards a high scoring parse. We introduce two samplers for traversing the space of trees, Gibbs and Metropolis-Hastings with Random Walk. The model outperforms state-of-the-art results when evaluated on 14 languages of non-projective CoNLL datasets. Our sampling-based approach naturally extends to joint prediction scenarios, such as joint parsing and POS correction. The resulting method outperforms the best reported results on the CATiB dataset, approaching performance of parsing with gold tags.
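To make the abstract's inference procedure concrete, the following is a minimal, hypothetical sketch of a Metropolis-Hastings random-walk move over dependency trees: pick a word, propose a new head, keep the proposal only if it preserves treeness, and accept it based on the score gap. The function names (mh_random_walk, is_tree), the toy short-arc scoring function, and the temperature parameter are illustrative placeholders only; the paper's actual parser learns its scoring function with SampleRank-style updates and also provides a Gibbs sampler, neither of which is reproduced here.

import math
import random

def is_tree(heads):
    """Check that every word reaches the root (index 0) without a cycle."""
    n = len(heads)  # heads[i] is the head of word i+1; 0 denotes the root
    for i in range(1, n + 1):
        seen, node = set(), i
        while node != 0:
            if node in seen:
                return False
            seen.add(node)
            node = heads[node - 1]
    return True

def score(heads, sentence):
    """Placeholder for an arbitrary global scoring function (toy: prefer short arcs)."""
    return -sum(abs(h - i) for i, h in enumerate(heads, start=1))

def mh_random_walk(sentence, n_iters=1000, temperature=1.0, rng=random):
    """Metropolis-Hastings with a random-walk proposal over head assignments."""
    n = len(sentence)
    heads = [0] * n                      # start from a trivial tree: all words attached to root
    current = score(heads, sentence)
    for _ in range(n_iters):
        i = rng.randrange(1, n + 1)      # pick a word at random
        h = rng.choice([j for j in range(n + 1) if j != i])  # propose a new head for it
        proposal = heads[:]
        proposal[i - 1] = h
        if not is_tree(proposal):        # discard proposals that break treeness
            continue
        proposed = score(proposal, sentence)
        # symmetric proposal, so standard Metropolis acceptance on the score gap
        accept_prob = math.exp(min(0.0, (proposed - current) / temperature))
        if rng.random() < accept_prob:
            heads, current = proposal, proposed
    return heads, current

For example, mh_random_walk(["The", "dog", "barks"]) returns a head vector such as [2, 3, 0] together with its score; because the scoring function is a black box inside the acceptance step, arbitrary global features can be plugged in without changing the inference loop, which is the point the abstract makes.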
Date issued
2014-06
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics
Publisher
Association for Computational Linguistics
Citation
Zhang, Yuan, Tao Lei, Regina Barzilay, Tommi Jaakkola, and Amir Globerson. "Steps to Excellence: Simple Inference with Refined Scoring of Dependency Trees." 52nd Annual Meeting of the Association for Computational Linguistics (June 2014).
Version: Author's final manuscript