Steps to Excellence: Simple Inference with Refined Scoring of Dependency Trees
Author(s): Zhang, Yuan; Lei, Tao; Barzilay, Regina; Jaakkola, Tommi S.; Globerson, Amir
Much of the recent work on dependency parsing has focused on solving the inherent combinatorial problems associated with rich scoring functions. In contrast, we demonstrate that highly expressive scoring functions can be used with substantially simpler inference procedures. Specifically, we introduce a sampling-based parser that can easily handle arbitrary global features. Inspired by SampleRank, we learn to take guided stochastic steps towards a high-scoring parse. We introduce two samplers for traversing the space of trees: Gibbs and Metropolis-Hastings with a random-walk proposal. The model outperforms the state of the art when evaluated on 14 languages from non-projective CoNLL datasets. Our sampling-based approach extends naturally to joint prediction scenarios, such as joint parsing and POS correction. The resulting method outperforms the best reported results on the CATiB dataset, approaching the performance of parsing with gold tags.
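The random-walk Metropolis-Hastings idea in the abstract can be illustrated with a minimal sketch. This is not the paper's actual sampler — the scoring function `score`, the head-array encoding of a parse, and the single-reattachment proposal are all illustrative assumptions; a real implementation would also enforce well-formed (acyclic, single-root) trees and use the learned SampleRank score.

```python
import math
import random

def mh_parse_step(heads, score, rng):
    """One Metropolis-Hastings random-walk step over dependency parses.

    heads[i] is the head of token i+1 (0 denotes the root).
    `score(heads)` is a hypothetical unnormalized log-score of the parse.
    NOTE: for brevity this sketch does not check tree well-formedness.
    """
    n = len(heads)
    # Propose: reattach one random token to a random new head (not itself).
    i = rng.randrange(n)
    proposal = list(heads)
    proposal[i] = rng.choice([h for h in range(n + 1) if h != i + 1])
    # The random-walk proposal is symmetric, so the acceptance probability
    # reduces to min(1, exp(score(proposal) - score(current))).
    delta = score(proposal) - score(heads)
    if delta >= 0 or rng.random() < math.exp(delta):
        return proposal
    return heads
```

Iterating this step performs a guided stochastic walk that spends more time on high-scoring parses, which is the inference regime the abstract contrasts with exact combinatorial search.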
Department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics
Association for Computational Linguistics
Zhang, Yuan, Tao Lei, Regina Barzilay, Tommi Jaakkola, and Amir Globerson. "Steps to Excellence: Simple Inference with Refined Scoring of Dependency Trees." 52nd Annual Meeting of the Association for Computational Linguistics (June 2014).
Author's final manuscript