A Systematic Assessment of Syntactic Generalization in Neural Language Models
Author(s)
Hu, Jennifer; Gauthier, Jon; Qian, Peng; Levy, Roger P.
Download
Published version (589.6 KB)
Publisher Policy
Terms of use
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
While state-of-the-art neural network models continue to achieve lower perplexity scores on language modeling benchmarks, it remains unknown whether optimizing for broad-coverage predictive performance leads to human-like syntactic knowledge. Furthermore, existing work has not provided a clear picture about the model properties required to produce proper syntactic generalizations. We present a systematic evaluation of the syntactic knowledge of neural language models, testing 20 combinations of model types and data sizes on a set of 34 English-language syntactic test suites. We find substantial differences in syntactic generalization performance by model architecture, with sequential models underperforming other architectures. Factorially manipulating model architecture and training dataset size (1M-40M words), we find that variability in syntactic generalization performance is substantially greater by architecture than by dataset size for the corpora tested in our experiments. Our results also reveal a dissociation between perplexity and syntactic generalization performance.
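The evaluation described in the abstract scores a model not by perplexity but by whether its word-level surprisals satisfy the predictions of controlled syntactic test suites. As a rough illustration only (not the authors' released evaluation code), the hypothetical sketch below scores one subject-verb agreement item by checking whether a pretrained language model assigns lower surprisal to the grammatical verb than to the ungrammatical one; the GPT-2 model, the Hugging Face transformers API usage, and the example sentence are assumptions chosen for illustration.

```python
# Hypothetical sketch: score one subject-verb agreement test item by comparing
# model surprisal at the critical verb across a grammatical and an
# ungrammatical condition. GPT-2 via Hugging Face transformers is used here
# purely as an illustrative stand-in for the models evaluated in the paper.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def continuation_surprisal(prefix: str, continuation: str) -> float:
    """Total surprisal, in bits, of `continuation` given `prefix`."""
    prefix_ids = tokenizer.encode(prefix)
    cont_ids = tokenizer.encode(continuation)
    input_ids = torch.tensor([prefix_ids + cont_ids])
    with torch.no_grad():
        logits = model(input_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    surprisal = 0.0
    for i, tok in enumerate(cont_ids):
        # Logits at position p predict the token at position p + 1.
        pos = len(prefix_ids) + i - 1
        surprisal -= log_probs[0, pos, tok].item() / math.log(2)
    return surprisal


# One agreement item: the model "passes" if the grammatical verb form
# is less surprising than the ungrammatical one in the same context.
prefix = "The keys to the cabinet"
grammatical = continuation_surprisal(prefix, " are")
ungrammatical = continuation_surprisal(prefix, " is")
print(f"grammatical: {grammatical:.2f} bits, "
      f"ungrammatical: {ungrammatical:.2f} bits, "
      f"pass: {grammatical < ungrammatical}")
```

A full test suite aggregates many such items per syntactic phenomenon, so a model's score reflects the proportion of items whose predicted surprisal inequalities hold, independently of its overall perplexity.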
Date issued
2020-07
Department
Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Journal
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Publisher
Association for Computational Linguistics (ACL)
Citation
Hu, Jennifer et al. “A Systematic Assessment of Syntactic Generalization in Neural Language Models.” Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (July 2020): 1725–1744. © 2020 The Author(s)
Version: Final published version