Content Modeling Using Latent Permutations
Author(s)
Chen, Harr; Branavan, Satchuthanan R.; Barzilay, Regina; Karger, David R.
Publisher Policy
Terms of use: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
We present a novel Bayesian topic model for learning discourse-level document structure. Our model leverages insights from discourse theory to constrain latent topic assignments in a way that reflects the underlying organization of document topics. We propose a global model in which both topic selection and ordering are biased to be similar across a collection of related documents. We show that this space of orderings can be effectively represented using a distribution over permutations called the Generalized Mallows Model. We apply our method to three complementary discourse-level tasks: cross-document alignment, document segmentation, and information ordering. Our experiments show that incorporating our permutation-based model in these applications yields substantial improvements in performance over previously proposed methods.
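The abstract's Generalized Mallows Model represents a permutation by its inversion counts, with each count drawn independently from a truncated geometric distribution so that orderings close to a canonical ordering receive higher probability. As an illustrative sketch only (not the authors' code; parameter names and the identity-centered setup are assumptions), sampling from such a model might look like:

```python
import math
import random

def sample_gmm(n, rhos, rng=None):
    """Draw a permutation of 1..n from a Generalized Mallows Model
    centered at the identity ordering. rhos[j-1] is the dispersion
    parameter penalizing inversions that displace item j.
    Illustrative sketch only, not the paper's implementation."""
    rng = rng or random.Random()
    assert len(rhos) == n - 1
    # Each inversion count v_j is sampled independently from a
    # truncated geometric: P(v_j) proportional to exp(-rho_j * v_j),
    # with v_j ranging over 0..n-j.
    vs = []
    for j, rho in enumerate(rhos, start=1):
        weights = [math.exp(-rho * v) for v in range(n - j + 1)]
        r = rng.random() * sum(weights)
        v, acc = 0, weights[0]
        while acc < r:
            v += 1
            acc += weights[v]
        vs.append(v)
    # Reconstruct the permutation: insert items n-1 down to 1,
    # placing item j at position v_j among the items placed so far.
    perm = [n]
    for j in range(n - 1, 0, -1):
        perm.insert(vs[j - 1], j)
    return perm
```

Larger dispersion values concentrate mass on the canonical ordering (here the identity), which is how the model biases topic orderings to be similar across related documents.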
Date issued
2009-10
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Journal of Artificial Intelligence Research
Publisher
AI Access Foundation
Citation
H. Chen, S.R.K. Branavan, R. Barzilay and D. R. Karger (2009) "Content Modeling Using Latent Permutations", Journal of Artificial Intelligence Research. Volume 36, pages 129-163. © 2009 AI Access Foundation.
Version: Final published version
ISSN
1076-9757