Dynamic clustering via asymptotics of the dependent Dirichlet process mixture
Author(s)
Campbell, Trevor David; Liu, Miao; Kulis, Brian; How, Jonathan P.; Carin, Lawrence
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
This paper presents a novel algorithm, based upon the dependent Dirichlet process mixture model (DDPMM), for clustering batch-sequential data containing an unknown number of evolving clusters. The algorithm is derived via a low-variance asymptotic analysis of the Gibbs sampling algorithm for the DDPMM, and provides a hard clustering with convergence guarantees similar to those of the k-means algorithm. Empirical results from a synthetic test with moving Gaussian clusters and a test with real ADS-B aircraft trajectory data demonstrate that the algorithm requires orders of magnitude less computational time than contemporary probabilistic and hard clustering algorithms, while providing higher accuracy on the examined datasets.
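The abstract describes a hard-clustering update obtained in the low-variance asymptotic limit of Gibbs sampling for the DDPMM. As a rough illustration only (not the authors' reference implementation), the sketch below shows a DP-means-style assignment pass over one batch, extended with a revival cost for clusters carried over from earlier batches; the function name assign_batch and the parameters lam and q_cost are hypothetical and do not follow the paper's notation.

```python
import numpy as np

def assign_batch(points, centers, lam, old_centers=(), q_cost=0.0):
    """One hard-assignment pass over a batch of points.

    points      : (n, d) array of observations in the current batch
    centers     : list of (d,) centers already active in this batch
    lam         : cost of opening a brand-new cluster
    old_centers : centers from previous batches that may be revived
    q_cost      : extra cost of reviving an old cluster (e.g. grows with its age)
    """
    points = np.atleast_2d(np.asarray(points, dtype=float))
    centers = [np.asarray(c, dtype=float) for c in centers]
    old = [np.asarray(c, dtype=float) for c in old_centers]
    labels = []
    for x in points:
        # cost of joining each currently active cluster
        costs = [np.sum((x - c) ** 2) for c in centers]
        # cost of reviving an old cluster: distance plus a revival penalty
        costs += [np.sum((x - c) ** 2) + q_cost for c in old]
        # cost of opening a new cluster
        costs.append(lam)
        k = int(np.argmin(costs))
        if k < len(centers):
            labels.append(k)                              # join active cluster
        elif k < len(centers) + len(old):
            centers.append(old.pop(k - len(centers)))     # revive old cluster
            labels.append(len(centers) - 1)
        else:
            centers.append(x.copy())                      # open new cluster
            labels.append(len(centers) - 1)
    return labels, centers
```

Alternating such assignment passes with recomputing each center as the mean of its assigned points monotonically decreases a k-means-like objective, which is the sense in which the convergence guarantees mentioned in the abstract resemble those of k-means.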
Date issued
2013
Department
Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Journal
Advances in Neural Information Processing Systems (NIPS) 26
Publisher
Neural Information Processing Systems Foundation
Citation
Campbell, Trevor, Miao Liu, Brian Kulis, Jonathan P. How, and Lawrence Carin. "Dynamic clustering via asymptotics of the dependent Dirichlet process mixture." Advances in Neural Information Processing Systems (NIPS) 26, 2013.
Version: Final published version
ISSN
1049-5258