dc.contributor.advisor | Leslie P. Kaelbling. | en_US |
dc.contributor.author | Kim, Hyun Soo, M. Eng. Massachusetts Institute of Technology | en_US |
dc.contributor.other | Massachusetts Institute of Technology. Dept. of Electrical Engineering and Computer Science. | en_US |
dc.date.accessioned | 2011-02-23T14:42:05Z | |
dc.date.available | 2011-02-23T14:42:05Z | |
dc.date.copyright | 2010 | en_US |
dc.date.issued | 2010 | en_US |
dc.identifier.uri | http://hdl.handle.net/1721.1/61287 | |
dc.description | Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010. | en_US |
dc.description | Cataloged from PDF version of thesis. | en_US |
dc.description | Includes bibliographical references (p. 99-100). | en_US |
dc.description.abstract | Hidden Markov Models (HMMs) are ubiquitously used in applications such as speech recognition and gene prediction that involve inferring latent variables from observations. For the past few decades, the predominant technique for inferring these hidden variables has been the Baum-Welch algorithm. This thesis draws on insights from two related fields. The first comes from Angluin's seminal paper on learning regular sets from queries and counterexamples, which gives a simple and intuitive algorithm that efficiently learns deterministic finite automata. The second follows from a careful analysis of the representation of HMMs as matrices, recognizing that these matrices carry deeper meaning than mere representational devices. This thesis applies both Angluin's approach and nonnegative matrix factorization to learning HMMs. Angluin's approach fails, and the reasons are discussed. The matrix factorization approach succeeds, yielding a novel method of learning HMMs, which is then combined with Baum-Welch into a hybrid algorithm. We evaluate the hybrid algorithm by comparing its performance on selected HMMs to that of the Baum-Welch algorithm. We show empirically that our algorithm outperforms Baum-Welch for HMMs with at most six states that have dense output and transition matrices. For these HMMs, our algorithm performs 22.65% better on average by the Kullback-Leibler measure. | en_US |
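The Baum-Welch algorithm named in the abstract as the baseline can be sketched as follows. This is a minimal, self-contained illustration of one scaled EM iteration for a discrete-output HMM, not code from the thesis; the variable names (`pi`, `A`, `B` for the initial, transition, and output distributions) are our own notational assumptions.

```python
import numpy as np

def forward(obs, pi, A, B):
    """Scaled forward pass; returns scaled alphas, scale factors, log-likelihood."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    return alpha, scale, np.log(scale).sum()

def backward(obs, A, B, scale):
    """Scaled backward pass matching the forward scale factors."""
    T, N = len(obs), A.shape[0]
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    return beta

def baum_welch_step(obs, pi, A, B):
    """One EM iteration: E-step via forward-backward, M-step via reestimation.
    Returns updated (pi, A, B) and the log-likelihood under the *input* parameters."""
    T, N = len(obs), len(pi)
    alpha, scale, ll = forward(obs, pi, A, B)
    beta = backward(obs, A, B, scale)
    gamma = alpha * beta                      # P(state at t | obs); rows already sum to 1
    xi = np.zeros((N, N))                     # expected transition counts
    for t in range(T - 1):
        xi += alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :] / scale[t + 1]
    new_pi = gamma[0]
    new_A = xi / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[obs == k].sum(axis=0)
    new_B /= gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B, ll
```

Iterating `baum_welch_step` to convergence is the standard EM procedure; the log-likelihood is guaranteed not to decrease between iterations, which is a useful sanity check when comparing against an alternative learning method as the thesis does.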
dc.description.statementofresponsibility | by Hyun Soo Kim. | en_US |
dc.format.extent | 100 p. | en_US |
dc.language.iso | eng | en_US |
dc.publisher | Massachusetts Institute of Technology | en_US |
dc.rights | M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. | en_US |
dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
dc.subject | Electrical Engineering and Computer Science. | en_US |
dc.title | Two new approaches for learning Hidden Markov Models | en_US |
dc.title.alternative | 2 new approaches for learning HMMs | en_US |
dc.type | Thesis | en_US |
dc.description.degree | M.Eng. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
dc.identifier.oclc | 702644273 | en_US |