DSpace@MIT


Hierarchical Mixtures of Experts and the EM Algorithm

Author(s)
Jordan, Michael I.; Jacobs, Robert A.
Download
AIM-1440.ps.Z (185.6 KB)
Additional downloads
AIM-1440.pdf (662.9 KB)
Abstract
We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIMs). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
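
To make the abstract's model concrete, here is a minimal sketch (not the authors' code) of the EM loop for a one-level mixture of linear experts with a softmax gating network. It makes several simplifying assumptions not taken from the memo: the mixture is flat rather than hierarchical, the experts are unit-variance Gaussian GLIMs, and a single gradient-ascent step stands in for the IRLS fit the memo uses for the gating network in the M-step. All names (em_mixture_of_experts, V, W, h) are illustrative.

    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def em_mixture_of_experts(X, y, n_experts=3, n_iters=50, lr=0.5, seed=0):
        # X: (n, d) inputs, y: (n,) targets. Flat mixture of linear experts;
        # a hedged sketch of the EM scheme the abstract describes, not the memo's code.
        n, d = X.shape
        rng = np.random.default_rng(seed)
        V = rng.normal(scale=0.1, size=(d, n_experts))  # gating-network weights
        W = rng.normal(scale=0.1, size=(d, n_experts))  # expert (linear-Gaussian) weights
        for _ in range(n_iters):
            g = softmax(X @ V)   # gating probabilities g_i(x)
            mu = X @ W           # each expert's linear prediction
            # E-step: posterior responsibility of expert i for each case,
            # assuming unit-variance Gaussian experts.
            lik = np.exp(-0.5 * (y[:, None] - mu) ** 2)
            h = g * lik + 1e-12
            h /= h.sum(axis=1, keepdims=True)
            # M-step for the experts: weighted least squares with the
            # posteriors h as case weights (ridge term only for stability).
            for i in range(n_experts):
                Xh = X * h[:, i:i + 1]
                W[:, i] = np.linalg.solve(X.T @ Xh + 1e-8 * np.eye(d), Xh.T @ y)
            # M-step for the gating GLIM: the memo solves this by IRLS; here a
            # single gradient step on sum_n sum_i h_ni * log g_ni stands in.
            V += lr * (X.T @ (h - g)) / n
        return V, W

Extending this to the memo's two-level hierarchy amounts to nesting a second softmax gate inside each branch: the E-step then yields joint posteriors (the product of top-level and nested responsibilities), and both gating levels and all experts are refit with those posteriors as case weights.
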
Date issued
1993-08-01
URI
http://hdl.handle.net/1721.1/7206
Other identifiers
AIM-1440
CBCL-083
Series/Report no.
AIM-1440
CBCL-083
Keywords
supervised learning, statistics, decision trees, neural networks

Collections
  • AI Memos (1959 - 2004)
  • CBCL Memos (1993 - 2004)

Content created by the MIT Libraries, CC BY-NC unless otherwise noted. Notify us about copyright concerns.