Advances in Hierarchical Probabilistic Multimodal Data Fusion
Author(s)
Dean, Christopher L.
Advisor
Fisher III, John W.
Abstract
Multimodal data fusion is the process of integrating disparate data sources into a shared representation suitable for complex reasoning. As a result, one can make more precise inferences about the underlying phenomenon than is possible with any single data source in isolation. In this thesis we adopt a Bayesian view of multimodal data fusion, which formulates reasoning as posterior inference over latent variables. Within the Bayesian setting we present a novel method for data integration that we call lightweight data fusion (LDF). LDF addresses the case where the forward model for a subset of the data sources is unknown or poorly characterized. LDF leverages the remaining data sources to learn an inverse model suitable for posterior inference that combines both types of data. Additionally, we develop a multimodal extension to hierarchical Dirichlet processes (mmHDPs) where, in contrast to the setting for LDF, we lack observation-level correspondences across modalities and the data arise from an implicit latent variable model. Finally, we develop a novel representation for Dirichlet process and HDP mixture models that enables parallelization during inference and extends to more complex models, including mmHDPs.
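The Bayesian view described above — fusing modalities by combining their likelihoods in a single posterior over a latent variable — can be illustrated with a minimal sketch. This is not the thesis's LDF method; it is a generic grid-based Bayes-rule example with assumed Gaussian forward models for two hypothetical modalities, showing that the fused posterior is more concentrated than a single-modality posterior.

```python
import numpy as np

# Illustrative sketch (not the thesis's algorithm): infer a scalar latent x
# from two modalities y1, y2, each with an assumed known Gaussian forward
# model. Fusing both likelihoods tightens the posterior.

x = np.linspace(-5.0, 5.0, 1001)                 # grid over the latent variable
prior = np.ones_like(x) / x.size                 # flat prior on the grid

def gaussian_lik(y, sigma):
    """Likelihood p(y | x) on the grid, Gaussian with std sigma."""
    return np.exp(-0.5 * ((y - x) / sigma) ** 2)

def grid_posterior(prior, likelihoods):
    """Bayes rule on a grid: prior times the product of all likelihoods."""
    post = prior * np.prod(np.stack(likelihoods), axis=0)
    return post / post.sum()

def grid_variance(p):
    mean = (x * p).sum()
    return (((x - mean) ** 2) * p).sum()

# Hypothetical observations with modality-specific noise levels.
lik1 = gaussian_lik(0.8, sigma=1.0)
lik2 = gaussian_lik(1.2, sigma=0.5)

post_single = grid_posterior(prior, [lik1])        # one modality alone
post_fused = grid_posterior(prior, [lik1, lik2])   # both modalities fused

# The fused posterior has lower variance than the single-modality one.
print(grid_variance(post_fused) < grid_variance(post_single))  # True
```

With Gaussian likelihoods the precisions add (here 1 + 4 = 5), so the fused posterior variance drops from roughly 1.0 to roughly 0.2, which is the quantitative sense in which fusion yields "more precise inferences."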
Date issued
2022-05
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology