Generative models for neural time series with structured domain priors
Author(s)
Song, Andrew Hyungsuk
Advisor
Brown, Emery N.
Terms of use
Abstract
When I first set out to do research at the intersection of statistical signal processing and neuroscience (neural signal processing), my research advisor, Professor Emery N. Brown, explained at length that signals from seemingly complex neural and biological systems are not purely random, but instead contain latent structures that can be recovered with principled approaches. This insight has stayed with me ever since, and my research throughout graduate school has been devoted to understanding and practicing what I consider the appropriate neural signal processing framework. In this thesis, I define this framework from a Bayesian/optimization perspective and emphasize translating and integrating clinical and scientific domain knowledge, obtained through constant interaction and collaboration with experimental neuroscientists and clinicians. The thesis focuses specifically on uncovering latent structures in neural time series data by using domain priors and constraints such as Gaussian processes, shift-invariance, sparsity, and smoothness, among many others. The thesis demonstrates that a Bayesian approach with careful integration of these constraints yields structures in the data that are not only interpretable but also perform better on the metrics of interest.
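To give a concrete sense of how a smoothness-type domain prior recovers latent structure from a noisy recording, the sketch below applies a Gaussian-process posterior mean with a squared-exponential kernel to a simulated neural trace. This is an illustrative example only, not code from the thesis; the kernel length scale, noise variance, and simulated signal are all assumptions chosen for demonstration.

import numpy as np

def rbf_kernel(x1, x2, length_scale=0.1, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of time points;
    # the length scale encodes the assumed smoothness of the latent signal.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(t_obs, y_obs, t_query, noise_var=0.05):
    # Posterior mean of the GP prior given noisy observations y_obs at t_obs.
    K = rbf_kernel(t_obs, t_obs) + noise_var * np.eye(len(t_obs))
    K_star = rbf_kernel(t_query, t_obs)
    return K_star @ np.linalg.solve(K, y_obs)

# Simulated "neural" trace: a smooth damped oscillation plus observation noise
# (purely hypothetical data, used only to exercise the prior).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
latent = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)
y = latent + 0.3 * rng.standard_normal(t.size)

# The smoothness prior suppresses the noise and returns an estimate of the latent structure.
estimate = gp_posterior_mean(t, y, t)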
Date issued
2022-02
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology