Show simple item record

dc.contributor.advisor  Oliva, Aude
dc.contributor.author  Lahner, Benjamin
dc.date.accessioned  2022-08-29T16:00:48Z
dc.date.available  2022-08-29T16:00:48Z
dc.date.issued  2022-05
dc.date.submitted  2022-06-21T19:25:39.641Z
dc.identifier.uri  https://hdl.handle.net/1721.1/144631
dc.description.abstract  A visual event, such as a dog running in a park, communicates complex relationships between objects and their environment. The human visual system is tasked with transforming these spatiotemporal events into meaningful outputs so we can effectively interact with our environment. To form a useful representation of the event, the visual system relies on many visual processes, from object recognition to motion perception. Thus, studying the neural correlates of visual event understanding requires brain responses that capture the entire transformation from video-based stimuli to high-level conceptual understanding. However, despite its ecological importance and computational richness, no dataset yet exists that is sufficient for studying visual event understanding. Here we release the Algonauts Action Videos (AAV) dataset, composed of high-quality functional magnetic resonance imaging (fMRI) brain responses to 1,102 richly annotated naturalistic video stimuli. We detail AAV’s experimental design and highlight its high-quality, reliable activation throughout the visual and parietal cortices. Initial analyses show that the signal contained in AAV reflects numerous visual processes representing different aspects of visual event understanding, from scene recognition to action recognition to memorability processing. Because AAV captures an ecologically relevant and complex visual process, the dataset can be used to study how various aspects of visual perception integrate to form a meaningful understanding of a video. Additionally, we demonstrate its utility as a model evaluation benchmark that bridges the gap between visual neuroscience and video-based computer vision research.
dc.publisher  Massachusetts Institute of Technology
dc.rights  In Copyright - Educational Use Permitted
dc.rights  Copyright retained by author(s)
dc.rights.uri  https://rightsstatements.org/page/InC-EDU/1.0/
dc.title  An fMRI dataset of 1,102 natural videos for visual event understanding
dc.type  Thesis
dc.description.degree  S.M.
dc.contributor.department  Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.identifier.orcid  0000-0002-1821-490X
mit.thesis.degree  Master
thesis.degree.name  Master of Science in Electrical Engineering and Computer Science

