Show simple item record

dc.contributor.author  Nayebi, Aran
dc.contributor.author  Sagastuy-Brena, Javier
dc.contributor.author  Bear, Daniel M
dc.contributor.author  Kar, Kohitij
dc.contributor.author  Kubilius, Jonas
dc.contributor.author  Ganguli, Surya
dc.contributor.author  Sussillo, David
dc.contributor.author  DiCarlo, James J
dc.contributor.author  Yamins, Daniel LK
dc.date.accessioned  2023-03-27T11:36:02Z
dc.date.available  2023-03-27T11:36:02Z
dc.date.issued  2022
dc.identifier.uri  https://hdl.handle.net/1721.1/148761
dc.description.abstract  Abstract: The ventral visual stream enables humans and nonhuman primates to effortlessly recognize objects across a multitude of viewing conditions, yet the computational role of its abundant feedback connections remains unclear. Prior studies have augmented feedforward convolutional neural networks (CNNs) with recurrent connections to study their role in visual processing; however, these recurrent networks are often optimized directly on neural data, or the comparative metrics used are undefined for standard feedforward networks that lack such connections. In this work, we develop task-optimized convolutional recurrent (ConvRNN) network models that more closely mimic the timing and gross neuroanatomy of the ventral pathway. Properly chosen intermediate-depth ConvRNN circuit architectures, which incorporate mechanisms of feedforward bypassing and recurrent gating, can achieve high performance on a core recognition task, comparable to that of much deeper feedforward networks. We then develop methods that allow us to compare both CNNs and ConvRNNs to fine-grained measurements of primate categorization behavior and neural response trajectories across thousands of stimuli. We find that high-performing ConvRNNs provide a better match to these data than feedforward networks of any depth, predicting the precise timings at which each stimulus is behaviorally decoded from neural activation patterns. Moreover, these ConvRNN circuits consistently produce quantitatively accurate predictions of neural dynamics from V4 and IT across the entire stimulus presentation. In fact, we find that the highest-performing ConvRNNs, which best match neural and behavioral data, also achieve a strong Pareto trade-off between task performance and overall network size. Taken together, our results suggest the functional purpose of recurrence in the ventral pathway is to fit a high-performing network in cortex, attaining computational power through temporal rather than spatial complexity.  en_US
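The recurrent gating mechanism mentioned in the abstract can be illustrated with a toy sketch: a minimal, single-channel convolutional recurrent cell in which a learned gate decides, per spatial location, how much of the previous hidden state to retain versus how much new feedforward drive to admit. This is a hypothetical illustration under invented names (`gated_conv_rnn_step` and the weight arguments are not from the paper), not the authors' actual ConvRNN architecture:

```python
import numpy as np

def conv2d_same(x, w):
    """Naive 'same'-padded single-channel 2D convolution (explicit loops)."""
    kh, kw = w.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * w)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_conv_rnn_step(x, h, w_in, w_rec, w_gate):
    """One time step of a toy gated convolutional recurrent cell.

    The gate g (in (0, 1) at each spatial location) mixes the previous
    hidden state h with new tanh-squashed feedforward/recurrent drive.
    All weights here are illustrative, not the paper's trained parameters.
    """
    g = sigmoid(conv2d_same(x, w_gate) + conv2d_same(h, w_gate))
    drive = np.tanh(conv2d_same(x, w_in) + conv2d_same(h, w_rec))
    return g * h + (1.0 - g) * drive

def unroll(x, steps, w_in, w_rec, w_gate):
    """Run the cell for a fixed number of time steps from a zero state."""
    h = np.zeros_like(x)
    for _ in range(steps):
        h = gated_conv_rnn_step(x, h, w_in, w_rec, w_gate)
    return h
```

Because the gate convexly combines a bounded drive with the previous state, the hidden activations stay bounded over the unrolled time steps, loosely mirroring how recurrence adds temporal rather than spatial depth.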
dc.language.iso  en
dc.publisher  MIT Press  en_US
dc.relation.isversionof  10.1162/NECO_A_01506  en_US
dc.rights  Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.  en_US
dc.source  MIT Press  en_US
dc.title  Recurrent Connections in the Primate Ventral Visual Stream Mediate a Trade-Off Between Task Performance and Network Size During Core Object Recognition  en_US
dc.type  Article  en_US
dc.identifier.citation  Nayebi, Aran, Sagastuy-Brena, Javier, Bear, Daniel M, Kar, Kohitij, Kubilius, Jonas et al. 2022. "Recurrent Connections in the Primate Ventral Visual Stream Mediate a Trade-Off Between Task Performance and Network Size During Core Object Recognition." Neural Computation, 34 (8).
dc.contributor.department  Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences  en_US
dc.relation.journal  Neural Computation  en_US
dc.eprint.version  Final published version  en_US
dc.type.uri  http://purl.org/eprint/type/JournalArticle  en_US
eprint.status  http://purl.org/eprint/status/PeerReviewed  en_US
dc.date.updated  2023-03-24T17:44:06Z
dspace.orderedauthors  Nayebi, A; Sagastuy-Brena, J; Bear, DM; Kar, K; Kubilius, J; Ganguli, S; Sussillo, D; DiCarlo, JJ; Yamins, DLK  en_US
dspace.date.submission  2023-03-24T17:44:08Z
mit.journal.volume  34  en_US
mit.journal.issue  8  en_US
mit.license  PUBLISHER_POLICY
mit.metadata.status  Authority Work and Publication Information Needed  en_US

