DSpace@MIT

Simple, fast, and flexible framework for matrix completion with infinite width neural networks

Author(s)
Radhakrishnan, Adityanarayanan; Stefanakis, George; Belkin, Mikhail; Uhler, Caroline
Download
Published version (2.548 MB)
Publisher with Creative Commons License
Creative Commons Attribution
Terms of use
Creative Commons Attribution-NonCommercial-NoDerivs License http://creativecommons.org/licenses/by-nc-nd/4.0/
Abstract
Significance: Matrix completion is a fundamental problem in machine learning that arises in various applications. We envision that our infinite width neural network framework for matrix completion will be easily deployable and produce strong baselines for a wide range of applications at limited computational cost. We demonstrate the flexibility of our framework through competitive results on virtual drug screening and image inpainting/reconstruction. Its simplicity and speed are showcased by the fact that most results in this work require only a central processing unit and commodity hardware. Through its connection to semisupervised learning, our framework provides a principled approach to matrix completion that can be easily applied to problems well beyond the image completion and virtual drug screening tasks considered in this paper.
Date issued
2022-04-19
URI
https://hdl.handle.net/1721.1/143919
Department
Massachusetts Institute of Technology. Laboratory for Information and Decision Systems; Massachusetts Institute of Technology. Institute for Data, Systems, and Society
Journal
Proceedings of the National Academy of Sciences
Publisher
Proceedings of the National Academy of Sciences
Citation
Radhakrishnan, Adityanarayanan, Stefanakis, George, Belkin, Mikhail and Uhler, Caroline. 2022. "Simple, fast, and flexible framework for matrix completion with infinite width neural networks." Proceedings of the National Academy of Sciences, 119 (16).
Version: Final published version

Collections
  • MIT Open Access Articles

Content created by the MIT Libraries, CC BY-NC unless otherwise noted.