
dc.contributor.author: Gao, Zhengqi
dc.contributor.author: Sun, Fan-keng
dc.contributor.author: Rohrer, Ron
dc.contributor.author: Boning, Duane
dc.date.accessioned: 2025-08-06T16:18:36Z
dc.date.available: 2025-08-06T16:18:36Z
dc.date.issued: 2024-10-09
dc.date.submitted: 2025-04-09
dc.identifier.isbn: 979-8-4007-1077-3
dc.identifier.uri: https://hdl.handle.net/1721.1/162215
dc.description: ICCAD ’24, October 27–31, 2024, New York, NY, USA
dc.description.abstract: In this paper, we leverage a foundational principle of analog electronic circuitry, Kirchhoff's current and voltage laws, to introduce a distinctive class of neural network models termed KirchhoffNet. Essentially, KirchhoffNet is an analog circuit that functions as a neural network, using its initial node voltages as the input and the node voltages at a specified time point as the output. The evolution of the node voltages over that interval is dictated by learnable parameters on the edges connecting the nodes. We demonstrate that KirchhoffNet is governed by a set of ordinary differential equations (ODEs) and, notably, that even in the absence of traditional layers (such as convolution layers), it attains state-of-the-art performance across diverse and complex machine learning tasks. Most importantly, KirchhoffNet can potentially be implemented as a low-power analog integrated circuit, yielding an appealing property: irrespective of the number of parameters in a KirchhoffNet, its on-chip forward calculation can always be completed within a short time. This characteristic makes KirchhoffNet a promising and fundamental paradigm for implementing large-scale neural networks, opening a new avenue in analog neural networks for AI. Our source code and model checkpoints are publicly available: https://github.com/zhengqigao/kirchhoffnet. [en_US]
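The mechanism the abstract describes, with node voltages evolving under an ODE whose right-hand side is Kirchhoff's current law at every node, can be sketched numerically as below. This is a minimal illustration under assumptions: the `tanh` branch device and forward-Euler integration are illustrative choices, not the paper's actual device model or solver (see the linked repository for the authors' implementation).

```python
import numpy as np

def branch_current(v_src, v_dst, theta):
    # Hypothetical learnable two-terminal device: the current it carries
    # depends on its terminal voltages and an edge parameter theta.
    # (Illustrative choice only; the paper defines its own device model.)
    return theta * np.tanh(v_src - v_dst)

def kirchhoffnet_forward(v0, edges, thetas, t_end=1.0, steps=100):
    """Integrate dv/dt = net current into each node (KCL, unit capacitance)
    from t=0 to t_end with forward Euler; v0 is the input, v(t_end) the output."""
    v = np.asarray(v0, dtype=float).copy()
    dt = t_end / steps
    for _ in range(steps):
        dv = np.zeros_like(v)
        for (i, j), th in zip(edges, thetas):
            current = branch_current(v[i], v[j], th)
            dv[i] -= current  # current leaving node i
            dv[j] += current  # same current entering node j (KCL bookkeeping)
        v += dt * dv
    return v

# Usage: 3 nodes in a chain; initial voltages are the "input",
# voltages at t_end are the "output".
out = kirchhoffnet_forward(v0=[1.0, 0.0, -1.0],
                           edges=[(0, 1), (1, 2)],
                           thetas=[0.5, 0.3])
```

Because every branch current is subtracted from one node and added to another, total charge is conserved during the evolution, which is one way the circuit-law structure shows up in the dynamics.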
dc.publisher: ACM | IEEE/ACM International Conference on Computer-Aided Design [en_US]
dc.relation.isversionof: https://doi.org/10.1145/3676536.3676662 [en_US]
dc.rights: Creative Commons Attribution [en_US]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [en_US]
dc.source: Association for Computing Machinery [en_US]
dc.title: KirchhoffNet: A Scalable Ultra Fast Analog Neural Network [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Gao, Zhengqi, Sun, Fan-keng, Rohrer, Ron and Boning, Duane. 2024. "KirchhoffNet: A Scalable Ultra Fast Analog Neural Network."
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.identifier.mitlicense: PUBLISHER_CC
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2025-08-01T07:53:14Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2025-08-01T07:53:15Z
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed [en_US]



