| dc.contributor.author | Gao, Zhengqi | |
| dc.contributor.author | Sun, Fan-keng | |
| dc.contributor.author | Rohrer, Ron | |
| dc.contributor.author | Boning, Duane | |
| dc.date.accessioned | 2025-08-06T16:18:36Z | |
| dc.date.available | 2025-08-06T16:18:36Z | |
| dc.date.issued | 2024-10-09 | |
| dc.date.submitted | 2025-04-09 | |
| dc.identifier.isbn | 979-8-4007-1077-3 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/162215 | |
| dc.description | ICCAD ’24, October 27–31, 2024, New York, NY, USA | |
| dc.description.abstract | In this paper, we leverage a foundational principle of analog electronic circuitry, Kirchhoff's current and voltage laws, to introduce a distinctive class of neural network models termed KirchhoffNet. Essentially, KirchhoffNet is an analog circuit that functions as a neural network, using its initial node voltages as the neural network input and the node voltages at a specified time point as the output. The evolution of the node voltages over this time interval is dictated by learnable parameters on the edges connecting nodes. We demonstrate that KirchhoffNet is governed by a set of ordinary differential equations (ODEs) and, notably, that even in the absence of traditional layers (such as convolution layers), it attains state-of-the-art performance across diverse and complex machine learning tasks. Most importantly, KirchhoffNet can potentially be implemented as a low-power analog integrated circuit, which leads to an appealing property: irrespective of the number of parameters in a KirchhoffNet, its on-chip forward calculation can always be completed within a short time. This characteristic makes KirchhoffNet a promising and fundamental paradigm for implementing large-scale neural networks, opening a new avenue in analog neural networks for AI. Our source code and model checkpoints are publicly available: https://github.com/zhengqigao/kirchhoffnet. | en_US |
| dc.publisher | ACM|IEEE/ACM International Conference on Computer-Aided Design | en_US |
| dc.relation.isversionof | https://doi.org/10.1145/3676536.3676662 | en_US |
| dc.rights | Creative Commons Attribution | en_US |
| dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | en_US |
| dc.source | Association for Computing Machinery | en_US |
| dc.title | KirchhoffNet: A Scalable Ultra Fast Analog Neural Network | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Gao, Zhengqi, Sun, Fan-keng, Rohrer, Ron, and Boning, Duane. 2024. "KirchhoffNet: A Scalable Ultra Fast Analog Neural Network." | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | en_US |
| dc.identifier.mitlicense | PUBLISHER_CC | |
| dc.eprint.version | Final published version | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2025-08-01T07:53:14Z | |
| dc.language.rfc3066 | en | |
| dc.rights.holder | The author(s) | |
| dspace.date.submission | 2025-08-01T07:53:15Z | |
| mit.license | PUBLISHER_CC | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |