Stochastic In-memory Computing Using Magnetic Tunnel Junctions
Author(s)
Wang, Qiuyuan
Advisor
Liu, Luqiao
Abstract
Current computing hardware, based on the von Neumann architecture and digital CMOS circuits, faces serious challenges in scaling up to large AI models and data-centric applications. Although alternative computing paradigms are being actively studied, it remains unclear which one is the best solution when fabrication maturity, scalability, operating conditions, cost, and power/area efficiency are all taken into account. In this thesis, we propose an alternative computing framework: stochastic in-memory computing using magnetic tunnel junctions. By introducing thermally stable and thermally unstable magnetic tunnel junctions as CMOS-compatible circuit building blocks, both general-purpose and application-specific in-memory computing accelerators can be synthesized, providing a versatile and highly efficient hardware design framework for multiple applications. A deep learning accelerator following the proposed stochastic in-memory computing architecture is implemented and benchmarked on an FPGA, with stochastic bitstreams sampled from thermally unstable magnetic tunnel junctions fabricated in our lab. Hardware designs for a Bayesian inference accelerator and an Ising machine are also provided. Our results show that magnetic tunnel junctions could open up a rich design space for future computing hardware.
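The abstract does not spell out the arithmetic behind bitstream-based computing, so a minimal sketch may help. The Python snippet below is an illustration rather than code from the thesis: it shows the standard unipolar stochastic-computing multiplication that accelerators of this kind build on, where values in [0, 1] are encoded as Bernoulli bitstreams and multiplied with a bitwise AND. In hardware the random bits would be sampled from thermally unstable magnetic tunnel junctions; here numpy's pseudorandom generator stands in for that entropy source, and the names bernoulli_stream and sc_multiply are hypothetical.

# Sketch of unipolar stochastic-computing multiplication (not from the thesis).
# A PRNG substitutes for the thermally unstable MTJ entropy source.
import numpy as np

def bernoulli_stream(p: float, n_bits: int, rng: np.random.Generator) -> np.ndarray:
    """Encode a value p in [0, 1] as a length-n_bits Bernoulli bitstream."""
    return (rng.random(n_bits) < p).astype(np.uint8)

def sc_multiply(p1: float, p2: float, n_bits: int = 10_000, seed: int = 0) -> float:
    """Multiply two unipolar stochastic numbers with a bitwise AND.

    For independent streams, P(a AND b) = P(a) * P(b), so the mean of the
    ANDed stream is an unbiased estimate of the product p1 * p2.
    """
    rng = np.random.default_rng(seed)
    a = bernoulli_stream(p1, n_bits, rng)
    b = bernoulli_stream(p2, n_bits, rng)
    return float(np.mean(a & b))

print(sc_multiply(0.6, 0.5))  # ~0.30, with sampling noise that shrinks as n_bits grows

With 10,000 bits the estimate typically lands within about 0.01 of the true product 0.30; longer streams reduce sampling noise at the cost of latency, which is the usual accuracy/latency tradeoff in stochastic computing.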
Date issued
2024-05
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology