
Long Live TIME: Improving Lifetime and Security for NVM-Based Training-in-Memory Systems

Author(s)
Lin, Yujun; Han, Song
Download: IEEE TCAD_302.pdf (2.511 MB)

Open Access Policy

Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
http://creativecommons.org/licenses/by-nc-sa/4.0/
Metadata
Abstract
Nonvolatile memory (NVM)-based training-in-memory (TIME) systems have emerged that can perform neural network (NN) training in an energy-efficient manner. However, the endurance of NVM cells is limited, raising concerns about the lifetime of TIME systems, because the weights of NN models must be updated thousands to millions of times during training. Gradient sparsification (GS) can alleviate this problem by preserving only a small portion of the gradients for updating the weights. However, conventional GS introduces nonuniform writes on different cells across the NVM crossbars, which significantly reduces the expected available lifetime. Moreover, an adversary can easily launch malicious training tasks that deliberately wear out targeted cells and quickly break down the system. In this article, we propose an efficient and effective framework, referred to as SGS-ARS, to improve the lifetime and security of TIME systems. The framework mainly consists of a structured GS (SGS) scheme that reduces the write frequency and an aging-aware row swapping (ARS) scheme that makes the writes uniform. Meanwhile, we show that the back-propagation mechanism allows an attacker to localize and repeatedly update fixed memory locations in order to wear them out. We therefore introduce Random-ARS and Refresh techniques to thwart such adversarial training attacks, preventing the system from being broken down in an extremely short time. Our experiments show that when TIME is programmed to train ResNet-50 on the ImageNet dataset, a 356× lifetime extension can be achieved without sacrificing much accuracy or incurring much hardware overhead. Even in an adversarial environment, the available lifetime of TIME systems can still be improved by 84×.
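
The abstract's two main mechanisms can be sketched in a few lines. The following Python/NumPy snippet is a minimal illustrative sketch, not the paper's implementation: the function and class names, the row_fraction knob, and the swap threshold are all assumptions made for illustration, and the actual SGS and ARS schemes operate on NVM crossbar hardware rather than NumPy arrays.

```python
import numpy as np

def structured_gradient_sparsification(grad, row_fraction=0.01):
    """Keep only the rows of `grad` with the largest L1 norm; zero the rest.

    Selecting whole rows (rather than scattered elements) confines the
    resulting weight writes to a few crossbar rows.  `row_fraction` is a
    hypothetical knob for this sketch, not a parameter from the paper.
    """
    row_scores = np.abs(grad).sum(axis=1)
    k = max(1, int(row_fraction * grad.shape[0]))
    top_rows = np.argpartition(row_scores, -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[top_rows] = grad[top_rows]
    return sparse, top_rows

class AgingAwareRowSwapper:
    """Toy wear-leveling bookkeeping: a logical-to-physical row map plus a
    per-physical-row write counter.  When the gap between the most-worn and
    least-worn physical rows exceeds a threshold, their logical mappings are
    swapped so that future writes land on the less-worn row."""

    def __init__(self, n_rows, threshold=1000):
        self.l2p = np.arange(n_rows)               # logical row -> physical row
        self.writes = np.zeros(n_rows, dtype=int)  # write count per physical row
        self.threshold = threshold

    def record_writes(self, logical_rows):
        physical = self.l2p[logical_rows]
        self.writes[physical] += 1
        hot, cold = self.writes.argmax(), self.writes.argmin()
        if self.writes[hot] - self.writes[cold] > self.threshold:
            # Swap the logical rows currently mapped onto `hot` and `cold`.
            hot_l = int(np.where(self.l2p == hot)[0][0])
            cold_l = int(np.where(self.l2p == cold)[0][0])
            self.l2p[hot_l], self.l2p[cold_l] = cold, hot

# Usage sketch: sparsify one gradient step, then record which rows were written.
grad = np.random.randn(512, 512)
sparse_grad, rows = structured_gradient_sparsification(grad, row_fraction=0.01)
swapper = AgingAwareRowSwapper(n_rows=512)
swapper.record_writes(rows)
```

The sketch only illustrates the division of labor described in the abstract: row-structured sparsification concentrates updates on a small set of rows, and the swapper's write counters supply the aging signal used to level wear; the paper's Random-ARS and Refresh defenses additionally randomize and reset this mapping to resist adversarial training.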
Date issued
2020-12
URI
https://hdl.handle.net/1721.1/129440
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Publisher
IEEE
Citation
Cai, Yi et al. “Long Live TIME: Improving Lifetime and Security for NVM-Based Training-in-Memory Systems.” IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 39, 12 (December 2020): 4707–4720. © 2020 The Author(s)
Version: Author's final manuscript
ISSN
0278-0070

Collections
  • MIT Open Access Articles
