Optimizing Write Fidelity of MRAMs by Alternating Water-filling Algorithm

Title
Optimizing Write Fidelity of MRAMs by Alternating Water-filling Algorithm
Author(s)
Kim, Yongjune; Jeon, Yoocharn; Choi, Hyeokjin; Guyot, Cyril; Cassuto, Yuval
Issued Date
2022-09
Citation
IEEE Transactions on Communications, v.70, no.9, pp.5825 - 5836
Type
Article
Author Keywords
approximate memory; approximate storage; biconvex optimization; information theory; iterative algorithms; magnetic random-access memory (MRAM); magnetic tunneling; magnetization; optimization; resistance; signal processing algorithms; switches
ISSN
0090-6778
Abstract
Magnetic random-access memory (MRAM) is a promising memory technology due to its high density, non-volatility, and high endurance. However, achieving high memory fidelity incurs high write-energy costs, which should be reduced for large-scale deployment of MRAMs. In this paper, we formulate a biconvex optimization problem to optimize write fidelity under energy and latency constraints. The basic idea is to allocate non-uniform write pulses depending on the importance of each bit position. The fidelity measure we consider is mean squared error (MSE), for which we optimize write pulses via alternating convex search (ACS). We derive analytic solutions and propose an alternating water-filling algorithm by casting the MRAM's write operation as communication over parallel channels. Hence, the proposed alternating water-filling algorithm is computationally more efficient than the original ACS while producing identical solutions. Since the formulated biconvex problem is non-convex, neither the original ACS nor the proposed algorithm guarantees global optimality. However, the MSEs obtained by the proposed algorithm are comparable to those obtained by sophisticated global nonlinear programming solvers. Furthermore, we prove that our algorithm can reduce the MSE exponentially with the number of bits per word. For an 8-bit accessed word, the proposed algorithm reduces the MSE by a factor of 21. We also evaluate classification on the MNIST dataset, assuming that the model parameters of deep neural networks are stored in MRAMs. The numerical results show that the optimized write pulses can achieve 40% write-energy reduction for the same classification accuracy.
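The abstract casts the write-pulse allocation as water-filling over parallel channels. As a point of reference for readers unfamiliar with the technique, the following is a minimal sketch of classical water-filling power allocation; it illustrates the general principle (spend the budget preferentially on the cleaner channels) and is not the paper's bit-position-specific alternating algorithm:

```python
def water_filling(noise, budget):
    """Classical water-filling: split a total power 'budget' across
    parallel channels with noise levels 'noise', so that each active
    channel is filled up to a common water level mu:

        p_i = max(0, mu - noise_i),  with  sum(p_i) == budget.
    """
    # Sort noise levels ascending; the quietest channels fill first.
    levels = sorted(noise)
    n = len(levels)
    mu = 0.0
    for k in range(1, n + 1):
        # Candidate water level if exactly the k quietest channels are active.
        mu = (budget + sum(levels[:k])) / k
        # Valid if the next channel's noise already exceeds the water level
        # (i.e., it would receive zero power anyway).
        if k == n or mu <= levels[k]:
            break
    return [max(0.0, mu - v) for v in noise]


# Example: budget 3.0 over channels with noise 1.0, 2.0, 3.0.
# The water level settles at mu = 3.0, so the noisiest channel gets nothing.
alloc = water_filling([1.0, 2.0, 3.0], 3.0)
# alloc == [2.0, 1.0, 0.0]
```

In the paper's setting the "channels" correspond to bit positions within a word, whose MSE contributions differ in importance, and the allocation is alternated within a biconvex optimization rather than solved once.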
URI
http://hdl.handle.net/20.500.11750/17287
DOI
10.1109/TCOMM.2022.3190868
Publisher
Institute of Electrical and Electronics Engineers
Files in This Item:

There are no files associated with this item.

Appears in Collections:
Department of Electrical Engineering and Computer Science Information, Computing, and Intelligence Laboratory 1. Journal Articles


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
