Abstract:
Recently, memory-augmented neural networks (MANNs) have gained significant attention as a critical solution for few-shot learning (FSL). These networks leverage external memory to store prior knowledge, thereby enhancing classification efficiency. Spin-transfer torque magnetic random access memory (STT-MRAM) is particularly suited for this application due to its compact cell size, excellent data retention, and scalability. In this article, we introduce an STT-MRAM-based near-memory computing (NMC) macro specifically designed for MANNs. Our approach incorporates several key innovations aimed at overcoming challenges in hardware implementation while improving MANN performance, as follows: 1) a parallel computing architecture within the NMC to expedite L1 distance computations; 2) a memory invert coding (MIC) and self-termination write (STW) scheme that reduces write operations and energy consumption, addressing the frequent writes and high write currents during the training phase of MANNs; 3) a dynamic offset-compensation sense amplifier (DOC-SA) and high-throughput switched-capacitor (HTSC) readout scheme to improve read accuracy and throughput, tackling low read margins and limited readout bandwidth; and 4) an exploration of MANN architectures that validates the reusability of the NMC macro. The optimized matching-network (MCHnet)-based structure achieves an accuracy exceeding 90% in five-way and eight-way Omniglot classification tasks. Fabricated in a 40-nm CMOS technology, our design achieves classification accuracies of 96.37% for eight-way-five-shot tasks and 93.72% for 16-way-five-shot tasks on the Omniglot dataset using the optimized MCHnet, with an energy efficiency of 6.47 TOPS/W based on 16-bit L1 distance computing in MANN classification tasks.
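The macro itself is custom silicon, but as a rough behavioral analogue the following Python sketch illustrates two of the ideas named in the abstract: an L1-distance nearest-support lookup used for matching-network-style few-shot classification, and a bus-invert-style write coding in the spirit of MIC that stores complemented data when more than half of the bits would otherwise flip. The helper names, the 64-dimensional feature size, and the 16-bit word width are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def l1_distance(query, keys):
    """L1 (Manhattan) distance between one query vector and every stored key.

    query: (D,) feature vector; keys: (N, D) external-memory contents.
    Returns an (N,) vector of distances.
    """
    return np.abs(keys - query).sum(axis=1)

def classify_few_shot(query, support_keys, support_labels):
    """Assign the query the label of the nearest support entry under L1 distance,
    mimicking a matching-network-style external-memory lookup."""
    return support_labels[int(np.argmin(l1_distance(query, support_keys)))]

def invert_code_write(stored_word, new_word, width=16):
    """Invert-coding sketch: if overwriting stored_word with new_word would flip
    more than half of the bits, write the bitwise complement plus a flag bit
    instead, so fewer memory cells have to be toggled."""
    mask = (1 << width) - 1
    flips = bin((stored_word ^ new_word) & mask).count("1")
    if flips > width // 2:
        return (~new_word) & mask, 1   # store inverted data, flag = 1
    return new_word & mask, 0          # store data as-is, flag = 0

# Example: 5-way-1-shot lookup with random 64-dim features (illustrative only).
rng = np.random.default_rng(0)
support = rng.integers(0, 16, size=(5, 64))        # one stored key per class
labels = np.arange(5)
probe = support[2] + rng.integers(-1, 2, size=64)  # a noisy copy of class 2
print(classify_few_shot(probe, support, labels))   # expected: 2
```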
Source:
IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS
ISSN: 1063-8210
Year: 2025
Impact Factor: 2.800 (JCR@2023)