SEMS: Scalable Embedding Memory System for Accelerating Embedding-Based DNNs
- Title
- SEMS: Scalable Embedding Memory System for Accelerating Embedding-Based DNNs
- Issued Date
- 2022-07
- Citation
- Kim, Sejin. (2022-07). SEMS: Scalable Embedding Memory System for Accelerating Embedding-Based DNNs. IEEE Computer Architecture Letters, 21(2), 157–160. doi: 10.1109/LCA.2022.3227560
- Type
- Article
- Author Keywords
- DNN accelerators; embeddings; recommender systems; system for machine learning
- ISSN
- 1556-6056
- Abstract
- Embedding layers, which are widely used in various deep learning (DL) applications, are very large and continue to grow in size. We propose the scalable embedding memory system (SEMS) to handle inference for DL applications with a large embedding layer. SEMS is built from scalable embedding memory (SEM) modules, each of which includes an FPGA for acceleration. In SEMS, the scalable and versatile PCIe bus is used to expand system memory, and processing within the SEMs reduces the amount of data transferred from the SEMs to the host, improving the effective PCIe bandwidth. To achieve better performance, we apply optimization techniques at multiple levels. We also develop SEMlib, a Python library that makes SEMS convenient to use. We implement a proof-of-concept prototype of SEMS; it executes DLRM 32.85x faster than a CPU-based system when there is insufficient DRAM to hold the entire embedding layer. © 2022 IEEE.
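The abstract's effective-bandwidth argument can be illustrated with a short back-of-the-envelope sketch: if embedding rows are gathered and pooled inside a SEM module, only one pooled vector per query crosses PCIe instead of every looked-up row. The snippet below is a hypothetical illustration only; the dimensions and function names are assumptions and do not reflect SEMlib's actual interface.

    # Minimal sketch (not the paper's SEMlib API): why pooling embedding lookups
    # inside a SEM module before crossing PCIe reduces data sent to the host.
    EMB_DIM = 64             # embedding vector width in floats (illustrative value)
    LOOKUPS_PER_QUERY = 80   # sparse-feature lookups pooled into one output vector
    BYTES_PER_FLOAT = 4

    def host_side_gather_bytes(num_queries: int) -> int:
        # Baseline: every looked-up row is shipped over PCIe, then pooled on the host.
        return num_queries * LOOKUPS_PER_QUERY * EMB_DIM * BYTES_PER_FLOAT

    def sem_side_pooling_bytes(num_queries: int) -> int:
        # Near-memory processing: rows are gathered and reduced (e.g. summed) on the
        # module, so only one pooled vector per query crosses PCIe.
        return num_queries * EMB_DIM * BYTES_PER_FLOAT

    if __name__ == "__main__":
        queries = 4096
        baseline = host_side_gather_bytes(queries)
        pooled = sem_side_pooling_bytes(queries)
        print(f"baseline PCIe traffic: {baseline / 2**20:.1f} MiB")
        print(f"SEM-side pooling:      {pooled / 2**20:.1f} MiB")
        print(f"reduction factor:      {baseline / pooled:.0f}x")

Under these assumed parameters, device-side pooling cuts PCIe traffic by the number of lookups per query, which is the same effect the paper attributes to processing within the SEMs.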
- Publisher
- Institute of Electrical and Electronics Engineers
Related Researcher
- Kung, Jaeha (궁재하)
- Department of Electrical Engineering and Computer Science
