OPAL: Outlier-Preserved Microscaling Quantization Accelerator for Generative Large Language Models
Title
OPAL: Outlier-Preserved Microscaling Quantization Accelerator for Generative Large Language Models
Issued Date
2024-06-25
Citation
Koo, Jahyun. (2024-06-25). OPAL: Outlier-Preserved Microscaling Quantization Accelerator for Generative Large Language Models. Design Automation Conference, 1–6. doi: 10.1145/3649329.3657323
Type
Conference Paper
ISBN
9798400706011
ISSN
0738-100X
Abstract
To overcome the memory size and bandwidth burden caused by the ever-increasing size of large language models (LLMs), aggressive weight quantization has recently been studied, while research on quantizing activations remains scarce. In this paper, we present a hardware-software co-design method that results in an energy-efficient LLM accelerator, named OPAL, for generation tasks. First, we propose a novel activation quantization method that leverages the microscaling data format while preserving several outliers per subtensor block (e.g., four out of 128 elements). Second, on top of preserving outliers, we employ mixed precision, using 5 bits for inputs to sensitive layers in the decoder block of an LLM while keeping inputs to less sensitive layers at 3 bits. Finally, we present the OPAL hardware architecture, which consists of FP units for handling outliers and vectorized INT multipliers for the dominant non-outlier operations. In addition, OPAL uses a log2-based approximation of the softmax operation that requires only shifts and subtractions to maximize power efficiency. As a result, we improve energy efficiency by 1.6∼2.2× and reduce area by 2.4∼3.1× with negligible accuracy loss, i.e., <1 perplexity increase. © 2024 Institute of Electrical and Electronics Engineers Inc. All rights reserved.
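
The outlier-preserved microscaling idea described in the abstract can be sketched in a few lines of Python. The snippet below is a minimal, hypothetical illustration rather than the paper's implementation: within each 128-element block, the four largest-magnitude values are kept in floating point, while the remaining elements are quantized to low-bit signed integers that share one power-of-two block scale. The block size and bit widths follow the abstract's example; the function names, rounding, and scale selection are assumptions made for illustration only.

import numpy as np

def quantize_block(block, n_outliers=4, bits=3):
    """Quantize one subtensor block: keep the largest-magnitude elements in
    floating point and quantize the rest to signed integers sharing one
    power-of-two scale (microscaling-style). Illustrative sketch only."""
    block = np.asarray(block, dtype=np.float32)
    # Indices of the outliers to preserve (largest magnitudes).
    outlier_idx = np.argsort(np.abs(block))[-n_outliers:]
    outlier_vals = block[outlier_idx].copy()

    # Remaining (non-outlier) elements share one power-of-two scale.
    inlier = block.copy()
    inlier[outlier_idx] = 0.0
    qmax = 2 ** (bits - 1) - 1                       # e.g. 3 for 3-bit signed
    max_abs = np.max(np.abs(inlier)) + 1e-12
    scale = 2.0 ** np.ceil(np.log2(max_abs / qmax))  # power-of-two block scale
    q = np.clip(np.round(inlier / scale), -qmax - 1, qmax).astype(np.int8)

    return q, scale, outlier_idx, outlier_vals

def dequantize_block(q, scale, outlier_idx, outlier_vals):
    """Reconstruct the block: integer part times scale, outliers restored."""
    out = q.astype(np.float32) * scale
    out[outlier_idx] = outlier_vals
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=128).astype(np.float32)
    x[[7, 42]] *= 20.0                               # inject a couple of outliers
    q, s, idx, vals = quantize_block(x, n_outliers=4, bits=3)
    x_hat = dequantize_block(q, s, idx, vals)
    print("max abs error:", np.max(np.abs(x - x_hat)))

In this sketch the preserved outliers would map to the FP units mentioned in the abstract, while the shared-scale integer part maps to the vectorized INT multipliers; the log2-based softmax approximation is not shown.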
URI
http://hdl.handle.net/20.500.11750/57857
DOI
10.1145/3649329.3657323
Publisher
Institute of Electrical and Electronics Engineers Inc.

File Downloads

  • There are no files associated with this item.
