DeepPM: Predicting Performance and Energy Consumption of Program Binaries Using Transformers
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Shim, Jun S. | - |
| dc.contributor.author | Chang, Hyeonji | - |
| dc.contributor.author | Kim, Yeseong | - |
| dc.contributor.author | Kim, Jihong | - |
| dc.date.accessioned | 2026-02-10T10:10:12Z | - |
| dc.date.available | 2026-02-10T10:10:12Z | - |
| dc.date.created | 2025-11-13 | - |
| dc.date.issued | 2025-11 | - |
| dc.identifier.issn | 1084-4309 | - |
| dc.identifier.uri | https://scholar.dgist.ac.kr/handle/20.500.11750/60010 | - |
| dc.description.abstract | Accurate estimation of performance and energy consumption is critical for optimizing application efficiency on diverse hardware platforms. Traditional methods often rely on profiling and measurements, requiring at least one execution, making them time-consuming and resource-intensive. This article introduces the Deep Power Meter (DeepPM) framework, leveraging deep learning, specifically the Transformer architecture, to predict performance and energy consumption of basic blocks directly from compiled binaries, eliminating the need for explicit measurement processes. The DeepPM model effectively learns the performance and energy consumption of basic blocks, enabling accurate predictions for each. Furthermore, the framework enhances applicability across different ISAs and microarchitectures, addressing limitations of state-of-the-art ML-based techniques restricted to specific processor architectures. Experimental results using the SPEC CPU 2017 benchmark suite show that DeepPM achieves significantly lower prediction errors compared to state-of-the-art ML-based techniques, with a 24% improvement in performance and an 18% improvement in energy consumption for x86 basic blocks, and similar gains for ARM processors. Fine-tuning with minimal data from the Phoronix Test Suite further validates DeepPM’s robustness, achieving an error of approximately 13.7%, close to the fully trained model’s 13.3% error. These findings demonstrate DeepPM’s ability to enhance the accuracy and efficiency of performance and energy consumption predictions, making it a valuable tool for optimizing computing systems across diverse hardware environments. © 2025 Elsevier B.V., All rights reserved. | - |
| dc.language | English | - |
| dc.publisher | Association for Computing Machinery | - |
| dc.title | DeepPM: Predicting Performance and Energy Consumption of Program Binaries Using Transformers | - |
| dc.type | Article | - |
| dc.identifier.doi | 10.1145/3725887 | - |
| dc.identifier.wosid | 001616616200019 | - |
| dc.identifier.scopusid | 2-s2.0-105020569979 | - |
| dc.identifier.bibliographicCitation | ACM Transactions on Design Automation of Electronic Systems, v.30, no.6 | - |
| dc.description.isOpenAccess | TRUE | - |
| dc.subject.keywordAuthor | deep learning | - |
| dc.subject.keywordAuthor | energy consumption estimation | - |
| dc.subject.keywordAuthor | Performance estimation | - |
| dc.subject.keywordAuthor | transformer | - |
| dc.subject.keywordAuthor | basic block | - |
| dc.citation.number | 6 | - |
| dc.citation.title | ACM Transactions on Design Automation of Electronic Systems | - |
| dc.citation.volume | 30 | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Computer Science | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Hardware & Architecture; Computer Science, Software Engineering | - |
| dc.type.docType | Article | - |
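As the abstract describes, DeepPM feeds tokenized basic blocks from compiled binaries into a Transformer that regresses performance and energy. The sketch below is a deliberately minimal, untrained illustration of that pipeline shape in pure Python (tokenize assembly, embed tokens, apply one self-attention layer, pool, and regress); the tokenizer, embedding, and head here are hypothetical stand-ins, not DeepPM's actual vocabulary, architecture, or learned weights.

```python
import math

# Hypothetical tokenizer: split an x86 basic block into opcode/operand
# tokens. DeepPM's real instruction encoding is not reproduced here.
def tokenize(block):
    tokens = []
    for line in block.strip().splitlines():
        tokens.extend(line.replace(",", " ").split())
    return tokens

# Toy deterministic embedding (stand-in for a learned embedding table):
# map each token to a small dense vector with values in [0, 1).
def embed(token, dim=8):
    h = sum(ord(c) * (i + 1) for i, c in enumerate(token))
    return [((h * (j + 3)) % 101) / 100.0 for j in range(dim)]

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

# Single-head scaled dot-product self-attention with identity Q/K/V
# projections, kept minimal for clarity.
def self_attention(X):
    d = len(X[0])
    out = []
    for q in X:
        scores = softmax(
            [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in X]
        )
        out.append([sum(w * v[j] for w, v in zip(scores, X)) for j in range(d)])
    return out

# End-to-end sketch: attend over token embeddings, mean-pool, and apply an
# untrained linear head. Outputs are placeholders, not real cycle counts.
def predict_cycles(block):
    X = [embed(t) for t in tokenize(block)]
    H = self_attention(X)
    pooled = [sum(h[j] for h in H) / len(H) for j in range(len(H[0]))]
    weights = [0.5] * len(pooled)  # hypothetical regression weights
    return sum(w * p for w, p in zip(weights, pooled))

block = """
mov rax, rbx
add rax, 4
imul rcx, rax
"""
pred = predict_cycles(block)
print(round(pred, 4))
```

In the paper's actual setting, the embedding and attention weights would be trained against measured per-basic-block latency and energy (e.g., from SPEC CPU 2017 runs), and separate models or fine-tuning would cover different ISAs such as x86 and ARM.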
Related Researcher
- Kim, Yeseong (김예성), Department of Electrical Engineering and Computer Science
