
Subset-Aware Dual-Teacher Knowledge Distillation with Hybrid Scoring for Human Activity Recognition

DC Field	Value
dc.contributor.author	Park, Young-Jin
dc.contributor.author	Cho, Hui-Sup
dc.date.accessioned	2025-11-12T17:07:52Z
dc.date.available	2025-11-12T17:07:52Z
dc.date.created	2025-10-22
dc.date.issued	2025-10
dc.identifier.issn	2079-9292
dc.identifier.uri	https://scholar.dgist.ac.kr/handle/20.500.11750/59159
dc.description.abstract	Human Activity Recognition (HAR) is a key technology with applications in healthcare, security, smart environments, and sports analytics. Despite advances in deep learning, challenges remain in building models that are both efficient and generalizable due to the large scale and variability of video data. To address these issues, we propose a novel Dual-Teacher Knowledge Distillation (DTKD) framework tailored for HAR. The framework introduces three main contributions. First, we define static and dynamic activity classes in an objective and reproducible manner using optical-flow-based indicators, establishing a quantitative classification scheme based on motion characteristics. Second, we develop subset-specialized teacher models and design a hybrid scoring mechanism that combines teacher confidence with cross-entropy loss. This enables dynamic weighting of teacher contributions, allowing the student to adaptively balance knowledge transfer across heterogeneous activities. Third, we provide a comprehensive evaluation on the UCF101 and HMDB51 benchmarks. Experimental results show that DTKD consistently outperforms baseline models and achieves balanced improvements across both static and dynamic subsets. These findings validate the effectiveness of combining subset-aware teacher specialization with hybrid scoring. The proposed approach improves recognition accuracy and robustness, offering practical value for real-world HAR applications such as driver monitoring, healthcare, and surveillance.
dc.language	English
dc.publisher	MDPI
dc.title	Subset-Aware Dual-Teacher Knowledge Distillation with Hybrid Scoring for Human Activity Recognition
dc.type	Article
dc.identifier.doi	10.3390/electronics14204130
dc.identifier.wosid	001601430100001
dc.identifier.scopusid	2-s2.0-105020243642
dc.identifier.bibliographicCitation	Electronics, v.14, no.20
dc.description.isOpenAccess	TRUE
dc.subject.keywordAuthor	deep learning applications
dc.subject.keywordAuthor	Human Activity Recognition (HAR)
dc.subject.keywordAuthor	knowledge distillation
dc.citation.number	20
dc.citation.title	Electronics
dc.citation.volume	14
dc.description.journalRegisteredClass	scie
dc.description.journalRegisteredClass	scopus
dc.relation.journalResearchArea	Computer Science; Engineering; Physics
dc.relation.journalWebOfScienceCategory	Computer Science, Information Systems; Engineering, Electrical & Electronic; Physics, Applied
dc.type.docType	Article
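The abstract describes a hybrid scoring mechanism that combines each teacher's confidence with its cross-entropy loss so that the contributions of the two subset-specialized teachers can be weighted dynamically during distillation. As a rough illustration only, the PyTorch sketch below shows one way such a scheme could be wired up; the specific score (confidence divided by per-sample cross-entropy), the temperature T, and the mixing weight alpha are assumptions made for this sketch, not the formulation published in the paper.

# Illustrative sketch of dual-teacher distillation with a hybrid
# (confidence + cross-entropy) teacher-weighting score. The exact
# score definition, temperature, and loss weights are assumptions.
import torch
import torch.nn.functional as F

def hybrid_teacher_weights(teacher_logits, labels, eps=1e-8):
    """Per-sample weights for each teacher from confidence and CE loss.

    teacher_logits: list of [B, C] tensors, one per teacher.
    labels: [B] ground-truth class indices.
    Returns: [T, B] weights that sum to 1 over the teacher axis.
    """
    scores = []
    for logits in teacher_logits:
        probs = F.softmax(logits, dim=1)
        confidence = probs.max(dim=1).values                      # higher is better
        ce = F.cross_entropy(logits, labels, reduction="none")    # lower is better
        # Assumed hybrid score: reward confidence, penalise CE loss.
        scores.append(confidence / (ce + eps))
    scores = torch.stack(scores, dim=0)                           # [T, B]
    return scores / scores.sum(dim=0, keepdim=True)

def dtkd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hard-label CE plus KL divergence to the weighted teacher mixture."""
    weights = hybrid_teacher_weights(teacher_logits, labels)      # [T, B]
    soft_targets = torch.zeros_like(student_logits)
    for w, logits in zip(weights, teacher_logits):
        soft_targets += w.unsqueeze(1) * F.softmax(logits / T, dim=1)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  soft_targets, reduction="batchmean") * T * T
    return alpha * F.cross_entropy(student_logits, labels) + (1 - alpha) * kd

In such a setup, the two teacher networks would first be trained on the static and dynamic activity subsets respectively and then frozen, and the student would be trained over mini-batches with dtkd_loss applied to the two teachers' logits.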
Related Researcher

Cho, Hui-Sup (조희섭)

Division of AI, Big data and Block chain
