Browsing by Titles
Showing results 1 to 2 of 2
1. [Article] Improving Adversarial Robustness via Distillation-Based Purification
   Author(s): Koo, Inhwa; Chae, Dong-Kyu; Lee, Sang-Chul
   Issue Date: 2023-10
   Citation: Koo, Inhwa. (2023-10). Improving Adversarial Robustness via Distillation-Based Purification. Applied Sciences, 13(20). doi: 10.3390/app132011313
   Publisher: MDPI
   View: 136 | Download: 39

2. [Article] Subset-Aware Dual-Teacher Knowledge Distillation with Hybrid Scoring for Human Activity Recognition
   Author(s): Park, Young-Jin; Cho, Hui-Sup
   Issue Date: 2025-10
   Journal: Electronics, v.14, no.20
   Publisher: MDPI
   View: 49 | Download: 2