Detail View

Tweaking Deep Neural Networks

Title
Tweaking Deep Neural Networks
Issued Date
2022-09
Citation
Kim, Jinwook. (2022-09). Tweaking Deep Neural Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(9), 1–1. doi: 10.1109/TPAMI.2021.3079511
Type
Article
Author Keywords
Artificial neural networks; Arrays; Biological neural networks; Deep neural networks; Neurons; Synapses; synaptic join; Training; Training data
Keywords
Overall accuracies; Specific class; Training data; Multilayer neural networks; Deep neural networks; Hidden layers; Hidden neurons; Learning process; OR applications; Output neurons
ISSN
0162-8828
Abstract
Deep neural networks are trained to achieve a kind of maximum overall accuracy through a learning process on given training data. It is therefore difficult to adjust them afterwards to improve the accuracies of specific problematic classes, or of classes of interest that may be valuable to particular users or applications. To address this issue, we propose the synaptic join method, which tweaks a trained network by adding synapses that cross layers, running from intermediate hidden layers to the output layer, and then training only these synapses, if necessary. To select the most effective synapses, the synaptic join method evaluates the goodness of all candidate synapses between hidden neurons and output neurons based on the distribution of all possible proper weights. The experimental results show that the proposed method can effectively improve the accuracies of specific classes in a controllable way. (CC BY)
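
The synaptic join idea can be pictured as adding trainable cross-layer connections from hidden neurons to the output neurons of an already-trained network, then updating only those added weights. The sketch below is a minimal PyTorch illustration of that general idea only; the class name, architecture, zero initialization, and training loop are assumptions for illustration, and it omits the paper's selection step that scores candidate synapses from the distribution of possible weights.

```python
import torch
import torch.nn as nn

class TweakedMLP(nn.Module):
    """Illustrative sketch (not the paper's exact architecture): a small MLP
    plus an extra bank of cross-layer synapses from the first hidden layer
    directly to the output layer."""

    def __init__(self, in_dim=784, hidden_dim=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, out_dim)
        # Added synapses: hidden layer 1 -> output layer, initialized to zero
        # so the behaviour of the original trained network is preserved
        # until the tweaking phase trains them.
        self.tweak = nn.Linear(hidden_dim, out_dim, bias=False)
        nn.init.zeros_(self.tweak.weight)

    def forward(self, x):
        h1 = torch.relu(self.fc1(x))
        h2 = torch.relu(self.fc2(h1))
        # Original output path plus the added cross-layer path.
        return self.out(h2) + self.tweak(h1)


def tweak(model, loader, epochs=1, lr=1e-3):
    """Freeze all original weights and train only the added synapses,
    e.g. on data emphasizing the problematic classes of interest."""
    for name, p in model.named_parameters():
        p.requires_grad = name.startswith("tweak")
    opt = torch.optim.SGD([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
```

Because only the added synapses receive gradients, the overall behaviour on the remaining classes is perturbed far less than full fine-tuning would, which is the sense in which the accuracy of specific classes can be adjusted in a controllable way.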
URI
http://hdl.handle.net/20.500.11750/15461
DOI
10.1109/TPAMI.2021.3079511
Publisher
IEEE Computer Society

File Downloads

  • There are no files associated with this item.
