
TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search

Author(s)
Lim, Heechul; Kim, Min-Soo
Issued Date
2022-08
Citation
IEEE Access, v.10, pp. 84790-84798
Type
Article
Author Keywords
Aerospace electronics; Computational modeling; Computer architecture; Convolution; convolutional neural network; deep learning; Neural architecture search; Search problems; Topology; Training
ISSN
2169-3536
Abstract
There is growing interest in automating the design of good neural network architectures. Recently proposed NAS methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem. Existing operation-level architecture search methods require either a large amount of computing power or a very carefully designed search space of operations. In this paper, we investigate whether competitive performance can be achieved with only a small amount of computing power and without careful search space design. We propose TENAS, which uses Taylor expansion and only a fixed type of operation. The resulting architecture is sparse in terms of channels and has a different topology at each cell. Experimental results on CIFAR-10 and ImageNet show that a fine-grained, sparse model searched by TENAS achieves very competitive performance compared with the dense models searched by existing methods.
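As an illustration of the channel-level scoring the abstract refers to, the sketch below computes a first-order Taylor-expansion importance score for the output channels of a single convolution, in the style commonly used in channel pruning and channel-sparse NAS. The function name taylor_channel_importance and the aggregation choices are assumptions for illustration only; this is a minimal sketch, not necessarily the exact criterion used in TENAS.

```python
import torch
import torch.nn as nn

# Minimal sketch (not the exact TENAS procedure): first-order Taylor-expansion
# importance scoring for channels. The importance of channel c approximates the
# change in the loss if that channel's activation a_c were removed,
# i.e. |dL/da_c * a_c|, aggregated over the batch and spatial positions.

def taylor_channel_importance(activation: torch.Tensor, grad: torch.Tensor) -> torch.Tensor:
    """Return a per-channel importance score for one layer.

    activation, grad: tensors of shape (N, C, H, W) holding the layer's output
    and the gradient of the loss with respect to that output.
    """
    contribution = (activation * grad).abs()      # first-order Taylor term per element
    return contribution.sum(dim=(0, 2, 3))        # aggregate to shape (C,)


if __name__ == "__main__":
    torch.manual_seed(0)
    conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # a single fixed operation type
    x = torch.randn(4, 3, 32, 32)

    out = conv(x)
    out.retain_grad()                # keep the gradient of the intermediate activation
    loss = out.pow(2).mean()         # stand-in for a task loss
    loss.backward()

    scores = taylor_channel_importance(out.detach(), out.grad)
    # Channels with the lowest scores are candidates to drop, which yields a
    # channel-sparse architecture whose topology can differ from cell to cell.
    print(scores)
```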
URI
http://hdl.handle.net/20.500.11750/17096
DOI
10.1109/ACCESS.2022.3195208
Publisher
Institute of Electrical and Electronics Engineers Inc.
Files in This Item:

There are no files associated with this item.

Appears in Collections:
Department of Electrical Engineering and Computer Science > InfoLab > 1. Journal Articles


