TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search

Title
TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search
Issued Date
2022-08
Citation
Lim, Heechul. (2022-08). TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search. IEEE Access, 10, 84790–84798. doi: 10.1109/ACCESS.2022.3195208
Type
Article
Author Keywords
Aerospace electronics; Computational modeling; Computer architecture; Convolution; Convolutional neural network; Deep learning; Neural architecture search; Search problems; Topology; Training
ISSN
2169-3536
Abstract
There is growing interest in automating the design of good neural network architectures. Recently proposed NAS methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem. Existing operation-level architecture search methods require either a large amount of computing power or a very carefully designed search space of operations. In this paper, we investigate the possibility of achieving performance competitive with these methods while using only a small amount of computing power and without carefully designing the search space. We propose TENAS, which uses Taylor expansion and only a fixed type of operation. The resulting architecture is sparse in terms of channels and has a different topology at each cell. Experimental results on CIFAR-10 and ImageNet show that the fine-grained, sparse model searched by TENAS achieves performance very competitive with the dense models searched by existing methods.
URI
http://hdl.handle.net/20.500.11750/17096
DOI
10.1109/ACCESS.2022.3195208
Publisher
Institute of Electrical and Electronics Engineers Inc.
File Downloads

  • There are no files associated with this item.
