TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search
- Title
- TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search
- Issued Date
- 2022-08
- Citation
- Lim, Heechul. (2022-08). TENAS: Using Taylor Expansion and Channel-level Skip Connection for Neural Architecture Search. IEEE Access, 10, 84790–84798. doi: 10.1109/ACCESS.2022.3195208
- Type
- Article
- Author Keywords
- Aerospace electronics ; Computational modeling ; Computer architecture ; Convolution ; convolutional neural network ; deep learning ; Neural architecture search ; Search problems ; Topology ; Training
- ISSN
- 2169-3536
- Abstract
- There is growing interest in automating the design of good neural network architectures. Recently proposed NAS methods have significantly reduced the architecture search cost by sharing parameters, but designing the search space remains a challenging problem. Existing operation-level architecture search methods require either a large amount of computing power or a very carefully designed search space of operations. In this paper, we investigate the possibility of achieving performance competitive with these methods using only a small amount of computing power and without carefully designing the search space. We propose TENAS, which uses Taylor expansion and only a fixed type of operation. The resulting architecture is sparse at the channel level and has a different topology at different cells. The experimental results for CIFAR-10 and ImageNet show that the fine-granular, sparse model searched by TENAS achieves very competitive performance against the dense models searched by existing methods. (One possible reading of the Taylor criterion is sketched after the metadata list below.)
- Publisher
- Institute of Electrical and Electronics Engineers Inc.
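
The abstract does not detail how the Taylor expansion is applied. A common reading, borrowed from the channel-pruning literature, is a first-order Taylor criterion that scores each channel by the magnitude of its activation multiplied by its gradient. The PyTorch sketch below illustrates that criterion under this assumption; the function taylor_channel_scores and the toy model are hypothetical illustrations, not the paper's implementation.

```python
# Minimal sketch (assumption, not the paper's code): first-order Taylor
# channel-importance scoring, one plausible reading of the "Taylor
# expansion" mentioned in the abstract. All names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

def taylor_channel_scores(activation: torch.Tensor, grad: torch.Tensor) -> torch.Tensor:
    """Score channels by |sum(activation * gradient)| over the spatial dims,
    averaged over the batch: the magnitude of the first-order Taylor term
    for removing each channel."""
    contrib = (activation * grad).sum(dim=(2, 3))  # (batch, channels)
    return contrib.abs().mean(dim=0)               # (channels,)

# Toy model: score the channels of the first conv layer.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 32 * 32, 10),
)
captured = {}
def save_output(module, inputs, output):
    output.retain_grad()   # keep the grad of this non-leaf tensor
    captured["act"] = output
model[0].register_forward_hook(save_output)

x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
loss = F.cross_entropy(model(x), y)
loss.backward()

scores = taylor_channel_scores(captured["act"].detach(), captured["act"].grad)
print(scores)  # one importance score per channel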
File Downloads
- There are no files associated with this item.
