PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework
Hu, Yiming (1,2); Wang, Xingang (1); Gu, Qingyi (1)
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Date | 2022-03-22
Pages | 14
Keywords | Computer architecture; Training; Optimization; Extraterrestrial measurements; Estimation; Computational modeling; Search problems; Metric; neural architecture search (NAS); search space shrinking; weight sharing
ISSN | 2162-237X
DOI | 10.1109/TNNLS.2022.3156373 |
Corresponding Author | Gu, Qingyi (qingyi.gu@ia.ac.cn)
Abstract | Neural architecture search (NAS) depends heavily on an efficient and accurate performance estimator. To speed up the evaluation process, recent advances such as differentiable architecture search (DARTS) and one-shot approaches, instead of training every model from scratch, train a weight-sharing super-network that reuses parameters among different candidates, so all child models can be evaluated efficiently. Though these methods significantly boost search efficiency, they inherently suffer from inaccurate and unstable performance estimation. To this end, we propose a general and effective framework for powering weight-sharing NAS, namely PWSNAS, which shrinks the search space automatically, i.e., candidate operators are discarded if they are less important. With this strategy, our approach provides a promising search space of smaller size by progressively simplifying the original search space, making it easier for existing NAS methods to find superior architectures. In particular, we present two strategies to guide the shrinking process: detecting redundant operators with a new angle-based metric, and decreasing the degree of weight sharing in the super-network by increasing its parameters, which differentiates PWSNAS from existing shrinking methods. Comprehensive analysis experiments on NAS-Bench-201 verify the superiority of the proposed metric over existing accuracy-based and magnitude-based metrics. PWSNAS can be easily applied to state-of-the-art NAS methods, e.g., single-path one-shot neural architecture search (SPOS), FairNAS, ProxylessNAS, DARTS, and progressive DARTS (PDARTS). We evaluate PWSNAS and demonstrate consistent performance gains over the baseline methods.
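The angle-based metric mentioned in the abstract can be illustrated concretely. The sketch below is a hypothetical Python illustration, not the authors' implementation: it scores a candidate operator by the angle between its super-network weights at initialization and after training, and treats a small angle (shared weights that barely moved) as a sign the operator is a candidate for discarding during shrinking. The function name `angle_score` and the weight-flattening scheme are assumptions made for this sketch.

```python
import numpy as np

def angle_score(init_weights, trained_weights):
    """Angle (radians) between an operator's flattened weight vector at
    super-network initialization and after super-network training.

    Hypothetical sketch of an angle-based importance metric, not the
    paper's implementation: a small angle is read here as the shared
    weights barely moving, marking the operator as less important.
    """
    # Flatten all of the operator's weight tensors into one vector.
    w0 = np.concatenate([np.ravel(w) for w in init_weights])
    wt = np.concatenate([np.ravel(w) for w in trained_weights])
    cos = np.dot(w0, wt) / (np.linalg.norm(w0) * np.linalg.norm(wt))
    # Clip to guard against floating-point drift outside [-1, 1].
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Toy usage: rank two candidate operators that share the same init.
rng = np.random.default_rng(0)
init = [rng.standard_normal((3, 3, 16))]
barely_moved = [init[0] + 0.05 * rng.standard_normal(init[0].shape)]
moved_far = [rng.standard_normal(init[0].shape)]
print(angle_score(init, barely_moved))  # small angle -> shrink candidate
print(angle_score(init, moved_far))     # large angle -> keep
```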
Funding Projects | National Key Research and Development Program of China [2018YFD0400902]; National Natural Science Foundation of China [61673376]; Scientific Instrument Developing Project of the Chinese Academy of Sciences [YJKYYQ20200045]
WOS Research Areas | Computer Science; Engineering
Language | English
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Record Number | WOS:000773231900001
Funding Agencies | National Key Research and Development Program of China; National Natural Science Foundation of China; Scientific Instrument Developing Project of the Chinese Academy of Sciences
Content Type | Journal Article
Source URL | http://ir.ia.ac.cn/handle/173211/48160
Collection | Research Center for Precision Sensing and Control (精密感知与控制研究中心_精密感知与控制)
Author Affiliations | 1. Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China; 2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100039, Peoples R China
Recommended Citation (GB/T 7714) | Hu, Yiming, Wang, Xingang, Gu, Qingyi. PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022: 14.
APA | Hu, Yiming, Wang, Xingang, & Gu, Qingyi. (2022). PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 14.
MLA | Hu, Yiming, et al. "PWSNAS: Powering Weight Sharing NAS With General Search Space Shrinking Framework". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022): 14.