One-Shot Neural Architecture Search: Maximising Diversity to Overcome Catastrophic Forgetting

Abstract

This work addresses the catastrophic forgetting problem in one-shot neural architecture search by treating supernet training as a constrained optimization problem. The proposed method uses a novelty search-based architecture selection approach to enhance diversity and boost performance, achieving competitive results on CIFAR-10 and ImageNet datasets.
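As a rough illustration of the novelty search idea described in the abstract, the sketch below selects which single-path architecture to train at each supernet step by preferring candidates that are far (in edit distance) from an archive of previously trained paths. All names (`hamming`, `novelty`, `select_diverse_paths`) and the pool-sampling scheme are illustrative assumptions, not the paper's actual algorithm.

```python
import random

def hamming(a, b):
    # Distance between two architecture encodings (lists of op indices per layer).
    return sum(x != y for x, y in zip(a, b))

def novelty(candidate, archive, k=3):
    # Novelty score = mean distance to the k nearest neighbours in the archive.
    # An empty archive makes every candidate maximally novel.
    if not archive:
        return float("inf")
    dists = sorted(hamming(candidate, past) for past in archive)
    return sum(dists[:k]) / min(k, len(dists))

def select_diverse_paths(num_layers, num_ops, steps, pool_size=8, seed=0):
    """Hypothetical training loop: at each step, sample a pool of random
    single paths from the supernet search space and pick the most novel one,
    so that successive training steps cover diverse subnetworks."""
    rng = random.Random(seed)
    archive = []
    for _ in range(steps):
        pool = [[rng.randrange(num_ops) for _ in range(num_layers)]
                for _ in range(pool_size)]
        best = max(pool, key=lambda c: novelty(c, archive))
        archive.append(best)  # in real one-shot NAS: train the supernet on `best`
    return archive
```

The intuition this sketch captures is that training diverse paths spreads gradient updates more evenly over the supernet, mitigating the catastrophic forgetting that occurs when recently trained subnetworks overwrite weights shared with earlier ones.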

Publication
IEEE Transactions on Pattern Analysis and Machine Intelligence
Xiaojun Chang
Professor/Director

My research interests include Artificial Intelligence, Machine Learning and Multimedia.