Neural Architecture Search of SPD Manifold Networks (Conference Paper)

abstract

  • In this paper, we propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks, aiming to automate the design of SPD neural architectures. To address this problem, we first introduce a geometrically rich and diverse SPD neural architecture search space for efficient SPD cell design. We then model our new NAS problem with a one-shot training process of a single supernet. Based on this supernet modeling, we apply a differentiable NAS algorithm to our relaxed continuous search space for SPD neural architecture search. Statistical evaluation of our method on drone, action, and emotion recognition tasks shows that it mostly outperforms state-of-the-art SPD networks and traditional NAS algorithms. Empirical results show that our algorithm excels at discovering better-performing SPD network designs and provides models that are more than three times lighter than those searched by state-of-the-art NAS algorithms.

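The differentiable search described in the abstract rests on a continuous relaxation of the discrete operation choice on each edge of the supernet cell. The sketch below illustrates that general idea in PyTorch with a DARTS-style softmax-weighted mixture of candidate operations. It is a minimal, assumed illustration only: plain Euclidean layers stand in as placeholders for the paper's actual SPD-manifold operations, and the names MixedOp, CANDIDATE_OPS, and alpha are hypothetical rather than taken from the authors' code.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Hypothetical candidate operations for one supernet edge. The real
    # SPD search space consists of SPD-manifold operations; ordinary
    # Euclidean layers are used here only to keep the sketch self-contained.
    CANDIDATE_OPS = [
        lambda dim: nn.Identity(),
        lambda dim: nn.Linear(dim, dim, bias=False),
        lambda dim: nn.Sequential(nn.Linear(dim, dim, bias=False), nn.ReLU()),
    ]

    class MixedOp(nn.Module):
        """One edge of the supernet: a softmax-weighted mixture of all
        candidate operations, i.e. the continuous relaxation of a
        discrete architectural choice."""
        def __init__(self, dim):
            super().__init__()
            self.ops = nn.ModuleList(make(dim) for make in CANDIDATE_OPS)
            # Architecture parameters (one logit per candidate op), learned
            # by gradient descent alongside the supernet weights; after the
            # search, the operation with the largest weight is retained.
            self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

        def forward(self, x):
            weights = F.softmax(self.alpha, dim=-1)
            return sum(w * op(x) for w, op in zip(weights, self.ops))

    # Minimal usage: one forward/backward pass produces gradients for both
    # the operation weights and the architecture parameters alpha.
    if __name__ == "__main__":
        edge = MixedOp(dim=16)
        x = torch.randn(4, 16)
        loss = edge(x).pow(2).mean()
        loss.backward()
        print(edge.alpha.grad)

In a full bilevel search, the supernet weights and the architecture parameters are typically updated on separate data splits, and the final architecture is obtained by keeping only the highest-weighted operation per edge.
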
name of conference

  • Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence

published proceedings

  • Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence

author list (cited authors)

  • Sukthanker, R. S., Huang, Z., Kumar, S., Goron Endsjo, E., Wu, Y., & Van Gool, L.

citation count

  • 4

complete list of authors

  • Sukthanker, Rhea Sanjay; Huang, Zhiwu; Kumar, Suryansh; Goron Endsjo, Erik; Wu, Yan; Van Gool, Luc

editor list (cited editors)

  • Zhou, Z.

publication date

  • August 2021