
DARTS: Differentiable Architecture Search

August 05, 2020


Hanxiao Liu, Karen Simonyan, Yiming Yang. DARTS: Differentiable Architecture Search. ICLR 2019.







  3. Introduction of Neural Architecture Search (1/2): What is Neural Architecture Search?

    • Neural network architectures and hyper-parameters are still hard to design by hand
    • Related work
      • Hyper-parameter optimization [Saxena+, NIPS16], [Bergstra+, JMLR12]: does not operate at the architecture level
      • Evolutionary algorithms [Stanley+, Artificial Life 09], [Floreano+, EI08]: less practical at large scale
    • We need a general framework that constructs architectures automatically!
  4. Introduction of Neural Architecture Search (2/2): Neural Architecture Search with Reinforcement Learning [Zoph+ ICLR17]

    • Proposed a reinforcement-learning-based method with an RNN controller
    • Not differentiable!
    • Very heavy computational cost!
  5. Proposed method (1/): Define Differentiable NAS

    • O: a set of candidate operations; α_o^(i,j): a weight for operation o on the edge between nodes i and j; o(x): apply operation o to x
    • Continuous relaxation: mix the candidate operations with a softmax over the architecture weights
      ō^(i,j)(x) = Σ_{o∈O} [ exp(α_o^(i,j)) / Σ_{o'∈O} exp(α_{o'}^(i,j)) ] · o(x)
    • Objective: a bilevel optimization problem
      min_α L_val(w*(α), α)  s.t.  w*(α) = argmin_w L_train(w, α)
    • Cannot compute w*(α) exactly…
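The continuous relaxation above can be sketched on scalars. This is a toy, with hypothetical stand-in operations; the real DARTS mixes convolutions, pooling, identity, and zero ops on feature maps:

```python
import math

def mixed_op(x, alphas, ops):
    """Softmax-weighted mixture of candidate operations (continuous relaxation)."""
    m = max(alphas)                                   # subtract max for numerical stability
    exps = [math.exp(a - m) for a in alphas]
    z = sum(exps)
    return sum((e / z) * op(x) for e, op in zip(exps, ops))

# Toy candidates (hypothetical stand-ins for identity / conv / zero):
ops = [lambda x: x, lambda x: 2 * x, lambda x: 0.0]
y = mixed_op(3.0, [0.0, 0.0, 0.0], ops)              # equal weights -> (3 + 6 + 0) / 3
```

With equal architecture weights the mixture is a plain average; as one α grows, the softmax concentrates on that operation, which is what makes the architecture choice differentiable.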
  6. Proposed method (2/): Define Differentiable NAS

    • Approximate w*(α) with a single training step: ∇_α L_val(w − ξ∇_w L_train(w, α), α)
    • Applying the chain rule gives
      ∇_α L_val(w', α) − ξ ∇²_{α,w} L_train(w, α) ∇_{w'} L_val(w', α),  where w' = w − ξ∇_w L_train(w, α)
    • The second term contains a Hessian matrix… so computation is heavy
    • If the hyper-parameter ξ = 0, the second term vanishes and no Hessian is needed! (classification accuracy discussed later)
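On toy scalar losses (hypothetical, not the paper's networks) the full gradient, Hessian term included, is small enough to write out by hand, which makes the ξ = 0 shortcut easy to see:

```python
def grad_alpha(w, alpha, xi):
    """d/dα of L_val(w - ξ∇_w L_train, α) for the toy losses
    L_train = 0.5*(w - α)^2 and L_val = 0.5*w'^2 + 0.5*α^2 (both hypothetical)."""
    w_prime = w - xi * (w - alpha)             # one inner SGD step on L_train
    dval_dw = w_prime                          # ∂L_val/∂w'
    dval_da = alpha                            # explicit ∂L_val/∂α
    d2_train = -1.0                            # ∇²_{α,w} L_train for this L_train
    return dval_da - xi * d2_train * dval_dw   # second term drops out when ξ = 0

first_order = grad_alpha(2.0, 1.0, xi=0.0)     # only the explicit ∂L_val/∂α survives
second_order = grad_alpha(2.0, 1.0, xi=0.5)    # includes the Hessian-vector term
```

Setting ξ = 0 is exactly the "first-order" variant in the experiments: cheaper per step, at some cost in accuracy.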
  7. Proposed method (3/): Define Differentiable NAS

    • Approximate the Hessian-vector product with a central finite difference:
      ∇²_{α,w} L_train(w, α) ∇_{w'} L_val(w', α) ≈ [∇_α L_train(w⁺, α) − ∇_α L_train(w⁻, α)] / (2ε),
      where w± = w ± ε ∇_{w'} L_val(w', α)
    • This reduces the complexity from O(|α||w|) to O(|α| + |w|)
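The finite-difference trick can be checked on the same toy training loss (hypothetical; in the paper w and α are high-dimensional and only two extra gradient evaluations are needed):

```python
def grad_train_alpha(w, alpha):
    """∇_α L_train for the toy loss L_train = 0.5*(w - α)^2 (hypothetical)."""
    return alpha - w

def hvp_finite_diff(w, alpha, v, eps=1e-2):
    """Central finite difference for the Hessian-vector product:
    ∇²_{α,w} L_train · v ≈ (∇_α L_train(w + ε·v, α) - ∇_α L_train(w - ε·v, α)) / (2ε)."""
    g_plus = grad_train_alpha(w + eps * v, alpha)
    g_minus = grad_train_alpha(w - eps * v, alpha)
    return (g_plus - g_minus) / (2 * eps)

approx = hvp_finite_diff(2.0, 1.0, v=3.0)   # exact product is (-1) * 3 = -3 here
```

Only two gradient evaluations in w are needed regardless of dimension, which is where the O(|α| + |w|) cost comes from.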
  8. Proposed method (4/): Algorithm • Alternately update w by descending ∇_w L_train and α by descending ∇_α L_val until convergence
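The alternation can be sketched with the first-order (ξ = 0) variant on toy scalar losses (hypothetical; the real algorithm runs SGD over network weights and architecture weights on separate data splits):

```python
def darts_alternate(w, alpha, steps=200, lr=0.1):
    """DARTS-style alternating optimization, first-order approximation (ξ = 0).
    Toy losses (hypothetical): L_train = 0.5*(w - α)^2, L_val = 0.5*(α - w)^2."""
    for _ in range(steps):
        w -= lr * (w - alpha)       # 1) descend ∇_w L_train(w, α) on training data
        alpha -= lr * (alpha - w)   # 2) descend ∇_α L_val(w, α) on validation data
    return w, alpha

w_final, a_final = darts_alternate(0.0, 2.0)   # w and α converge to a common point
```

Each loop iteration mirrors one round of the paper's algorithm: a weight step on the training loss, then an architecture step on the validation loss.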

  9. Experiments (1/): CIFAR-10 classification • "First-order" means the hyper-parameter ξ = 0

  10. Experiments (2/): ImageNet classification • The lowest search cost among the compared methods!

  11. Conclusion • Achieves differentiable NAS • Achieves the fastest time

    for training and searching • My implementation is available at https://github.com/UdonDa/DARTS_pytorch (CIFAR-10 only)