
20180215_MLKitchen7_YoheiKIKUTA

yoppe
February 15, 2018



Transcript

  1. What is NIPS? Neural Information Processing Systems (NIPS), which originally started as a meeting on neural networks, is one of the top conferences for theoretical ML. 2/23
  2. Trends
     - GAN
     - Theoretical understanding of DL optimization
     - New directions of DL
     - Interpretability of ML
     - Incomplete-information games
     - Deep reinforcement learning
     - Bayesian deep learning
     - …
     18/23
  3. GAN: convergence around equilibrium points
     Convergence analysis of GANs:
     - analysis near equilibrium points
     - ODE analysis of gradient flows
     - relation to the eigenvalues of the Jacobian
     - double back-prop. regularization
     19/23
     ref: https://arxiv.org/abs/1705.10461 https://arxiv.org/abs/1706.04156
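The eigenvalue picture on this slide can be sketched with a standard toy example (a bilinear min-max game, my illustration rather than the exact setup of the referenced papers): simultaneous gradient flow orbits the equilibrium forever when the Jacobian of the gradient vector field has purely imaginary eigenvalues, and a quadratic regularizer on the discriminator (mimicking the stabilizing effect of gradient-penalty-style terms) pushes the real parts negative so the flow spirals in.

```python
import numpy as np

# Toy bilinear game V(theta, psi) = theta * psi with equilibrium at (0, 0).
# Simultaneous gradient flow: d(theta)/dt = -psi, d(psi)/dt = theta - gamma * psi,
# where gamma > 0 comes from a regularizer (gamma/2) * psi**2 on the discriminator.

def jacobian(gamma):
    # Jacobian of the gradient vector field at the equilibrium.
    return np.array([[0.0, -1.0],
                     [1.0, -gamma]])

for gamma in (0.0, 0.5):
    eig = np.linalg.eigvals(jacobian(gamma))
    print(gamma, eig)
# gamma = 0.0: eigenvalues are purely imaginary -> trajectories orbit
#              the equilibrium and never converge.
# gamma = 0.5: real parts become negative -> the flow converges.
```

This is only a two-dimensional caricature; the referenced papers carry out the analysis for the full GAN gradient vector field.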
  4. GAN: an example of application
     Toward practical applications: combinations of models and a conditional "constraint".
     20/23
     ref: https://arxiv.org/abs/1705.09368
  5. Understanding of DL: generalization
     SGD optimization is controlled by a "noise scale" built from ε (learning rate), N (training set size), and B (batch size).
     (Figure: training error curves.)
     21/23
     refs: https://arxiv.org/abs/1705.08741 https://arxiv.org/abs/1710.06451 https://arxiv.org/abs/1706.02677
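The noise-scale claim can be made concrete with the form reported in the referenced analyses, g = ε(N/B − 1) ≈ εN/B for B ≪ N; the numeric values below are illustrative, not taken from the slides.

```python
# Noise scale of SGD: g = eps * (N/B - 1) ~ eps * N / B,
# with eps = learning rate, N = training set size, B = batch size.

def noise_scale(eps, N, B):
    return eps * (N / B - 1.0)

# Doubling the batch size at fixed eps roughly halves the noise scale;
# the linear-scaling heuristic (arXiv:1706.02677) keeps g approximately
# fixed by scaling eps proportionally to B.
print(noise_scale(0.1, 50000, 128))
print(noise_scale(0.2, 50000, 256))
```

Keeping g fixed is what lets large-batch training reuse a small-batch schedule, which is the link between the three referenced papers.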
  6. New DL architecture: CapsNet
     A kind of vector generalization of neurons:
     - a new look at object recognition
     - routing mechanism to capture relations between capsules
     - robust to overlapping and affine transformations
     (Figure: capsule dimensions.)
     22/23
     ref: https://arxiv.org/abs/1710.09829 https://www.oreilly.com/ideas/introducing-capsule-networks
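The "vector generalization of neurons" can be sketched with the squash nonlinearity from the CapsNet paper: a capsule's output is a vector whose direction encodes pose and whose length, mapped into [0, 1), encodes existence probability.

```python
import numpy as np

# Squash nonlinearity (arXiv:1710.09829): keeps a capsule vector's
# direction but rescales its length to norm^2 / (1 + norm^2) < 1,
# so length can act as a probability of entity existence.

def squash(s, eps=1e-9):
    norm2 = np.sum(s * s, axis=-1, keepdims=True)
    norm = np.sqrt(norm2 + eps)          # eps avoids division by zero
    return (norm2 / (1.0 + norm2)) * (s / norm)

v = squash(np.array([3.0, 4.0]))         # input vector of length 5
print(np.linalg.norm(v))                 # 25/26 ~ 0.96, same direction
```

The routing mechanism mentioned on the slide then iteratively weights agreement between lower- and higher-level capsule vectors; squash is applied at every routing step.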
  7. New DL architecture: Deep Sets
     A model that is invariant under permutations of its input.
     (Figure: digit-summation experiments with text input and image input.)
     23/23
     ref: https://arxiv.org/abs/1703.06114
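The permutation invariance above follows from the Deep Sets structure f(X) = ρ(Σₓ φ(x)): sum pooling over set elements erases input order. The φ and ρ below are arbitrary stand-in maps (random matrices of my choosing) just to demonstrate the invariance; real models learn them.

```python
import numpy as np

# Deep Sets sketch (arXiv:1703.06114): f(X) = rho( sum_x phi(x) ).
# phi and rho are placeholders here, not the paper's trained networks.

rng = np.random.default_rng(0)
W_phi = rng.normal(size=(3, 8))    # phi: per-element embedding
W_rho = rng.normal(size=(8, 1))    # rho: applied to the pooled sum

def f(X):
    # Sum over the set axis makes the output order-independent.
    return np.tanh(X @ W_phi).sum(axis=0) @ W_rho

X = rng.normal(size=(5, 3))        # a "set" of 5 elements
perm = rng.permutation(5)
print(np.allclose(f(X), f(X[perm])))  # True: order does not matter
```

The digit-summation experiments on the slide exploit exactly this: the sum of a set of digits does not depend on the order in which they are presented.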