20180215_MLKitchen7_YoheiKIKUTA

yoppe
February 15, 2018

Transcript

  1. NIPS 2017 report
    Yohei KIKUTA
    resume : https://github.com/yoheikikuta/resume
    twitter: @yohei_kikuta (in Japanese)
    20180215 ML Kitchen #7

  2. What is NIPS?
    Neural Information Processing Systems (NIPS)
    - originally started as a meeting on neural networks
    - now one of the top conferences for theoretical ML

  3. Statistics of NIPS 2017

  4. # of registrations
    [Plot: # of registrations by year]

  5. [Plot: log10(# of registrations) by year]

  6. # of submitted papers
    3,240

  7. Research topics of submissions
    [Plot: # of submissions by research topic; visible labels: Algorithms, Deep Learning]

  8. # of accepted papers
    679

  9. Acceptance rate: 21%
    - papers not posted online: 15%
    - papers posted online: 29%

  10. JSAI top conference reporters
    I attended NIPS as one of the reporters.

  11. JSAI top conference reporters
    I attended NIPS as one of the reporters.
    It’s me.

  12. Views of the conference

  13. Invited talks

  14. Sponsors

  15. Exhibitions

  16. Poster sessions

  17. Interesting topics

  18. Trends
    - GANs
    - Theoretical understanding of DL optimization
    - New directions of DL
    - Interpretability of ML
    - Incomplete information games
    - Deep Reinforcement Learning
    - Bayesian Deep Learning
    - …

  19. GAN: convergence around equilibrium points
    Convergence analysis of GANs
    - analysis near equilibrium points
    - ODE analysis of gradient flows
    - relation to eigenvalues of the Jacobian
    - double-backprop regularization
    ref: https://arxiv.org/abs/1705.10461, https://arxiv.org/abs/1706.04156
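    A minimal sketch of such a gradient-norm regularizer via double
    backprop (my illustration in PyTorch, not the papers’ exact method;
    `d_loss`, `disc_params`, and `gamma` are hypothetical names):

    import torch

    def regularized_d_loss(d_loss, disc_params, gamma=10.0):
        # Gradient of the discriminator loss w.r.t. its parameters.
        # create_graph=True keeps the graph so we can backprop through
        # this gradient again ("double backprop").
        grads = torch.autograd.grad(d_loss, disc_params, create_graph=True)
        # Penalize the squared norm of the gradient vector field,
        # which damps oscillations around equilibrium points.
        penalty = sum(g.pow(2).sum() for g in grads)
        return d_loss + 0.5 * gamma * penalty

    Calling .backward() on the returned loss differentiates through
    `grads`; that second pass is the double backprop named on the slide.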

  20. GAN: an example application
    ref: https://arxiv.org/abs/1705.09368
    Toward practical applications.
    Combinations of models with conditional “constraints”.

  21. Understanding DL: generalization
    ref: https://arxiv.org/abs/1705.08741, https://arxiv.org/abs/1710.06451, https://arxiv.org/abs/1706.02677
    SGD optimization is controlled by the “noise scale”
    g = ε(N/B − 1) ≈ εN/B (for B ≪ N),
    where ε: learning rate, N: training set size, B: batch size.
    [Plot: training error]
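    As a toy illustration of this relation (my reading of the referenced
    papers; the numbers and names below are made up), keeping g fixed
    while growing the batch recovers the linear scaling rule between
    learning rate and batch size:

    def noise_scale(eps, N, B):
        # g = eps * (N / B - 1) ≈ eps * N / B for B << N:
        # a larger learning rate or a smaller batch means noisier SGD.
        return eps * (N / B - 1)

    # Scaling eps together with B keeps the noise scale (roughly) fixed:
    g1 = noise_scale(eps=0.1, N=50_000, B=128)
    g2 = noise_scale(eps=0.1 * (1024 / 128), N=50_000, B=1024)
    assert abs(g1 - g2) / g1 < 0.05  # approximately equal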

  22. New DL architecture: CapsNet
    ref: https://arxiv.org/abs/1710.09829, https://www.oreilly.com/ideas/introducing-capsule-networks
    A vector generalization of neurons.
    - a new view of object recognition
    - a routing mechanism to capture relations between capsules
    - robust to overlapping and affine transformations
    [Figure: capsule dim.]
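    For concreteness, a NumPy sketch of the “squashing” nonlinearity from
    the referenced paper, which keeps a capsule’s orientation while
    mapping its length into [0, 1):

    import numpy as np

    def squash(s, eps=1e-8):
        # s: capsule inputs; the last axis is the capsule dimension.
        norm2 = np.sum(s ** 2, axis=-1, keepdims=True)
        # v = (|s|^2 / (1 + |s|^2)) * (s / |s|): direction is kept,
        # short vectors shrink toward 0, long ones approach length 1.
        return (norm2 / (1.0 + norm2)) * (s / np.sqrt(norm2 + eps))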

  23. New DL architecture: Deep Sets
    ref: https://arxiv.org/abs/1703.06114
    A model that is invariant under permutations of its inputs.
    [Figure: digit summation experiments, with text input and image input]
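    A minimal sketch of the Deep Sets idea in PyTorch (my illustration;
    the layer sizes are arbitrary): sum-pooling per-element features
    makes the output invariant to the order of set elements, as needed
    for tasks like digit summation.

    import torch
    import torch.nn as nn

    class DeepSet(nn.Module):
        # f(X) = rho(sum_x phi(x)): permutation-invariant by construction.
        def __init__(self, in_dim=1, hidden=64, out_dim=1):
            super().__init__()
            self.phi = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, hidden))
            self.rho = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                     nn.Linear(hidden, out_dim))

        def forward(self, x):  # x: (batch, set_size, in_dim)
            # Summing over the set axis discards element order.
            return self.rho(self.phi(x).sum(dim=1))

    # Shuffling the set elements leaves the output unchanged:
    model = DeepSet()
    x = torch.randn(2, 5, 1)
    perm = torch.randperm(5)
    assert torch.allclose(model(x), model(x[:, perm]), atol=1e-5)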
