
Enjoy Deep Learning by JavaScript

yujiosaka

August 12, 2016

Transcript

  1. Yuji Isobe: Enjoy Deep Learning by JavaScript. TokyoJS 2.1 @ Abeja Inc. https://speakerdeck.com/yujiosaka/hitasurale-sitedeipuraningu
  2. [ “Node.js”, “MongoDB”, “AngularJS”, “socket.io”, “React.js”, “Emotion Intelligence” ] @yujiosaka JavaScript
  3. emin = Emotion Intelligence. In search of technology to understand human emotion: Emotion Intelligence develops “intelligence that reads the subtleties of people’s feelings from their unconscious behavior”, using applied AI and machine-learning technology, and puts it to work in business.
  4. ZenClerk Series: ZenClerk lite, ZenClerk, Interest Widget

  5. ZenClerk provides online customers with an exciting shopping experience, personalized by machine learning that detects their growing desire to buy.
  6. I haven’t studied ML before… (´・ω・`)

  7. Introduction to ML

  8. OK! OK! OK! I understand… Sounds Cool! (・∀・) Bayesian probability, k-nearest neighbors, Generalized linear model, Neural Network, Support Vector Machine, Great Wall
  9. My skill set: Data Visualization, Machine Learning, Mathematics, Statistics, Computer Science, Communication, Domain Knowledge (want to develop)
  10. None
  11. None
  12. Let’s try this

  13. 101 Digit Recognizer ✓ Classification of MNIST (handwritten digits data) ✓ 28 x 28px ✓ 60,000 training data ✓ 10,000 test data
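Each MNIST sample is a 28 x 28 grayscale image flattened into 784 pixel values, paired with a digit label. A minimal sketch of that shape in plain JavaScript (the function name and normalization here are illustrative, not part of jsmind):

    // One MNIST sample: 28 x 28 = 784 grayscale pixels plus a label.
    // Pixels are normalized from 0-255 bytes into [0, 1], the usual
    // preprocessing before feeding a neural network.
    function toSample(pixelBytes, label) {
      const x = Array.from(pixelBytes, (p) => p / 255); // 784 inputs
      const y = new Array(10).fill(0);                  // one-hot label
      y[label] = 1;
      return { x, y };
    }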
  14. Aim for 99% accuracy http://yann.lecun.com/exdb/mnist/

  15. But it didn’t look fun at all

  16. I really wanted to enjoy it

  17. None
  18. “Hitasura Raku Shite FF6” (playing FF6 the lazy way)

  19. Let’s Play FF6 with rules: 1. Do not battle if not necessary 2. Do not steal items 3. Do not pick up items
  20. Let’s Play Kaggle with rules: 1. Use Deep Learning 2. Use Only JavaScript 3. Do not use machine learning libraries
  21. Begin with Google Search

  22. http://neuralnetworksanddeeplearning.com/index.html

  23. Neural Networks and Deep Learning ✓ Online book ✓ History from Neural Network to Deep Learning ✓ Example implementation in Python on GitHub
  24. Make a strategy

  25. Python → CoffeeScript (sed & manual replacement) → ES2015 (Decaf JS & manual replacement): a Deep Learning library written in ES2015 JavaScript (Babel), first in NPM
  26. Aren’t Python and CoffeeScript very similar?

  27. Python

    def update_mini_batch(self, mini_batch, eta):
        nabla_b = [np.zeros(b.shape) for b in self.biases]
        nabla_w = [np.zeros(w.shape) for w in self.weights]
        for x, y in mini_batch:
            delta_nabla_b, delta_nabla_w = self.backprop(x, y)
            nabla_b = [nb + dnb for nb, dnb in zip(nabla_b, delta_nabla_b)]
            nabla_w = [nw + dnw for nw, dnw in zip(nabla_w, delta_nabla_w)]
        self.weights = [w - (eta / len(mini_batch)) * nw
                        for w, nw in zip(self.weights, nabla_w)]
        self.biases = [b - (eta / len(mini_batch)) * nb
                       for b, nb in zip(self.biases, nabla_b)]
  28. CoffeeScript

    updateMiniBatch: (miniBatch, eta) ->
      nablaB = (Matrix.zeros(b.rows, b.cols) for b in @biases)
      nablaW = (Matrix.zeros(w.rows, w.cols) for w in @weights)
      for [x, y] in miniBatch
        [deltaNablaB, deltaNablaW] = @backprop(x, y)
        nablaB = (nb.plus(dnb) for [nb, dnb] in _.zip(nablaB, deltaNablaB))
        nablaW = (nw.plus(dnw) for [nw, dnw] in _.zip(nablaW, deltaNablaW))
      @weights = (w.minus(nw.mulEach(eta / miniBatch.length)) for [w, nw] in _.zip(@weights, nablaW))
      @biases = (b.minus(nb.mulEach(eta / miniBatch.length)) for [b, nb] in _.zip(@biases, nablaB))
  29. Implement Numpy’s API

  30. numpy.nan_to_num

    nanToNum() {
      let thisData = this.data, rows = this.rows, cols = this.cols;
      let row, col, result = new Array(rows);
      for (row = 0; row < rows; ++row) {
        result[row] = new Array(cols);
        for (col = 0; col < cols; ++col) {
          result[row][col] = n2n(thisData[row][col]);
        }
      }
      return new Matrix(result);
    }
  31. numpy.ravel

    ravel() {
      let thisData = this.data, rows = this.rows, cols = this.cols;
      let a = new Array(rows * cols);
      for (let i = 0, jBase = 0; i < rows; ++i, jBase += cols) {
        for (let j = 0; j < cols; ++j) {
          a[jBase + j] = thisData[i][j];
        }
      }
      return a;
    }
  32. https://github.com/juliankrispel/decaf

  33. Manual Replacement

  34. It worked…lol

  35. It’s about time to study

  36. What is a Neural Network? A neural network (Japanese: ニューラルネットワーク) is a mathematical model that aims to express, through simulation on a computer, some of the characteristics observed in brain function. (“Neural network”, Japanese Wikipedia: https://ja.wikipedia.org/wiki/ニューラルネットワーク)
  37. Perceptron Neuron Model: inputs x1, x2, x3 with weights w1, w2, w3 and bias b produce an output:

    output = 0 if Σj wj·xj + b ≤ 0
    output = 1 if Σj wj·xj + b > 0
  38. Perceptron Neuron Model, a worked example: Go to the fest? Inputs: Is the weather good? (weight 6) Does your girlfriend come? (weight 2) Is the place near stations? (weight 2) Threshold: 5. Answer Yes/No to each question; go only when the weighted sum of Yes answers exceeds the threshold.
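As a sketch, here is the slide-38 perceptron in plain JavaScript; the weights and threshold come from the slide, with the threshold 5 folded into the bias b = −5 (the function name is illustrative):

    // Perceptron: output 1 if Σj wj·xj + b > 0, otherwise 0.
    function perceptron(inputs, weights, bias) {
      const sum = inputs.reduce((acc, x, j) => acc + weights[j] * x, bias);
      return sum > 0 ? 1 : 0;
    }

    // Good weather (w=6), girlfriend comes (w=2), near stations (w=2).
    perceptron([1, 0, 0], [6, 2, 2], -5); // => 1: 6 - 5 > 0, go to the fest
    perceptron([0, 1, 1], [6, 2, 2], -5); // => 0: 2 + 2 - 5 <= 0, stay home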
  39. Sigmoid Neuron Model: inputs x1, x2, x3 with weights w1, w2, w3 and bias b:

    output = 1 / (1 + exp(−Σj wj·xj − b))
  40. Step Function (Perceptron) [figure: step function plot]
  41. Sigmoid Function [figure: sigmoid curve plot]
  42. What’s the difference? ✓ The sigmoid function produces output between 0 and 1 ✓ A small difference in input makes a small difference in output ✓ In other words, the sigmoid function is differentiable
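Differentiability is what makes training work: the sigmoid’s slope has the closed form σ′(z) = σ(z)(1 − σ(z)), so the gradient can be computed from the activation itself. A minimal JavaScript sketch (not the jsmind API):

    // Sigmoid squashes any real z into (0, 1).
    function sigmoid(z) {
      return 1 / (1 + Math.exp(-z));
    }

    // Derivative of the sigmoid: σ'(z) = σ(z) * (1 - σ(z)).
    // A small change in input maps to a small, computable change
    // in output, which is exactly what backpropagation relies on.
    function sigmoidPrime(z) {
      const s = sigmoid(z);
      return s * (1 - s);
    }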
  43. Structure: w + Δw, b + Δb → output + Δoutput

  44. Training neurons ✓ Improve accuracy by modifying the weights (w) and bias (b) of each neuron ✓ Techniques like backpropagation were invented for that purpose.
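The update itself is ordinary gradient descent, the same rule as updateMiniBatch on slides 27 and 28: move each parameter a small step (eta) against its gradient. A one-neuron sketch in JavaScript (names are illustrative; backpropagation’s job is to supply the gradients):

    // One gradient-descent step on a single weight and bias.
    // gradW and gradB are partial derivatives of the cost, which
    // backpropagation computes efficiently for every neuron at once.
    function step(w, b, gradW, gradB, eta) {
      return { w: w - eta * gradW, b: b - eta * gradB };
    }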
  45. What is Deep Learning?

  46. Neural Network

  47. Deep Learning

  48. Why so popular? ✓ New techniques have been invented recently ✓ It can avoid overfitting when adding layers ✓ It can improve expressiveness by adding layers
  49. Let’s implement it

  50. Convolutional Neural Network

  51. Problem: two images shifted by just 1px are recognized as different from each other
  52. Solution

  53. Structure: convolutional layer → pooling layer
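Pooling is what buys tolerance to the 1px shift from slide 51: each small neighborhood is summarized by a single value, so a one-pixel shift usually lands in the same pool. A minimal 2 x 2 max-pooling sketch in JavaScript (illustrative, assumes even dimensions, and is not the jsmind API):

    // 2 x 2 max pooling: keep the largest activation in each
    // 2 x 2 block, halving both dimensions of the feature map.
    function maxPool2x2(input) {
      const out = [];
      for (let i = 0; i < input.length; i += 2) {
        const row = [];
        for (let j = 0; j < input[i].length; j += 2) {
          row.push(Math.max(input[i][j], input[i][j + 1],
                            input[i + 1][j], input[i + 1][j + 1]));
        }
        out.push(row);
      }
      return out;
    }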

  54. Other techniques ✓ Other activation functions (Softmax / ReLU) ✓ Regularization (L2 regularization / Dropout) ✓ Cross-entropy cost function ✓ Improving weight initialization
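Two of these are simple enough to show inline; a sketch of ReLU and an L2 weight penalty in JavaScript (hypothetical helper names, assuming the weights come as a flat array):

    // ReLU: pass positive inputs through, clamp negatives to zero.
    const relu = (z) => Math.max(0, z);

    // L2 regularization: add (lambda / 2) * Σ w² to the cost, which
    // pushes the network toward small weights and reduces overfitting.
    const l2Penalty = (weights, lambda) =>
      (lambda / 2) * weights.reduce((acc, w) => acc + w * w, 0);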
  55. Deep Learning is a set of techniques. There is no “Deep Learning Algorithm”. You can improve accuracy by assembling many techniques, like a jigsaw puzzle.
  56. Problems I encountered and how I overcame them

  57. Problem 1: Allergy to mathematical expressions. Once I wrote the code, it was actually easy to understand.

    output = 1 / (1 + exp(−Σj wj·xj − b))

    function sigmoid(z) {
      return 1 / (1 + Math.exp(-z));
    }
    let output = sigmoid(w.dot(a).plus(b));
  58. Problem 2: I didn’t know the differentiation formula. I copied and pasted from StackOverflow answers, and it actually worked. (For the cross-entropy cost with a sigmoid output layer, the error term simplifies to output − y.)

    costDelta(y) {
      return this.outputDropout.minus(y);
    }
  59. Problem 3: The textbook didn’t tell me. Softmax causes numeric overflow if you implement it exactly as textbooks describe. Again, I got answers from StackOverflow, and it worked: subtract the maximum before exponentiating.

    let max = _.max(vector),
        tmp = _.map(vector, (v) => { return Math.exp(v - max); }),
        sum = _.sum(tmp);
    return _.map(tmp, (v) => { return v / sum; });
  60. Problem 4: My computing speed is too slow. The Python reference implementation takes only 1 hour, but mine on Node.js takes more than 24 hours. I learned that Numpy does some crazy tricks for you. I used a small data set in the development environment.
  61. Problem 5: Python libraries are too sophisticated. Implementations with Theano and TensorFlow are hard to use as references because their APIs are too advanced. WTH is automatic differentiation!? I became familiar with Python libraries.
  62. WIP

  63. https://github.com/yujiosaka/jsmind

  64. Demo

  65. 99.1% accuracy but it takes 24 hours to run

  66. Why did I do this?

  67. My initial motivations: 1. To get GitHub stars (of course!) 2. To understand how deep learning works
  68. I didn’t think it was useful, but then I changed my mind

  69. Sometimes you want to do prediction in the browser

  70. Load trained layers’ data. 1. You don’t have to train with JavaScript, but you may want to predict with it. You can load data trained in Python and use it for prediction in the browser.

    Promise.all([
      jsmind.Network.load('/path/to/layers.json'),
      jsmind.MnistLoader.loadTestDataWrapper()
    ]).spread(function(net, testData) {
      var accuracy = net.accuracy(testData);
      console.log('Test accuracy ' + accuracy);
    });
  71. It is useful when you do online learning on Node.js

  72. I personally don’t like language lock-in (`^´)

  73. Anyone should be able to do ML in any language.

  74. Let’s Enjoy Deep Learning!