
How Deep is Deep Learning?

Amar
July 09, 2017


Deep Learning is undoubtedly a significant recent step towards Artificial General Intelligence because of its sheer ability to learn highly complex tasks, and it has achieved spectacular results in almost every domain. But, as expected, there is a price to pay, and here the price is interpretability and simplicity. Deep learning also requires a huge amount of resources, though that is less of a concern in today’s era. With such promise, deep learning has come to be seen as a panacea. Is that really true? How deep is Deep Learning? We explore these questions and discuss some interesting findings and insights from applying deep learning in the education domain (Knowledge Tracing, to be precise).


Transcript

  1. How Deep is Deep Learning? Amar Lalwani, Lead Engineer, R&D, funtoot; Ph.D. Candidate, IIIT-Bangalore
  2. Funtoot: Intelligent Tutoring System • Every child is unique • Personalised (One-on-One) Tutoring • Mastery Learning
  3. Student Data: 1. Q1 => solved 2. Q2 => unsolved 3. Q3 => solved 4. Q1 => unsolved 5. Q3 => solved 6. Q4 => unsolved 7. Q1 => solved 8. Q2 => unsolved
  4. Knowledge Tracing (KT) • For some skill K • Given the student’s response sequence 1 to n, predict n+1 [Diagram: chronological response sequence for student Y, e.g. 0 0 0 1 1 1 … 1 at steps 1..n, with step n+1 marked "?"; 0 = incorrect response, 1 = correct response] (a minimal sketch of this setup follows below)
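
To make the task concrete, here is a minimal sketch of the setup from the two slides above, using the attempt log from the Student Data slide. The question-to-skill mapping and the helper names (`question_to_skill`, `kt_example`) are illustrative assumptions, not part of funtoot's actual data model.

```python
# Minimal KT setup sketch (hypothetical mapping; log taken from the slide).

# Chronological attempt log for one student, as on the "Student Data" slide.
attempts = [
    ("Q1", 1), ("Q2", 0), ("Q3", 1), ("Q1", 0),
    ("Q3", 1), ("Q4", 0), ("Q1", 1), ("Q2", 0),
]  # 1 = correct (solved), 0 = incorrect (unsolved)

# Assume each question exercises exactly one skill (a simplification).
question_to_skill = {"Q1": "K", "Q2": "K", "Q3": "K", "Q4": "K"}

def kt_example(attempts, skill):
    """Given responses 1..n on a skill, the KT task is to predict response n+1."""
    seq = [correct for q, correct in attempts if question_to_skill[q] == skill]
    history, target = seq[:-1], seq[-1]   # observe steps 1..n, predict step n+1
    return history, target

history, target = kt_example(attempts, "K")
print(history, "->", target)   # [1, 0, 1, 0, 1, 0, 1] -> 0
```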
  5. How do we approach this? • Modelling the learner’s knowledge acquisition process • Fairly complex • Need a general, flexible, powerful model
  6. Deep Knowledge Tracing (DKT) • RNN or LSTM model [Diagram: questions Q1, Q2, Q3, … fed into the recurrent network, which outputs pCorrect(Skill A), pCorrect(Skill B), pCorrect(Skill C) at each step, e.g. 0.9, 0.3, 0.2] (a minimal model sketch follows below)
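
A minimal sketch of an LSTM-based DKT model along these lines, written in PyTorch. The input encoding (one-hot (skill, correctness) pairs), the hidden size, and the class name `DKT` are assumptions for illustration, not the exact architecture or hyperparameters used in the talk.

```python
import torch
import torch.nn as nn

class DKT(nn.Module):
    """Minimal DKT sketch: an LSTM over one-hot (skill, correctness) inputs,
    emitting pCorrect for every skill at each time step."""

    def __init__(self, num_skills, hidden_size=100):
        super().__init__()
        # Input: one-hot of (skill, correctness) pairs -> 2 * num_skills dims.
        self.lstm = nn.LSTM(2 * num_skills, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, num_skills)

    def forward(self, x):
        # x: (batch, seq_len, 2 * num_skills)
        h, _ = self.lstm(x)
        # Per-skill probability of a correct response at the next step.
        return torch.sigmoid(self.readout(h))

# Usage: 442 skills as on the Dataset slide, a batch of 8 students, 20 steps.
model = DKT(num_skills=442)
x = torch.zeros(8, 20, 2 * 442)   # one-hot encoded interaction history
p_correct = model(x)              # shape: (8, 20, 442)
```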
  7. Dataset • 6th Grade Math, CBSE Curriculum • 22 topics, 69 sub-topics, 119 sub-sub-topics • 442 skills (LGs), 1523 problems • 7780 students, 176 schools • 2.4 million problem attempts • 5.6 million data-points • 76% avoidances (positive class: 1)
  8. Bayesian Knowledge Tracing (BKT) [Diagram: two hidden states, Learned (knows) and Unlearned (does not know), with observations Correct and Incorrect; arcs labelled P(L0), 1-P(L0), P(T), P(S), 1-P(S), P(G), 1-P(G)]
  9. BKT: Parameters • BKT is a 2-state Hidden Markov Model (HMM) • P(L0): probability of initial knowledge • P(T): probability of learning • P(S): probability of slip • P(G): probability of guess (update equations sketched below)
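
The standard BKT update with these four parameters can be sketched as follows; the parameter values in the usage example are placeholders, not fitted values from the talk.

```python
def bkt_step(p_know, correct, p_T, p_S, p_G):
    """One BKT update: revise P(known) after observing a response,
    then apply the learning transition P(T)."""
    if correct:
        # P(known | correct response)
        posterior = p_know * (1 - p_S) / (p_know * (1 - p_S) + (1 - p_know) * p_G)
    else:
        # P(known | incorrect response)
        posterior = p_know * p_S / (p_know * p_S + (1 - p_know) * (1 - p_G))
    # Chance the skill was learned on this practice opportunity.
    return posterior + (1 - posterior) * p_T

def bkt_predict(p_know, p_S, p_G):
    """P(next correct) = P(know) * (1 - slip) + P(not know) * guess."""
    return p_know * (1 - p_S) + (1 - p_know) * p_G

# Placeholder parameters for one skill (illustrative only).
p_L0, p_T, p_S, p_G = 0.3, 0.2, 0.1, 0.25
p_know = p_L0
for correct in [1, 0, 1, 1]:                   # observed response sequence
    print(round(bkt_predict(p_know, p_S, p_G), 3))
    p_know = bkt_step(p_know, correct, p_T, p_S, p_G)
```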
  10. Shallow* vs Deep • Performance: Shallow* = Deep • Parameters: Shallow* has 4 x #skills (pInit, pLearn, pGuess, pSlip); Deep has a few hundred thousand parameters (worked out in the sketch below) • Interpretability
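
To see where the two parameter counts come from, here is a back-of-the-envelope sketch using the 442 skills from the Dataset slide. The LSTM hidden size and the single-bias LSTM formula are assumptions for illustration, not the configuration reported in the talk.

```python
num_skills = 442

# Shallow (BKT): four parameters per skill.
bkt_params = 4 * num_skills                      # 1768

# Deep (DKT): one-hot (skill, correctness) input, one LSTM layer, linear readout.
input_size, hidden = 2 * num_skills, 100         # hidden size is an assumption
lstm_params = 4 * (input_size * hidden + hidden * hidden + hidden)  # gates i, f, g, o
readout_params = hidden * num_skills + num_skills
dkt_params = lstm_params + readout_params

print(bkt_params)   # 1768
print(dkt_params)   # 438642, i.e. a few hundred thousand
```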
  11. Deep Model: Advantages • Intelligent Curriculum Design • Finding best sequence of tasks • Discovery of structure • Instead of skill labels, question labels can be used as input • Complex representations and features
  12. References • Knowledge Tracing: Modelling the Acquisition of Procedural Knowledge (Corbett & Anderson, 1995) • Bloom’s Two Sigma Problem (Bloom, 1984) • Deep Knowledge Tracing (Piech et al., 2015) • How Deep is Knowledge Tracing? (Khajah et al., 2016) • Few Hundred Parameters Outperform Few Hundred Thousand? (Lalwani et al., 2017)