
Probabilistic Graphical Models


Alireza Nourian

May 06, 2013

Transcript

  1. Probabilistic Graphical Models ALIREZA NOURIAN

  2. Meanings • Model: declarative representation of our understanding of the world • Expert knowledge or learning • Probabilistic: handling uncertainty • Partial, noisy perceptions • Inherently stochastic phenomena • Graphical: relations between random variables • Bayesian Network: directed • Markov Network: undirected 2
  3. Real Models Bayesian Network Markov Network 3

  4. Image Segmentation 4

  5. Natural Language Processing • Named Entity Recognition • Joint Tagging 5
  6. Overview • Representation • Inference • Learning 6

  7. Bayesian Networks • Knowledge Engineering • Easy to design and maintain • e.g. Fault Diagnosis 7
  8. Bayesian Networks – Reasoning Causal Reasoning Evidential Reasoning 8

  9. Naïve Bayes Classifier • Strong Independence Assumption • Xi ⊥ Xj | C (i, j ∈ 1..n) • P(C, X1, …, Xn) = P(C) ∏i=1..n P(Xi | C) 9
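The Naïve Bayes factorization above — P(C, X1, …, Xn) = P(C) ∏ P(Xi | C) — can be sketched as a few lines of Python. All the probability tables below are made-up toy numbers for a two-feature spam example, not from the slides:

```python
# Naive Bayes by direct enumeration: joint = prior * product of per-feature
# likelihoods, then normalize over classes to get the posterior.

def naive_bayes_posterior(prior, likelihoods, features):
    """Return P(C = c | features) for each class c."""
    joint = {}
    for c, p_c in prior.items():
        p = p_c
        for i, x in enumerate(features):
            p *= likelihoods[c][i][x]  # P(X_i = x | C = c)
        joint[c] = p
    z = sum(joint.values())  # normalize over classes
    return {c: p / z for c, p in joint.items()}

# Two binary features, two classes; every number here is an assumption.
prior = {"spam": 0.4, "ham": 0.6}
likelihoods = {
    "spam": [{0: 0.2, 1: 0.8}, {0: 0.7, 1: 0.3}],
    "ham":  [{0: 0.9, 1: 0.1}, {0: 0.4, 1: 0.6}],
}
posterior = naive_bayes_posterior(prior, likelihoods, (1, 0))
```

The strong independence assumption is exactly what makes this linear in the number of features: each Xi contributes one multiplicative term.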
  10. Plate Models • Repeated Structure for Multiple Objects of the Same Type • Grade of a student in a course • Unrolled for 2 students and 2 courses 10
  11. Dynamic Bayesian Networks • Representation of Structured Distributions over Time • Markov Assumption • P(X(0..T)) = P(X(0)) ∏t=0..T−1 P(X(t+1) | X(t)) • Time Invariance Assumption • P(X(t+1) | X(t)) = P(X′ | X) 11
  12. Vehicle Model (Dynamic Bayesian Network) 12

  13. Hidden Markov Models • A special case of DBN • State machine for state values • e.g. Speech Recognition 13
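Filtering in an HMM (the inference behind applications like speech recognition) is the forward algorithm: alternate a prediction step through the transition model with an update step on each observation. The sketch below uses the standard umbrella-world toy parameters, which are an assumption, not from the slides:

```python
# HMM forward-algorithm filtering: P(state_t | obs_1..t).

def hmm_filter(prior, trans, emit, observations):
    """Return the filtered belief over states after each observation."""
    beliefs = []
    b = dict(prior)
    for obs in observations:
        # Predict: push the belief through the transition model.
        pred = {s2: sum(b[s1] * trans[s1][s2] for s1 in b) for s2 in trans}
        # Update: weight by the emission likelihood and renormalize.
        upd = {s: pred[s] * emit[s][obs] for s in pred}
        z = sum(upd.values())
        b = {s: p / z for s, p in upd.items()}
        beliefs.append(b)
    return beliefs

# Toy "umbrella world" parameters (illustrative assumptions).
prior = {"rain": 0.5, "sun": 0.5}
trans = {"rain": {"rain": 0.7, "sun": 0.3}, "sun": {"rain": 0.3, "sun": 0.7}}
emit = {"rain": {"umbrella": 0.9, "none": 0.1},
        "sun": {"umbrella": 0.2, "none": 0.8}}
beliefs = hmm_filter(prior, trans, emit, ["umbrella", "umbrella"])
```

After two umbrella observations the belief in "rain" keeps growing, which is the belief-state-tracking behavior the later slides revisit for robot localization.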
  14. Markov Networks • Conditional Random Fields • Log-Linear Models • Metric Markov Random Field • Image Segmentation • Close pixels have the same label • Image Denoising • Close pixels have the same color • Stereo Reconstruction • Close points have the same depth 14
  15. Markov Network – Example • Named Entity Recognition • Features: word capitalized, word in atlas or name list, previous word is “Mrs”, next word is “Times”, … 15
  16. Inference • Conditional Probability Queries • P(Y | E = e) • e.g. Fault Diagnosis • NP-hard problem • Algorithms • Variable Elimination • Belief Propagation • Random Sampling • Maximum a Posteriori (MAP) • argmax_y P(Y = y | E = e) • e.g. most likely Image Segmentation • NP-hard problem • Algorithms • Variable Elimination • Belief Propagation • Optimization 16
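On a network small enough to enumerate, both query types are a few lines: sum out the hidden variables for a conditional probability query, maximize over assignments for MAP. The tiny network C → {S, R} and its CPTs below are made-up; both problems are NP-hard in general, which is why the algorithms listed above exist:

```python
# Brute-force inference on a toy Bayesian network C -> S, C -> R.
# All CPT numbers are illustrative assumptions.

p_c = {0: 0.5, 1: 0.5}
p_s_given_c = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}  # P(S | C)
p_r_given_c = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.2, 1: 0.8}}  # P(R | C)

def joint(c, s, r):
    """Full joint via the chain rule for this network."""
    return p_c[c] * p_s_given_c[c][s] * p_r_given_c[c][r]

# Conditional probability query P(C | S = 1): sum out R, renormalize.
weights = {c: sum(joint(c, 1, r) for r in (0, 1)) for c in (0, 1)}
z = sum(weights.values())
posterior = {c: w / z for c, w in weights.items()}

# MAP query: the single most likely full assignment given S = 1.
map_assignment = max(((c, r) for c in (0, 1) for r in (0, 1)),
                     key=lambda cr: joint(cr[0], 1, cr[1]))
```

Note the two answers need not agree: the MAP assignment maximizes the joint, while the conditional query marginalizes.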
  17. Sample MAP – Correspondence • Model • 3D Reconstruction • Different images • Human Pose Detection • Different scanned meshes 17
  18. Inference – Variable Elimination • Reduce factors by evidence • Eliminate non-query variables • Multiply all remaining factors • Renormalize 18
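The core operations of variable elimination — factor product and summing a variable out — can be sketched with factors stored as dictionaries mapping assignments to values. The two factors below, over a chain A – B – C, use made-up values:

```python
# Variable elimination primitives. A factor maps sorted assignment tuples
# like (("A", 0), ("B", 1)) to nonnegative values.

def multiply(f, g):
    """Factor product: combine consistent assignments, multiply values."""
    out = {}
    for a_key, a_val in f.items():
        for b_key, b_val in g.items():
            merged = dict(a_key)
            if any(merged.get(v, x) != x for v, x in b_key):
                continue  # the two assignments disagree on a shared variable
            merged.update(dict(b_key))
            out[tuple(sorted(merged.items()))] = a_val * b_val
    return out

def marginalize(f, var):
    """Sum one variable out of a factor."""
    out = {}
    for key, val in f.items():
        reduced = tuple(kv for kv in key if kv[0] != var)
        out[reduced] = out.get(reduced, 0.0) + val
    return out

# Toy factors: P(A, B) and P(C | B) over binary variables (assumed numbers).
f_ab = {(("A", a), ("B", b)): v
        for (a, b), v in {(0, 0): 0.3, (0, 1): 0.2,
                          (1, 0): 0.1, (1, 1): 0.4}.items()}
f_bc = {(("B", b), ("C", c)): v
        for (b, c), v in {(0, 0): 0.6, (0, 1): 0.4,
                          (1, 0): 0.5, (1, 1): 0.5}.items()}

# Eliminate B: multiply the factors mentioning B, then sum B out.
f_ac = marginalize(multiply(f_ab, f_bc), "B")
z = sum(f_ac.values())
```

Eliminating a variable replaces all factors that mention it with one new factor over its neighbors; the size of those intermediate factors is what the elimination ordering (next slides) tries to control.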
  19. Variable Elimination – Induced Graph 19

  20. Variable Elimination Ordering • Greedy Search • Heuristic Cost Function • min-neighbors: # neighbors in current graph • min-fill: # new fill edges • … 20
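Greedy ordering with these heuristics can be sketched directly on the undirected graph: repeatedly pick the cheapest variable, remove it, and connect its remaining neighbors (the fill edges). The small example graph is an assumption; swapping the `min` key gives min-fill instead of min-neighbors:

```python
# Greedy elimination ordering with the min-neighbors heuristic.

def greedy_order(adj):
    """adj: {node: set of neighbors}. Returns (order, number of fill edges)."""
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    order, fill_count = [], 0
    while adj:
        # min-neighbors cost; a min-fill variant would count new edges here.
        v = min(adj, key=lambda u: len(adj[u]))
        neighbors = adj.pop(v)
        for u in neighbors:
            adj[u].discard(v)
        # Fill edges: the eliminated variable's neighbors must form a clique.
        ns = sorted(neighbors)
        for i in range(len(ns)):
            for j in range(i + 1, len(ns)):
                a, b = ns[i], ns[j]
                if b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    fill_count += 1
        order.append(v)
    return order, fill_count

# Toy graph: path A - B - C plus a leaf D attached to B (assumed example).
graph = {"A": {"B"}, "B": {"A", "C", "D"}, "C": {"B"}, "D": {"B"}}
order, fills = greedy_order(graph)
```

On this tree-shaped graph the greedy order eliminates leaves first and induces no fill edges at all, which is why the robot-localization slides that follow care so much about which variables go first.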
  21. Robot Localization – Markov Random Field 21

  22. Robot Localization – Eliminate Poses then Landmarks 22

  23. Robot Localization – Eliminate Landmarks then Poses 23

  24. Robot Localization – Min-Fill Elimination 24

  25. Inference – Belief Propagation • Adjacent variable clusters pass information to each other 25
  26. Inference – Sampling • Estimating a probability distribution from its samples • Minimum number of samples for estimation • Forward Sampling (Bayesian Networks) • Sample a variable given its parents • Markov Chain Monte Carlo (Markov Networks) • Sample the current state given the previous one • Gibbs Sampling • Sample one variable given the others 26
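Gibbs sampling on a two-variable Markov network fits in a few lines: resample each variable from its conditional given the rest, and long-run sample frequencies approximate the model's marginals. The pairwise potential below (weight 3 when the variables agree, 1 otherwise) is an illustrative assumption:

```python
import random

# Gibbs sampling sketch on a pairwise Markov network over two binary
# variables X, Y with potential phi (an assumed "agreement" potential).

random.seed(0)
phi = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

def sample_given(other):
    """Sample one binary variable from P(x | other) ∝ phi[(x, other)]."""
    w0, w1 = phi[(0, other)], phi[(1, other)]
    return 0 if random.random() < w0 / (w0 + w1) else 1

x, y = 0, 0
agree = 0
n = 20000
for _ in range(n):
    x = sample_given(y)   # resample X given the current Y
    y = sample_given(x)   # resample Y given the current X
    agree += (x == y)

# The true probability of agreement is (3 + 3) / (3 + 1 + 1 + 3) = 0.75,
# so the empirical rate should settle near that value.
agree_rate = agree / n
```

Because each step conditions on the current values of the other variables, Gibbs sampling only needs local conditionals — the same locality that makes Markov networks convenient in the first place.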
  27. Inference – Decision Making • Influence Diagram • Bayesian network with action and utility nodes 27
  28. Inference – Belief State Tracking • Inference in a Temporal Model • e.g. Robot Localization 28
  29. Robot Localization – Belief State Tracking 29

  30. Learning Model from Data • Parameter Estimation • Maximum Likelihood Estimation • Structure Learning • Optimization over Structures • Likelihood of Data given Structure • Local Search in Structure Space 30
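With complete data, maximum likelihood estimation of a CPD P(X | Pa) reduces to counting: the MLE is the normalized count of each (parent, child) combination. The dataset below is a made-up example with one binary parent and one binary child:

```python
from collections import Counter

# MLE for a conditional probability distribution from complete data.

def mle_cpd(data):
    """data: list of (parent_value, child_value) pairs.
    Returns the MLE estimate of P(child | parent) as {(p, x): prob}."""
    pair_counts = Counter(data)                  # N(parent, child)
    parent_counts = Counter(p for p, _ in data)  # N(parent)
    return {(p, x): c / parent_counts[p] for (p, x), c in pair_counts.items()}

# Assumed toy dataset of (parent, child) observations.
data = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 1), (1, 0), (1, 1)]
cpd = mle_cpd(data)
```

Each parent value gets its own normalized distribution, so the estimates for different parent configurations are independent — which is also why sparse parent configurations are poorly estimated without smoothing or priors.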
  31. Structure Learning – Local Search 31

  32. Summary • Representation • Directed and Undirected • Temporal and Plate Models • Inference • Exact and Approximate • Decision Making • Learning • Parameters and Structure • With and Without Complete Data 32
  33. Reference • Probabilistic Graphical Models by Daphne Koller • https://www.coursera.org/course/pgm 33