
Probabilistic Graphical Models

Alireza Nourian

May 06, 2013

Transcript

  1. Meanings
     • Model: declarative representation of our understanding of the world
        ◦ Expert knowledge or learning
     • Probabilistic: handling uncertainty
        ◦ Partial, noisy perceptions
        ◦ Inherently stochastic phenomena
     • Graphical: relations between random variables
        ◦ Bayesian Network: directed
        ◦ Markov Network: undirected
  2. Naïve Bayes Classifier
     • Strong independence assumption: Xi ⊥ Xj | C (i, j ∈ 1..n)
     • P(C, X1, …, Xn) = P(C) ∏_{i=1}^{n} P(Xi | C)
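
The factorization above is straightforward to spell out in code. Below is a minimal sketch, assuming a toy spam/ham classifier with two binary features; the class names, feature names, and all probabilities are illustrative, not taken from the deck.

```python
# Toy spam/ham Naive Bayes with two binary features (all numbers illustrative).
# The joint factorizes as P(C, X1, ..., Xn) = P(C) * prod_i P(Xi | C).

prior = {"spam": 0.3, "ham": 0.7}                  # P(C)
likelihood = {                                     # P(Xi = 1 | C)
    "spam": {"free": 0.8, "meeting": 0.1},
    "ham":  {"free": 0.2, "meeting": 0.6},
}

def joint(c, features):
    """P(C = c, X1..Xn) for binary features given as {name: 0 or 1}."""
    p = prior[c]
    for name, value in features.items():
        p_on = likelihood[c][name]
        p *= p_on if value else (1.0 - p_on)
    return p

def classify(features):
    """argmax_c P(C = c | X): the evidence P(X) cancels out of the comparison."""
    return max(prior, key=lambda c: joint(c, features))

print(classify({"free": 1, "meeting": 0}))         # -> 'spam' with these toy numbers
```
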
  3. Plate Models
     • Repeated structure for multiple objects of the same type
     • Example: grade of a student in a course
     • Unrolled for 2 students and 2 courses
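
As a rough illustration of unrolling, the sketch below instantiates a shared Grade template once per (student, course) pair; the parent names Difficulty and Intelligence come from the usual student example and are an assumption here, not read off the slide.

```python
# Unrolling a plate model: one ground variable per instantiation of the template.
from itertools import product

students = ["s1", "s2"]
courses = ["c1", "c2"]

# One ground random variable per (student, course) pair.
grade_variables = [f"Grade({s}, {c})" for s, c in product(students, courses)]

# Each ground variable inherits its parents from the template
# (Difficulty/Intelligence are assumed parent templates, not named on the slide).
parents = {f"Grade({s}, {c})": [f"Difficulty({c})", f"Intelligence({s})"]
           for s, c in product(students, courses)}

print(grade_variables)            # 4 Grade variables: 2 students x 2 courses
print(parents["Grade(s1, c2)"])   # ['Difficulty(c2)', 'Intelligence(s1)']
```
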
  4. Dynamic Bayesian Networks
     • Representation of structured distributions over time
     • Markov assumption: P(X(0:T)) = P(X(0)) ∏_{t=0}^{T−1} P(X(t+1) | X(t))
     • Time invariance assumption: P(X(t+1) | X(t)) = P(X′ | X) for all t
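
A minimal sketch of both assumptions on a two-state chain: a single transition table P(X′ | X) is reused at every step, and a trajectory's probability is the product in the Markov factorization above. The states and numbers are illustrative.

```python
# Two-state chain illustrating the DBN assumptions (toy numbers, not from the deck).
# Markov:          P(X(0:T)) = P(X(0)) * prod_t P(X(t+1) | X(t))
# Time invariance: the same transition table P(X' | X) is reused at every step t.

initial = {"rain": 0.4, "dry": 0.6}              # P(X(0))
transition = {                                   # P(X' | X), shared across all time steps
    "rain": {"rain": 0.7, "dry": 0.3},
    "dry":  {"rain": 0.2, "dry": 0.8},
}

def trajectory_probability(states):
    """Probability of one trajectory x0, x1, ..., xT under the chain."""
    p = initial[states[0]]
    for prev, curr in zip(states, states[1:]):
        p *= transition[prev][curr]
    return p

print(trajectory_probability(["dry", "rain", "rain"]))   # 0.6 * 0.2 * 0.7 = 0.084
```
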
  5. Hidden Markov Models
     • A special kind of DBN
     • State machine over the state values
     • e.g. speech recognition
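
As a concrete companion to the speech-recognition example, here is a hedged sketch of the forward algorithm, a standard way to compute the probability of an observation sequence in an HMM by summing out the hidden states; the model tables are toy values and the code is not from the deck.

```python
# Forward algorithm sketch for a 2-state HMM (all tables are toy values).
# alpha_t(s) = P(o_1..o_t, S_t = s); the recursion sums over the previous state.

states = ["A", "B"]
start = {"A": 0.5, "B": 0.5}                       # P(S_0)
trans = {"A": {"A": 0.9, "B": 0.1},                # P(S_{t+1} | S_t)
         "B": {"A": 0.2, "B": 0.8}}
emit = {"A": {"x": 0.7, "y": 0.3},                 # P(O_t | S_t)
        "B": {"x": 0.1, "y": 0.9}}

def sequence_probability(observations):
    """Return P(o_1..o_T) by summing out the hidden state sequence."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: emit[s][obs] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

print(sequence_probability(["x", "x", "y"]))
```
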
  6. Markov Networks
     • Conditional Random Fields
     • Log-Linear Models (written out below)
     • Metric Markov Random Field
        ◦ Image segmentation: nearby pixels tend to have the same label
        ◦ Image denoising: nearby pixels tend to have the same color
        ◦ Stereo reconstruction: nearby points tend to have the same depth
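
For reference, the log-linear parameterization named above can be written in its standard general form (this is the textbook form, not reproduced from the slide):

```latex
% Log-linear form of a Markov network: each feature f_j is defined over a
% subset of variables D_j with weight w_j, and Z is the partition function.
P(X_1, \dots, X_n) \;=\; \frac{1}{Z}\, \exp\!\Big( \sum_{j} w_j \, f_j(D_j) \Big),
\qquad
Z \;=\; \sum_{x_1, \dots, x_n} \exp\!\Big( \sum_{j} w_j \, f_j(d_j) \Big)
```
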
  7. Markov Network – Example
     • Named Entity Recognition
     • Features: word is capitalized, word appears in an atlas or name list, previous word is “Mrs”, next word is “Times”, …
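
A small sketch of how such indicator features could be computed for one token before being fed to a CRF-style tagger; the gazetteer contents and the exact feature set are hypothetical.

```python
# Token-level indicator features of the kind listed above, for a CRF-style NER tagger.
# The gazetteer and feature set are hypothetical stand-ins.

NAME_LIST = {"london", "paris", "tehran"}          # stand-in for an atlas / name list

def token_features(tokens, i):
    """Binary indicator features for token i of a tokenized sentence."""
    word = tokens[i]
    return {
        "capitalized":   word[:1].isupper(),
        "in_name_list":  word.lower() in NAME_LIST,
        "prev_is_mrs":   i > 0 and tokens[i - 1] == "Mrs",
        "next_is_times": i + 1 < len(tokens) and tokens[i + 1] == "Times",
    }

print(token_features(["Mrs", "Smith", "reads", "the", "Times"], 1))
# {'capitalized': True, 'in_name_list': False, 'prev_is_mrs': True, 'next_is_times': False}
```
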
  8. Inference
     • Conditional probability queries: P(Y | E = e)
        ◦ e.g. fault diagnosis
        ◦ NP-hard problem
        ◦ Algorithms: variable elimination, belief propagation, random sampling
     • Maximum a Posteriori (MAP): argmax_y P(Y = y | E = e)
        ◦ e.g. most likely image segmentation
        ◦ NP-hard problem
        ◦ Algorithms: variable elimination, belief propagation, optimization
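
As a baseline for the exact and approximate algorithms listed above, a conditional query P(Y | E = e) can be answered on a tiny network by brute-force enumeration: sum the joint over all assignments consistent with the evidence, then renormalize. The fault/alarm/sensor model below is an illustrative assumption in the spirit of the fault-diagnosis example, not a model from the deck.

```python
# Brute-force conditional query P(Y | E = e) over a tiny hypothetical network
# Fault -> Alarm -> Sensor (all numbers illustrative).

from itertools import product

variables = ["Fault", "Alarm", "Sensor"]

def joint(a):
    """P(Fault, Alarm, Sensor) as a product of three local CPDs."""
    p = 0.05 if a["Fault"] else 0.95                                                      # P(Fault)
    p *= (0.9 if a["Alarm"] else 0.1) if a["Fault"] else (0.05 if a["Alarm"] else 0.95)   # P(Alarm | Fault)
    p *= (0.8 if a["Sensor"] else 0.2) if a["Alarm"] else (0.1 if a["Sensor"] else 0.9)   # P(Sensor | Alarm)
    return p

def query(target, evidence):
    """P(target = True | evidence), summing the joint over the remaining variables."""
    num = den = 0.0
    for values in product([False, True], repeat=len(variables)):
        a = dict(zip(variables, values))
        if any(a[k] != v for k, v in evidence.items()):
            continue                         # inconsistent with the evidence
        den += joint(a)
        if a[target]:
            num += joint(a)
    return num / den

print(query("Fault", {"Sensor": True}))      # posterior probability of a fault
```
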
  9. Sample MAP – Correspondence
     • Model
     • 3D reconstruction: different images
     • Human pose detection: different scanned meshes
  10. Inference – Variable Elimination
     • Reduce factors by evidence
     • Eliminate non-query variables
     • Multiply all remaining factors
     • Renormalize
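
A compact sketch of those four steps on table factors, assuming a toy chain A → B → C over binary variables with made-up entries; the factor representation and helper names are choices made here, not the deck's code.

```python
# Sum-product variable elimination on table factors for a toy chain A -> B -> C.
# A factor is (scope, table): scope is a tuple of variable names and table maps a
# tuple of binary values (aligned with scope) to a non-negative number.

from itertools import product

def multiply(f1, f2):
    (s1, t1), (s2, t2) = f1, f2
    scope = tuple(dict.fromkeys(s1 + s2))                 # union of scopes, order kept
    table = {}
    for values in product([0, 1], repeat=len(scope)):
        a = dict(zip(scope, values))
        table[values] = t1[tuple(a[v] for v in s1)] * t2[tuple(a[v] for v in s2)]
    return scope, table

def sum_out(factor, var):
    scope, table = factor
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {}
    for values, p in table.items():
        key = values[:i] + values[i + 1:]
        new_table[key] = new_table.get(key, 0.0) + p
    return new_scope, new_table

def reduce_by_evidence(factor, evidence):
    scope, table = factor
    kept = {v: p for v, p in table.items()
            if all(v[scope.index(e)] == val for e, val in evidence.items() if e in scope)}
    return scope, kept

def eliminate(factors, var):
    """One elimination step: multiply the factors mentioning var, then sum var out."""
    related = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    prod = related[0]
    for f in related[1:]:
        prod = multiply(prod, f)
    return rest + [sum_out(prod, var)]

# Toy factors: P(A), P(B | A), P(C | B) over binary variables (made-up numbers).
fA  = (("A",), {(0,): 0.6, (1,): 0.4})
fAB = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.7})
fBC = (("B", "C"), {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.25, (1, 1): 0.75})

# Query P(A | C = 1): reduce by evidence, eliminate non-query variables,
# multiply the remaining factors, renormalize.
factors = [reduce_by_evidence(f, {"C": 1}) for f in (fA, fAB, fBC)]
for var in ("C", "B"):
    factors = eliminate(factors, var)
result = factors[0]
for f in factors[1:]:
    result = multiply(result, f)
total = sum(result[1].values())
print({value: p / total for value, p in result[1].items()})   # posterior over A
```
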
  11. Variable Elimination Ordering
     • Greedy search
     • Heuristic cost functions
        ◦ min-neighbors: number of neighbors in the current graph
        ◦ min-fill: number of new fill edges
        ◦ …
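
One possible greedy-search implementation using the two heuristics named above; the example moral graph and its edges are made up for illustration.

```python
# Greedy elimination ordering with the min-neighbors and min-fill heuristics.
# The moral graph below (an undirected adjacency map) is made up for illustration.

from itertools import combinations

def min_neighbors(graph, v):
    """Cost = number of neighbors of v in the current graph."""
    return len(graph[v])

def min_fill(graph, v):
    """Cost = number of fill edges added if v were eliminated next."""
    return sum(1 for a, b in combinations(graph[v], 2) if b not in graph[a])

def greedy_ordering(graph, cost):
    graph = {v: set(ns) for v, ns in graph.items()}       # work on a copy
    order = []
    while graph:
        v = min(graph, key=lambda u: cost(graph, u))      # cheapest variable next
        for a, b in combinations(graph[v], 2):            # add fill edges among neighbors
            graph[a].add(b)
            graph[b].add(a)
        for n in graph[v]:                                # remove v from the graph
            graph[n].discard(v)
        del graph[v]
        order.append(v)
    return order

moral_graph = {"A": {"B"}, "B": {"A", "C", "D"}, "C": {"B", "D"}, "D": {"B", "C"}}
print(greedy_ordering(moral_graph, cost=min_neighbors))
print(greedy_ordering(moral_graph, cost=min_fill))
```
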
  12. Inference – Sampling
     • Estimating a probability distribution from its samples
     • Minimum number of samples needed for estimation
     • Forward sampling (Bayesian Networks): sample each variable given its parents
     • Markov Chain Monte Carlo (Markov Networks): sample the current state given the previous one
        ◦ Gibbs Sampling: sample one variable given the others
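
A minimal Gibbs sampler on a tiny pairwise Markov network, resampling one variable at a time given the others; the potentials (which favour agreeing neighbours), the burn-in length, and the variable names are illustrative assumptions.

```python
# Gibbs sampling on a tiny pairwise Markov network A - B - C with potentials that
# favour agreeing neighbours (all modelling choices here are illustrative).

import random

edges = [("A", "B"), ("B", "C")]

def potential(x, y):
    return 2.0 if x == y else 1.0

def conditional(state, var):
    """P(var = 1 | all other variables), proportional to the touching potentials."""
    scores = []
    for value in (0, 1):
        s = 1.0
        for a, b in edges:
            if var in (a, b):
                xa = value if a == var else state[a]
                xb = value if b == var else state[b]
                s *= potential(xa, xb)
        scores.append(s)
    return scores[1] / (scores[0] + scores[1])

def gibbs(num_samples, burn_in=100):
    random.seed(0)
    state = {"A": 0, "B": 0, "C": 0}
    samples = []
    for sweep in range(burn_in + num_samples):
        for var in state:                          # resample one variable at a time
            state[var] = 1 if random.random() < conditional(state, var) else 0
        if sweep >= burn_in:
            samples.append(dict(state))
    return samples

samples = gibbs(2000)
print(sum(s["A"] for s in samples) / len(samples))  # estimate of P(A = 1); ~0.5 by symmetry
```
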
  13. Learning Model from Data
     • Parameter estimation
        ◦ Maximum Likelihood Estimation
     • Structure learning
        ◦ Optimization over structures
        ◦ Likelihood of data given structure
        ◦ Local search in structure space
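
With complete data, maximum likelihood estimation of a tabular CPD reduces to normalized counting. A small sketch, using a hypothetical Difficulty → Grade table and made-up records:

```python
# Maximum likelihood estimation of one CPD, P(Grade | Difficulty), from complete data.
# With complete data the MLE of a table CPD is just a normalized count.
# The variable names and records are made up for illustration.

from collections import Counter

data = [
    {"Difficulty": "hard", "Grade": "B"},
    {"Difficulty": "hard", "Grade": "C"},
    {"Difficulty": "easy", "Grade": "A"},
    {"Difficulty": "easy", "Grade": "A"},
    {"Difficulty": "easy", "Grade": "B"},
]

def mle_cpd(data, child, parent):
    """Return P(child | parent) as a nested dict: cpd[parent_value][child_value]."""
    pair_counts = Counter((row[parent], row[child]) for row in data)
    parent_counts = Counter(row[parent] for row in data)
    return {
        pv: {cv: n / parent_counts[pv]
             for (p, cv), n in pair_counts.items() if p == pv}
        for pv in parent_counts
    }

print(mle_cpd(data, child="Grade", parent="Difficulty"))
# {'hard': {'B': 0.5, 'C': 0.5}, 'easy': {'A': 0.666..., 'B': 0.333...}}
```
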
  14. Summary
     • Representation: directed and undirected; temporal and plate models
     • Inference: exact and approximate; decision making
     • Learning: parameters and structure; with and without complete data