Alireza Nourian
May 06, 2013

Probabilistic Graphical Models


Transcript

2. Meanings
- Model: declarative representation of our understanding of the world
  - Expert knowledge or learning
- Probabilistic: handling uncertainty
  - Partial, noisy perceptions
  - Inherently stochastic phenomena
- Graphical: relations between random variables
  - Bayesian Network: directed
  - Markov Network: undirected


7. Bayesian Networks
- Knowledge Engineering
  - Easy to design and maintain
  - e.g. Fault Diagnosis

9. Naïve Bayes Classifier
- Strong Independence Assumption
  - Xi ⊥ Xj | C  (i, j ∈ 1..n)
- P(C, X1, …, Xn) = P(C) · ∏_{i=1..n} P(Xi | C)
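The naïve Bayes factorization can be sketched in a few lines of Python; the class prior, the feature CPTs, and the spam/ham example are invented toy numbers, not from the slides:

```python
# Minimal naive Bayes sketch: P(C, X1..Xn) = P(C) * prod_i P(Xi | C).
# All tables below are made-up toy numbers for illustration.

p_c = {"spam": 0.3, "ham": 0.7}                       # prior P(C)
p_x_given_c = [                                       # one CPT per binary feature Xi
    {"spam": {1: 0.8, 0: 0.2}, "ham": {1: 0.1, 0: 0.9}},
    {"spam": {1: 0.6, 0: 0.4}, "ham": {1: 0.3, 0: 0.7}},
]

def joint(c, xs):
    """P(C=c, X1=x1, ..., Xn=xn) under the naive Bayes factorization."""
    p = p_c[c]
    for cpt, x in zip(p_x_given_c, xs):
        p *= cpt[c][x]
    return p

def posterior(xs):
    """P(C | x) by normalizing the joint over the class values."""
    scores = {c: joint(c, xs) for c in p_c}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}
```

Because of the independence assumption, classification only needs n small per-feature tables instead of one table over all feature combinations.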
10. Plate Models
- Repeated Structure for Multiple Objects of the Same Type
  - Grade of student in course
  - Unrolled for 2 students and 2 courses
11. Dynamic Bayesian Networks
- Representation of Structured Distributions over Time
- Markov Assumption
  - P(X^(0:T)) = P(X^0) · ∏_{t=0..T−1} P(X^(t+1) | X^t)
- Time Invariance Assumption
  - P(X^(t+1) | X^t) = P(X' | X) for all t
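The two assumptions can be illustrated with a toy two-state Markov chain in Python, where a single shared transition table plays the role of P(X' | X); the states and numbers are invented:

```python
# Toy two-state Markov chain: P(X^(0:T)) = P(X^0) * prod_t P(X^(t+1) | X^t),
# with one shared transition table for every t (time invariance).
# All numbers are illustrative.

p0 = {"rain": 0.5, "sun": 0.5}                     # initial distribution P(X^0)
trans = {"rain": {"rain": 0.7, "sun": 0.3},        # P(X' | X), reused at every step
         "sun":  {"rain": 0.2, "sun": 0.8}}

def trajectory_prob(states):
    """Probability of a full state sequence X^0, ..., X^T."""
    p = p0[states[0]]
    for prev, nxt in zip(states, states[1:]):      # Markov: only the previous state matters
        p *= trans[prev][nxt]
    return p
```

Summing `trajectory_prob` over all sequences of a fixed length gives 1, as the factorization requires.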

13. Hidden Markov Models
- A special case of DBN
- State machine for state values
- e.g. Speech Recognition
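A minimal sketch of HMM inference, assuming invented state and observation tables: the forward recursion computes the likelihood of an observation sequence by summing over all hidden state sequences step by step:

```python
# Forward algorithm for a toy HMM with hidden states {A, B} and
# observations {x, y}. All tables are invented examples.

init = {"A": 0.6, "B": 0.4}                                   # P(S_0)
trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}  # P(S_t+1 | S_t)
emit = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}   # P(O_t | S_t)

def likelihood(obs):
    """P(obs) via the forward recursion alpha_t(s)."""
    alpha = {s: init[s] * emit[s][obs[0]] for s in init}
    for o in obs[1:]:
        # new alpha built from the old one before rebinding
        alpha = {s: emit[s][o] * sum(alpha[r] * trans[r][s] for r in alpha)
                 for s in init}
    return sum(alpha.values())
```

The recursion costs O(T · |S|²) instead of the O(|S|^T) of naive enumeration, which is what makes speech-recognition-scale models tractable.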
14. Markov Networks
- Conditional Random Fields
- Log-Linear Models
- Metric Markov Random Field
  - Image Segmentation: close pixels have the same label
  - Image Denoising: close pixels have the same color
  - Stereo Reconstruction: close points have the same depth
15. Markov Network – Example
- Named Entity Recognition
- Features: word capitalized, word in atlas or name list, previous word is “Mrs”, next word is “Times”, …
16. Inference
- Conditional Probability Queries
  - P(Y | E = e)
  - e.g. Fault Diagnosis
  - NP-hard problem
  - Algorithms: Variable Elimination, Belief Propagation, Random Sampling
- Maximum a Posteriori (MAP)
  - MAP(Y | E = e) = argmax_y P(Y = y | E = e)
  - e.g. most likely Image Segmentation
  - NP-hard problem
  - Algorithms: Variable Elimination, Belief Propagation, Optimization
17. Sample MAP – Correspondence
- Model
- 3D Reconstruction
  - Different images
- Human Pose Detection
  - Different scanned meshes
18. Inference – Variable Elimination
- Reduce factors by Evidence
- Eliminate non-query variables
- Multiply all remaining factors
- Renormalize
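The steps above can be sketched for a toy chain A → B → C with query P(C); there is no evidence here, so the reduce-by-evidence step is a no-op, and all factor tables are illustrative:

```python
# Variable elimination on a toy chain A -> B -> C, query P(C):
# eliminate the non-query variables A then B by summing them out,
# then renormalize. All tables are invented CPDs.

p_a = {0: 0.6, 1: 0.4}                                # P(A)
p_b_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}    # P(B | A)
p_c_b = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.5, 1: 0.5}}    # P(C | B)

# Eliminate A: tau1(B) = sum_a P(a) P(B | a)
tau1 = {b: sum(p_a[a] * p_b_a[a][b] for a in p_a) for b in (0, 1)}
# Eliminate B: tau2(C) = sum_b tau1(b) P(C | b)
tau2 = {c: sum(tau1[b] * p_c_b[b][c] for b in tau1) for c in (0, 1)}
# Renormalize (a no-op here, since every factor is already a CPD)
z = sum(tau2.values())
p_c = {c: v / z for c, v in tau2.items()}
```

Each elimination produces a new factor (tau) over the remaining variables; the order in which variables are eliminated determines how large those intermediate factors get.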

20. Variable Elimination Ordering
- Greedy Search
- Heuristic Cost Function
  - min-neighbors: number of neighbors in the current graph
  - min-fill: number of new fill edges
  - …
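A minimal sketch of the greedy search with the min-neighbors heuristic, on a made-up undirected adjacency structure:

```python
# Greedy elimination ordering with the min-neighbors heuristic:
# repeatedly eliminate the variable with the fewest neighbors in the
# current graph, connecting its remaining neighbors (fill edges).

def min_neighbors_order(adj):
    """adj: dict var -> set of neighbors; returns an elimination order."""
    adj = {v: set(ns) for v, ns in adj.items()}        # work on a copy
    order = []
    while adj:
        # fewest neighbors first, ties broken by name for determinism
        v = min(adj, key=lambda u: (len(adj[u]), u))
        ns = adj.pop(v)
        for u in ns:
            adj[u].discard(v)                          # remove v from the graph
            adj[u] |= ns - {u}                         # fill edges among v's neighbors
        order.append(v)
    return order
```

Swapping the key function for one that counts the fill edges a candidate would add turns the same loop into min-fill.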

25. Inference – Belief Propagation
- Adjacent variable clusters pass information to each other
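A minimal sketch of one such message, assuming two invented clusters {A, B} and {B, C} that share the variable B: the first cluster sums out its private variable and passes the result over B to the second.

```python
# One message pass between two adjacent clusters {A, B} and {B, C}
# sharing B: cluster 1 sums out A and sends a message over B; cluster 2
# combines it with its own factor into a belief over (B, C).
# Factor numbers are illustrative.

phi1 = {(0, 0): 0.54, (0, 1): 0.06, (1, 0): 0.12, (1, 1): 0.28}  # factor over (A, B)
phi2 = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.7}      # factor over (B, C)

msg = {b: sum(phi1[a, b] for a in (0, 1)) for b in (0, 1)}       # message m(B)
belief = {(b, c): msg[b] * phi2[b, c] for b in (0, 1) for c in (0, 1)}
z = sum(belief.values())
p_c = {c: sum(belief[b, c] for b in (0, 1)) / z for c in (0, 1)}  # marginal of C
```

On a tree of clusters, passing such messages in both directions along every edge yields exact marginals; on loopy graphs the same scheme is run iteratively as an approximation.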
26. Inference – Sampling
- Estimating a probability distribution from its samples
- Minimum number of samples for estimation
- Forward Sampling (Bayesian Networks)
  - Sample each variable given its parents
- Markov Chain Monte Carlo (Markov Networks)
  - Sample the current state given the previous one
- Gibbs Sampling
  - Sample one variable given the others
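Forward sampling can be sketched for a made-up two-node network A → B; the CPT numbers, seed, and sample size are arbitrary:

```python
import random

# Forward sampling sketch for a tiny Bayesian network A -> B: sample each
# variable given its already-sampled parent, then estimate P(B=1) from
# sample frequencies. CPT numbers are illustrative.

p_a = 0.3                                  # P(A=1)
p_b_given_a = {0: 0.2, 1: 0.9}             # P(B=1 | A)

def forward_sample(rng):
    a = int(rng.random() < p_a)            # root sampled from its prior
    b = int(rng.random() < p_b_given_a[a]) # child sampled given its parent
    return a, b

rng = random.Random(0)                     # fixed seed for reproducibility
samples = [forward_sample(rng) for _ in range(20000)]
p_b_hat = sum(b for _, b in samples) / len(samples)
# True value: P(B=1) = 0.7 * 0.2 + 0.3 * 0.9 = 0.41; p_b_hat approximates it
```

The estimate's standard error shrinks as 1/sqrt(N), which is where the "minimum number of samples" bound on the slide comes from.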
27. Inference – Decision Making
- Influence Diagram
  - Bayesian Network with action and utility nodes
28. Inference – Belief State Tracking
- Inference in a Temporal Model
- e.g. Robot Localization

30. Learning Model from Data
- Parameter Estimation
  - Maximum Likelihood Estimation
- Structure Learning
  - Optimization over Structures
  - Likelihood of Data given Structure
  - Local Search in Structure Space
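With complete data, maximum likelihood estimation of a CPD reduces to normalized counts; here is a sketch on a made-up dataset of (A, B) pairs:

```python
from collections import Counter

# Maximum likelihood estimation of one CPD P(B | A) from complete data:
# the MLE is just count(a, b) / count(a). The dataset is invented.

data = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 1), (1, 0), (0, 0)]

def mle_cpd(pairs):
    """Estimate P(B=b | A=a) as count(a, b) / count(a)."""
    joint = Counter(pairs)                 # counts of (a, b) pairs
    marg = Counter(a for a, _ in pairs)    # counts of a alone
    return {(a, b): joint[a, b] / marg[a] for a, b in joint}

cpd = mle_cpd(data)
```

With incomplete data this closed form no longer applies and iterative methods such as EM are needed instead, which is why the summary slide distinguishes learning with and without complete data.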

32. Summary
- Representation
  - Directed and Undirected
  - Temporal and Plate Models
- Inference
  - Exact and Approximate
  - Decision Making
- Learning
  - Parameters and Structure
  - With and Without Complete Data
