Slide 1

Probabilistic Graphical Models
Alireza Nourian

Slide 2

Meanings
- Model: a declarative representation of our understanding of the world
  - Expert knowledge or learning
- Probabilistic: handling uncertainty
  - Partial, noisy perceptions
  - Inherently stochastic phenomena
- Graphical: relations between random variables
  - Bayesian Network: directed
  - Markov Network: undirected

Slide 3

Real Models
- Bayesian Network
- Markov Network

Slide 4

Image Segmentation

Slide 5

Natural Language Processing
- Named Entity Recognition
- Joint Tagging

Slide 6

Overview
- Representation
- Inference
- Learning

Slide 7

Bayesian Networks
- Knowledge Engineering
  - Easy to design and maintain
  - e.g. Fault Diagnosis

Slide 8

Bayesian Networks – Reasoning
- Causal Reasoning
- Evidential Reasoning

Slide 9

Naïve Bayes Classifier
- Strong Independence Assumption: Xi ⊥ Xj | C for all i ≠ j ∈ 1..n
- P(C, X1, …, Xn) = P(C) · ∏_{i=1}^{n} P(Xi | C)
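
The naive Bayes factorization above can be sketched in a few lines of Python. The class prior and per-feature conditionals below are hypothetical hand-set tables for a binary class C and two binary features, not values from the slides.

```python
# Hypothetical CPTs for a binary class C and two binary features X1, X2.
prior = {0: 0.6, 1: 0.4}                      # P(C)
cond = {                                      # P(Xi = 1 | C)
    "X1": {0: 0.2, 1: 0.7},
    "X2": {0: 0.5, 1: 0.9},
}

def joint(c, x):
    """P(C=c, X1=x1, ..., Xn=xn) = P(C=c) * prod_i P(Xi=xi | C=c)."""
    p = prior[c]
    for name, xi in x.items():
        p_xi1 = cond[name][c]
        p *= p_xi1 if xi == 1 else 1.0 - p_xi1
    return p

def classify(x):
    """argmax_c P(C=c | x); the posterior is proportional to the joint."""
    return max(prior, key=lambda c: joint(c, x))

print(classify({"X1": 1, "X2": 1}))  # → 1, since 0.4*0.7*0.9 > 0.6*0.2*0.5
```

The independence assumption is what lets the joint be stored as n small tables instead of one exponentially large one.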

Slide 10

Plate Models
- Repeated structure for multiple objects of the same type
  - Grade of a student in a course
  - Unrolled for 2 students and 2 courses

Slide 11

Dynamic Bayesian Networks
- Representation of structured distributions over time
- Markov Assumption: P(X^(0..T)) = P(X^0) ∏_{t=0}^{T−1} P(X^(t+1) | X^t)
- Time Invariance Assumption: P(X^(t+1) | X^t) = P(X' | X)

Slide 12

Vehicle Model (Dynamic Bayesian Network)

Slide 13

Hidden Markov Models
- A special kind of DBN
- A state machine over the state values
- e.g. Speech Recognition

Slide 14

Markov Networks
- Conditional Random Fields
- Log-Linear Models
- Metric Markov Random Fields
  - Image Segmentation: nearby pixels have the same label
  - Image Denoising: nearby pixels have the same color
  - Stereo Reconstruction: nearby points have the same depth

Slide 15

Markov Network – Example
- Named Entity Recognition
- Features: word is capitalized, word appears in an atlas or name list, previous word is "Mrs", next word is "Times", …

Slide 16

Inference
- Conditional Probability Queries: P(Y | E = e)
  - e.g. Fault Diagnosis
  - NP-hard problem
  - Algorithms: Variable Elimination, Belief Propagation, Random Sampling
- Maximum a Posteriori (MAP): argmax_y P(Y = y | E = e)
  - e.g. most likely Image Segmentation
  - NP-hard problem
  - Algorithms: Variable Elimination, Belief Propagation, Optimization

Slide 17

Sample MAP – Correspondence
- Model
- 3D Reconstruction
  - Different images
- Human Pose Detection
  - Different scanned meshes

Slide 18

Inference – Variable Elimination
- Reduce factors by evidence
- Eliminate non-query variables
- Multiply all remaining factors
- Renormalize
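
The core factor operations behind these steps, product and sum-out, can be sketched with factors stored as plain dictionaries. The two-node network P(A), P(B | A) at the end is a hypothetical example, and the evidence-reduction step is omitted for brevity.

```python
from itertools import product

# A factor is (scope, table): scope is a list of binary variables,
# table maps full assignments (tuples of 0/1) to real values.

def multiply(f, g):
    """Factor product: align the two scopes and multiply matching entries."""
    (fs, ft), (gs, gt) = f, g
    scope = fs + [v for v in gs if v not in fs]
    table = {}
    for a in product([0, 1], repeat=len(scope)):
        asg = dict(zip(scope, a))
        table[a] = ft[tuple(asg[v] for v in fs)] * gt[tuple(asg[v] for v in gs)]
    return scope, table

def eliminate(f, var):
    """Sum out one variable from a factor (marginalization)."""
    scope, table = f
    i = scope.index(var)
    out = {}
    for a, p in table.items():
        key = a[:i] + a[i + 1:]
        out[key] = out.get(key, 0.0) + p
    return scope[:i] + scope[i + 1:], out

# Hypothetical chain A -> B; query P(B) by multiplying and eliminating A.
pa = (["A"], {(0,): 0.3, (1,): 0.7})
pba = (["A", "B"], {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
scope, pb = eliminate(multiply(pa, pba), "A")
print(scope, pb)  # P(B=0) = 0.3*0.9 + 0.7*0.2 = 0.41
```

A real implementation interleaves products and eliminations per variable so intermediate factors stay small; here the full product is formed first only because the example is tiny.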

Slide 19

Variable Elimination – Induced Graph

Slide 20

Variable Elimination Ordering
- Greedy Search
- Heuristic Cost Function
  - min-neighbors: number of neighbors in the current graph
  - min-fill: number of new fill edges
  - …
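
The greedy search with the min-fill heuristic can be sketched as follows: repeatedly eliminate the variable whose removal would add the fewest fill edges, connecting its remaining neighbors as it goes. The three-node chain at the end is a hypothetical example.

```python
def fill_edges(graph, v):
    """Number of fill edges eliminating v would add (non-adjacent neighbor pairs)."""
    nbrs = list(graph[v])
    return sum(1 for i in range(len(nbrs)) for j in range(i + 1, len(nbrs))
               if nbrs[j] not in graph[nbrs[i]])

def min_fill_order(graph):
    """Greedy elimination ordering; graph maps each variable to its neighbor set."""
    graph = {v: set(n) for v, n in graph.items()}   # work on a copy
    order = []
    while graph:
        v = min(graph, key=lambda u: fill_edges(graph, u))
        nbrs = graph.pop(v)
        for a in nbrs:                  # connect v's neighbors pairwise
            graph[a].discard(v)
            for b in nbrs:
                if a != b:
                    graph[a].add(b)
        order.append(v)
    return order

# Chain A - B - C: eliminating B first would add a fill edge A-C,
# so the endpoints are eliminated before the middle node.
print(min_fill_order({"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}))
```

Swapping `fill_edges` for `len(graph[u])` in the `min` call gives the min-neighbors heuristic instead.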

Slide 21

Robot Localization – Markov Random Field

Slide 22

Robot Localization – Eliminate Poses then Landmarks

Slide 23

Robot Localization – Eliminate Landmarks then Poses

Slide 24

Robot Localization – Min-Fill Elimination

Slide 25

Inference – Belief Propagation
- Adjacent variable clusters pass information to each other

Slide 26

Inference – Sampling
- Estimating a probability distribution from its samples
- Minimum number of samples needed for a reliable estimate
- Forward Sampling (Bayesian networks)
  - Sample each variable given its parents
- Markov Chain Monte Carlo (Markov networks)
  - Sample the current state given the previous one
  - Gibbs Sampling: sample one variable given all the others
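
Forward sampling can be sketched on a hypothetical two-node network A → B: each variable is sampled given its already-sampled parents, in topological order, and a marginal is then estimated from the samples.

```python
import random

random.seed(0)                  # fixed seed so the estimate is reproducible
P_A1 = 0.3                      # hypothetical P(A = 1)
P_B1 = {0: 0.1, 1: 0.8}         # hypothetical P(B = 1 | A)

def forward_sample():
    """Sample A from its prior, then B given the sampled value of A."""
    a = 1 if random.random() < P_A1 else 0
    b = 1 if random.random() < P_B1[a] else 0
    return a, b

samples = [forward_sample() for _ in range(20000)]
est = sum(b for _, b in samples) / len(samples)
print(est)  # close to the exact marginal P(B=1) = 0.7*0.1 + 0.3*0.8 = 0.31
```

Gibbs sampling differs in that it starts from a full assignment and repeatedly resamples one variable conditioned on all the others, which is what makes it applicable to undirected models.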

Slide 27

Inference – Decision Making
- Influence Diagram: a Bayesian network with action and utility nodes

Slide 28

Inference – Belief State Tracking
- Inference in a temporal model
- e.g. Robot Localization
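
Belief state tracking in a two-state hidden Markov model can be sketched as a predict/correct loop: the belief over the hidden state is pushed through the transition model, then reweighted by the likelihood of each observation, as in robot localization. All probability tables below are hypothetical.

```python
T = [[0.7, 0.3],          # T[s][s2] = P(S' = s2 | S = s), transition model
     [0.4, 0.6]]
O = [[0.9, 0.1],          # O[s][o] = P(obs = o | S = s), observation model
     [0.2, 0.8]]

def track(belief, obs):
    """One filtering step: predict with T, correct with O, renormalize."""
    predicted = [sum(belief[s] * T[s][s2] for s in range(2)) for s2 in range(2)]
    corrected = [predicted[s2] * O[s2][obs] for s2 in range(2)]
    z = sum(corrected)
    return [p / z for p in corrected]

belief = [0.5, 0.5]       # uniform initial belief over the two states
for obs in [0, 0, 1]:
    belief = track(belief, obs)
print(belief)             # the final observation pulls the belief toward state 1
```

This is the forward algorithm for HMMs; for a robot, the states would be discretized locations and the observations sensor readings.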

Slide 29

Robot Localization – Belief State Tracking

Slide 30

Learning Models from Data
- Parameter Estimation
  - Maximum Likelihood Estimation
- Structure Learning
  - Optimization over structures
  - Likelihood of data given structure
  - Local search in structure space
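
For discrete networks with complete data, maximum likelihood estimation of a CPD reduces to normalized counting. The sketch below estimates a hypothetical P(B | A) from a handful of made-up (a, b) records.

```python
from collections import Counter

# Hypothetical complete-data records of (a, b) pairs.
data = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 0)]

pair_count = Counter(data)             # count(A = a, B = b)
a_count = Counter(a for a, _ in data)  # count(A = a)

def p_b_given_a(b, a):
    """MLE estimate of P(B = b | A = a) = count(a, b) / count(a)."""
    return pair_count[(a, b)] / a_count[a]

print(p_b_given_a(0, 0))  # 2 of the 3 records with a=0 have b=0, so 2/3
```

With incomplete data the counts are no longer directly available, which is where algorithms such as expectation maximization come in.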

Slide 31

Structure Learning – Local Search

Slide 32

Summary
- Representation
  - Directed and undirected
  - Temporal and plate models
- Inference
  - Exact and approximate
  - Decision making
- Learning
  - Parameters and structure
  - With and without complete data

Slide 33

Reference
- Probabilistic Graphical Models by Daphne Koller
  https://www.coursera.org/course/pgm