Slide 1

Slide 1 text

Sum Product Networks: A New Deep Architecture
Hoifung Poon and Pedro Domingos (2011)
St. Louis Machine Learning & Data Science, March 20, 2013
Steven Borrelli
[email protected]
@stevendborrelli {twitter, github}

Slide 2

Slide 2 text

No content

Slide 3

Slide 3 text

No content

Slide 4

Slide 4 text

Deep Learning
http://deeplearningworkshopnips2010.files.wordpress.com/2010/09/nips10-workshop-tutorial-final.pdf

Slide 5

Slide 5 text

Sum Product Networks (Hoifung Poon and Pedro Domingos, 2011)
• Models with multiple layers of hidden variables allow for efficient inference in a much larger class of distributions.
• However, deep architectures are difficult to learn and require approximate methods (MCMC, etc.).
• Complex networks have intractable partition functions:
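The trailing colon presumably introduced a formula that did not survive extraction; as a hedged reconstruction, the standard Markov-network partition function the slide most likely refers to is

    Z = \sum_{x} \prod_{k} \phi_k(x_{\{k\}}),   so that   P(x) = \frac{1}{Z} \prod_{k} \phi_k(x_{\{k\}}),

which requires summing over exponentially many joint states x and is therefore intractable in general.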

Slide 6

Slide 6 text

Sum Product Network (Hoifung Poon and Pedro Domingos, 2011)
Figure labels: Mixture, Feature, Directed Acyclic Graph
http://homes.cs.washington.edu/~pedrod/papers/nips12.pdf

Slide 7

Slide 7 text

Sum Product Network

Figure: a root sum node (⊕) over four product nodes (⊗), with weights 0.4, 0.2, 0.1, and 0.3, each product combining indicators for X1/X̄1 and X2/X̄2.

X1  X2  P(X)
 1   1  0.4
 1   0  0.2
 0   1  0.1
 0   0  0.3

P(X) = 0.4 · X1 · X2 + 0.2 · X1 · X̄2 + 0.1 · X̄1 · X2 + 0.3 · X̄1 · X̄2
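A minimal Python sketch (my illustration, not code from the talk) of evaluating this two-variable SPN bottom-up: each variable contributes a pair of indicators, the product nodes multiply them, and the root sum node takes the weighted sum, reproducing the table above.

def spn_probability(x1, x2):
    # Indicator pairs for X1/X̄1 and X2/X̄2 (1 if the assignment matches, else 0).
    i_x1, i_nx1 = (1, 0) if x1 else (0, 1)
    i_x2, i_nx2 = (1, 0) if x2 else (0, 1)
    # Root sum node over four product nodes, with the weights from the slide.
    return (0.4 * i_x1  * i_x2 +
            0.2 * i_x1  * i_nx2 +
            0.1 * i_nx1 * i_x2 +
            0.3 * i_nx1 * i_nx2)

# Reproduces the table: spn_probability(1, 1) == 0.4, (1, 0) == 0.2,
# (0, 1) == 0.1, (0, 0) == 0.3.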

Slide 8

Slide 8 text

Computing SPNs
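As a hedged sketch of what computing an SPN involves (the standard evaluation rule, not code recovered from the slide): marginals come from the same bottom-up pass by setting both indicators of a marginalized variable to 1, and the partition function is the value obtained with every indicator set to 1.

def spn_value(ind):
    # ind maps (variable, polarity) -> indicator value; reuses the weights above.
    return (0.4 * ind['X1', True]  * ind['X2', True] +
            0.2 * ind['X1', True]  * ind['X2', False] +
            0.1 * ind['X1', False] * ind['X2', True] +
            0.3 * ind['X1', False] * ind['X2', False])

# Partition function: all indicators set to 1 (equals 1.0 for a normalized SPN).
Z = spn_value({('X1', True): 1, ('X1', False): 1,
               ('X2', True): 1, ('X2', False): 1})

# Marginal P(X1 = 1): fix X1's indicators, marginalize X2 by setting both of
# its indicators to 1; gives 0.4 + 0.2 = 0.6.
p_x1 = spn_value({('X1', True): 1, ('X1', False): 0,
                  ('X2', True): 1, ('X2', False): 1})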

Slide 9

Slide 9 text

Mixture model vs. SPN: a flat mixture over N binary variables requires on the order of 2^(N−1) components (N · 2^(N−1) parameters), versus roughly N for an SPN.
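To make the scaling concrete (arithmetic only, assuming the counts above refer to N binary variables): for N = 20, 2^(N−1) = 524,288 mixture components, versus on the order of 20 nodes for a compact SPN over the same variables.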

Slide 10

Slide 10 text

Partition Function https://class.coursera.org/neuralnets-2012-001

Slide 11

Slide 11 text

Image recognition: Google

Slide 12

Slide 12 text

U of Toronto DRBM http://www.cs.toronto.edu/~hinton/science.pdf

Slide 13

Slide 13 text

SPN Architecture http://homes.cs.washington.edu/~pedrod/papers/nips12.pdf

Slide 14

Slide 14 text

Caltech 101 Results (MSE)

Slide 15

Slide 15 text

Caltech 101 image completion (64 × 64 images): panels compare SPN, DBN, Nearest Neighbor, DBM, and PCA completions against the original.

Slide 16

Slide 16 text

Potential Benefits over other Deep Architectures
• Less complex engineering, and exact inference thanks to a tractable partition function (see the identity below).
• An order of magnitude faster in learning and inference.
• Better mean squared error, precision/recall, and qualitative results.
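To spell out the exact-inference point (a standard SPN identity, not wording from the slide): any evidence query is a ratio of two passes over the network,

    P(e) = \frac{S(e)}{S(1, \ldots, 1)},

where S(e) evaluates the SPN with the evidence indicators set and S(1, …, 1) is the partition function obtained by setting every indicator to 1, so inference cost is linear in the size of the network.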