

The Nemhauser-Trotter Reduction and Lifted Message Passing for the Weighted CSP

The presentation slides of the paper "Hong Xu, T. K. Satish Kumar, and Sven Koenig. The Nemhauser-Trotter reduction and lifted message passing for the weighted CSP. In the 14th International Conference on Integration of Artificial Intelligence and Operations Research Techniques in Constraint Programming (CPAIOR), 387–402. 2017. doi:10.1007/978-3-319-59776-8_31."

More details: http://www.hong.me/papers/xu2017.html
Link to the published paper: https://doi.org/10.1007/978-3-319-59776-8_31

Hong Xu

June 08, 2017
Transcript

  1. The Nemhauser-Trotter Reduction and Lifted Message Passing for the Weighted

    CSP Hong Xu T. K. Satish Kumar Sven Koenig [email protected], [email protected], [email protected] June 8, 2017 University of Southern California the 14th International Conference on Integration of Artificial Intelligence and Operations Research Techniques in Constraint Programming (CPAIOR 2017) Padova, Italy
  2. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion 1
  3. Executive Summary Using the Constraint Composite Graph (CCG) of a

    WCSP, • The Nemhauser-Trotter (NT) Reduction, a polynomial-time procedure, can solve about 1/8 of the benchmark instances without search. • The Min-Sum Message Passing (MSMP) algorithm, widely used in the probabilistic reasoning community, produces significantly better solutions on the CCG than on the WCSP's original form. This further bridges the probabilistic reasoning and CP communities. 2
  4. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion
  5. The Weighted Constraint Satisfaction Problem: Motivation Many real-world problems can

    be solved using the WCSP: • RNA motif localization (Zytnicki et al. 2008) • Communication through noisy channels using Error Correcting Codes in Information Theory (Yedidia et al. 2003) • Medical and mechanical diagnostics (Milho et al. 2000; Muscettola et al. 1998) • Energy minimization in Computer Vision (Kolmogorov 2005) • … 3
  6. Weighted Constraint Satisfaction Problem (WCSP) • N variables x =

    {X1, X2, ..., XN}. • Each variable Xi has a discrete-valued domain Di. • M weighted constraints {Es1, Es2, ..., EsM}. • Each constraint Es specifies the weight for each combination of assignments of values to a subset s of the variables. • Find an optimal assignment of values to these variables so as to minimize the total weight: E(x) = Σ_{i=1}^{M} E_{si}(x_{si}). • Known to be NP-hard. 4
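To make the definition on this slide concrete, here is a minimal sketch (not from the paper) of a Boolean WCSP as a list of weighted constraints, with the total weight E(x) = Σ_{i} E_{si}(x_{si}) computed by summing table lookups. The variable names, the helper `total_weight`, and the weights in `constraints` are all illustrative.

```python
# A minimal sketch (not from the paper): a Boolean WCSP as a list of weighted
# constraints.  Each constraint is (scope, table), where `scope` names the
# variables and `table` maps each 0/1 tuple over the scope to a weight.
# The constraint values below are illustrative, not the slide's figure.
constraints = [
    (("X1",), {(0,): 0.7, (1,): 0.2}),                   # a unary constraint E1
    (("X1", "X2"), {(0, 0): 0.3, (0, 1): 0.6,
                    (1, 0): 0.5, (1, 1): 0.7}),          # a binary constraint E12
]

def total_weight(assignment, constraints):
    """E(x): sum the weight each constraint assigns to its part of the assignment."""
    return sum(table[tuple(assignment[v] for v in scope)]
               for scope, table in constraints)

print(total_weight({"X1": 0, "X2": 1}, constraints))  # 0.7 + 0.6 = 1.3
```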
  7. WCSP Example on Boolean Variables

    (Figure: three Boolean variables X1, X2, X3 with unary cost tables E1, E2, E3 and binary cost tables E12, E13, E23.) E(X1, X2, X3) = E1(X1) + E2(X2) + E3(X3) + E12(X1, X2) + E13(X1, X3) + E23(X2, X3) 5
  8. WCSP Example: Evaluate the Assignment X1 = 0, X2 = 0, X3 = 1

    (Figure: the same unary and binary cost tables over X1, X2, X3.) E(X1 = 0, X2 = 0, X3 = 1) = 0.7 + 0.3 + 1.0 + 0.5 + 1.3 + 0.9 = 4.7 (This is not an optimal solution.) 6
  9. WCSP Example: Evaluate the Assignment X1 = 1, X2 = 0, X3 = 0

    (Figure: the same unary and binary cost tables over X1, X2, X3.) E(X1 = 1, X2 = 0, X3 = 0) = 0.2 + 0.3 + 0.1 + 0.7 + 0.6 + 0.7 = 2.6 This is an optimal solution. Using brute force, it requires exponential time to find. 7
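The brute force mentioned on this slide can be sketched as exhaustive enumeration of all 2^N Boolean assignments; `brute_force_wcsp` below is an illustrative helper built on the `total_weight` sketch above, not code from the paper.

```python
# Illustrative brute force: enumerate all 2^N Boolean assignments and keep the
# cheapest one.  `total_weight` is the helper from the previous sketch.
from itertools import product

def brute_force_wcsp(variables, constraints):
    best_x, best_val = None, float("inf")
    for values in product((0, 1), repeat=len(variables)):
        x = dict(zip(variables, values))
        val = total_weight(x, constraints)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

print(brute_force_wcsp(["X1", "X2"], constraints))  # uses the tables defined above
```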
  10. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion
  11. Two Forms of Structure in WCSP

    (Figure: a constraint graph over X1–X4 labeled "Graphical Structure" and a binary cost table over X1 and X2 labeled "Numerical Structure".) • Graphical: Which variables are in which constraints? • Numerical: How does each constraint relate the variables in it? How can we exploit both forms of structure computationally? 8
  12. Minimum Weighted Vertex Cover (MWVC)

    (Figure: a vertex-weighted graph with weights 1, 2, 2, 0, 1, 1 shown four times, (a)–(d), with different candidate vertex covers highlighted.) Each vertex is associated with a non-negative weight. The sum of the weights on the vertices in the vertex cover is minimized. 9
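As a small illustration of the MWVC definition on this slide (the graph and weights below are hypothetical, not the slide's figure): a set of vertices is a cover if it touches every edge, and we look for the cover of minimum total weight.

```python
# A small sketch of the MWVC definition.  The 4-vertex graph below is
# hypothetical, not the slide's figure.
from itertools import combinations

def is_vertex_cover(cover, edges):
    return all(u in cover or v in cover for u, v in edges)

def mwvc_bruteforce(weights, edges):
    vertices = list(weights)
    best, best_w = None, float("inf")
    for r in range(len(vertices) + 1):
        for subset in combinations(vertices, r):
            if is_vertex_cover(set(subset), edges):
                w = sum(weights[v] for v in subset)
                if w < best_w:
                    best, best_w = set(subset), w
    return best, best_w

weights = {"a": 1, "b": 2, "c": 0, "d": 1}
edges = [("a", "b"), ("b", "c"), ("c", "d")]
print(mwvc_bruteforce(weights, edges))  # -> ({'a', 'c'}, 1)
```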
  13. Projection of Minimum Weighted Vertex Cover onto an Independent Set

    (Figure: a vertex-weighted graph over X1–X7 and the table obtained by projecting its MWVC onto an independent set; 1 = necessarily present in the vertex cover, 0 = necessarily absent from the vertex cover.) (Kumar 2008, Fig. 2) 10
  14. Projection of MWVC onto an Independent Set Assuming Boolean variables

    in WCSPs • Observation: The projection of MWVC onto an independent set looks similar to a weighted constraint. • Question 1: Can we build the lifted graphical representation for any given weighted constraint? This is answered by (Kumar 2008). • Question 2: What is the benefit of doing so? 11
  15. Lifted Representations: Example

    (Figure: the example WCSP over X1, X2, X3 with its unary and binary cost tables.) E(X1, X2, X3) = E1(X1) + E2(X2) + E3(X3) + E12(X1, X2) + E13(X1, X3) + E23(X2, X3) 12
  16. Lifted Representations: Example

    (Figure: each unary and binary cost table of the example is replaced by its lifted graphical representation, a small vertex-weighted gadget over the variable vertices X1, X2, X3 and auxiliary vertices A1–A6.) 13
  17. Constraint Composite Graph (CCG)

    (Figure: the constraint composite graph obtained by merging the lifted representations; the variable vertices X1, X2, X3 and the auxiliary vertices A1–A6 carry the accumulated weights.) 14
  18. MWVC on the Constraint Composite Graph (CCG)

    (Figure: the same CCG with a minimum weighted vertex cover highlighted.) An MWVC of the CCG encodes an optimal solution of the original WCSP! 15
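Assuming the deck's convention from the projection slide (a variable vertex that is in the cover is read as value 1, one outside the cover as value 0, and auxiliary vertices are ignored), decoding a WCSP solution from an MWVC of the CCG is a membership test; the function and the example cover below are illustrative.

```python
# Illustrative decoding, assuming the deck's convention that a variable vertex
# in the MWVC is read as value 1 and a variable vertex outside it as value 0;
# auxiliary vertices (A1, A2, ...) are ignored.  The cover below is hypothetical.
def decode_wcsp_solution(mwvc, variable_vertices):
    return {v: int(v in mwvc) for v in variable_vertices}

print(decode_wcsp_solution({"X1", "A3", "A6"}, ["X1", "X2", "X3"]))
# -> {'X1': 1, 'X2': 0, 'X3': 0}
```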
  19. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion
  20. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion
  21. The Nemhauser-Trotter (NT) Reduction

    (Figure: a four-vertex graph with vertices A, B, C, D of weights w1–w4, and the bipartite graph obtained by splitting each vertex v into v and v'; solving the vertex-cover problem on the bipartite graph classifies the original vertices.) A is in the minimum weighted VC. B is not in the minimum weighted VC. C and D are in the kernel. 16
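The deck illustrates the NT reduction with the standard vertex-splitting (bipartite) construction. An equivalent way to compute it, sketched below, is to solve the LP relaxation of MWVC: by the Nemhauser-Trotter theorem, an extreme optimal LP solution is half-integral, vertices at value 1 can be fixed into the cover, vertices at value 0 can be fixed out, and the 1/2-valued vertices form the kernel. This sketch assumes SciPy's HiGHS dual simplex, which returns an extreme-point solution; it is an illustration, not the paper's implementation.

```python
# Sketch of the NT reduction via the LP relaxation of MWVC, an alternative view
# of the vertex-splitting construction on the slide.  Assumes SciPy's dual
# simplex ("highs-ds"), which returns an extreme (hence half-integral) solution.
import numpy as np
from scipy.optimize import linprog

def nt_reduction(weights, edges, vertices):
    idx = {v: i for i, v in enumerate(vertices)}
    c = np.array([weights[v] for v in vertices])
    # One constraint x_u + x_v >= 1 per edge, written as -x_u - x_v <= -1.
    A = np.zeros((len(edges), len(vertices)))
    for k, (u, v) in enumerate(edges):
        A[k, idx[u]] = A[k, idx[v]] = -1.0
    res = linprog(c, A_ub=A, b_ub=-np.ones(len(edges)),
                  bounds=[(0, 1)] * len(vertices), method="highs-ds")
    x = res.x
    in_cover  = [v for v in vertices if x[idx[v]] > 0.75]           # LP value 1
    out_cover = [v for v in vertices if x[idx[v]] < 0.25]           # LP value 0
    kernel    = [v for v in vertices if 0.25 <= x[idx[v]] <= 0.75]  # LP value 1/2
    return in_cover, out_cover, kernel
```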
  22. Experimental Evaluation: Instances • The UAI 2014 Inference Competition: PR

    and MMAP benchmark instances (up to 10 thousand variables and constraints) • Converted to WCSP instances by taking negative logarithms and normalizing (a short sketch follows this slide). • WCSP instances from (Hurley et al. 2016) (up to nearly 1 million variables and millions of constraints) • The Probabilistic Inference Challenge 2011 • The Computer Vision and Pattern Recognition OpenGM2 benchmark • The Weighted Partial MaxSAT Evaluation 2013 • The MaxCSP 2008 Competition • The MiniZinc Challenge 2012 & 2013 • The CFLib (a library of cost function networks) • Only instances in which all variables have binary domains are used. 17
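A hedged sketch of the conversion mentioned above: a probability table from a UAI instance becomes a weight table via negative logarithms, here with one plausible normalization (dividing by the largest entry so the smallest weight is 0). The factor values and the exact normalization are illustrative; the paper's conversion may normalize differently.

```python
# Hedged sketch: probabilities -> weights by taking negative logarithms, so that
# maximizing probability corresponds to minimizing total weight.
import numpy as np

probs = np.array([[0.5, 0.2],
                  [0.1, 0.2]])          # hypothetical factor over two Boolean variables
weights = -np.log(probs / probs.max())  # normalize so the smallest weight is 0
print(weights)
```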
  23. Experimental Evaluation: Results

    (Figure: histogram of "Fraction" (0.0–1.0) versus "Number of Instances" (0–125).) Benchmark instances from the UAI 2014 Inference Competition: 19 out of 160 benchmark instances solved by the NT reduction 18
  24. Experimental Evaluation: Results

    (Figure: histogram of "Fraction" (0.0–1.0) versus "Number of Instances" (0–250).) Benchmark instances from (Hurley et al. 2016): 53 out of 410 benchmark instances solved by the NT reduction 19
  25. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion
  26. Min-Sum Message Passing (MSMP) Algorithms • Min-Sum Message Passing Algorithms

    • are variants of belief propagation • are widely used • have information passed locally between variables and constraints • Original MSMP Algorithm • Perform MSMP on WCSPs directly • Messages are passed between variables and constraints • Lifted MSMP Algorithm • Perform MSMP on the MWVC problem instance of the CCG • Messages are passed between adjacent vertices 20
  27. Operations on Tables: Sum

    Binary table over (X1, X2): (0, 0) = 1, (0, 1) = 2, (1, 0) = 4, (1, 1) = 3. Unary table over X1: (0) = 5, (1) = 6. Their sum: (0, 0) = 1 + 5 = 6, (0, 1) = 2 + 5 = 7, (1, 0) = 4 + 6 = 10, (1, 1) = 3 + 6 = 9. 22
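The "sum" operation on this slide is ordinary table addition with broadcasting; the short numpy sketch below reproduces the slide's numbers.

```python
# The slide's "sum" operation with numpy broadcasting: the unary table over X1
# is added to every X2-column of the binary table over (X1, X2).
import numpy as np

t12 = np.array([[1, 2],    # rows: X1 = 0, 1; columns: X2 = 0, 1
                [4, 3]])
t1 = np.array([5, 6])      # X1 = 0, 1

print(t12 + t1[:, None])   # [[ 6  7]
                           #  [10  9]]
```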
  28. Original MSMP Algorithm: Message Passing for the WCSP (Xu et

    al. 2017, Fig. 1) • A message is a table over the single variable, which is the sender or the receiver. • A vertex of k neighbors 1. applies sum on the messages from its k − 1 neighbors and internal constraint table, and 2. applies min on the summation result and sends the resulting table to its kth neighbor. 23
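A small sketch of one such update, using the C12 table from the example on the following slides: the constraint sums the message from X1 into its table, mins out X1, and sends the result to X2. Normalizing by subtracting the minimum is an assumption on my part, but it is consistent with the message values shown in the example.

```python
# Sketch of a single constraint-to-variable update: sum the incoming variable
# messages into the constraint table, min out the other variable, then normalize.
import numpy as np

C12 = np.array([[2, 3],    # rows: X1 = 0, 1; columns: X2 = 0, 1
                [1, 2]])
msg_X1_to_C12 = np.array([0.0, 0.0])

summed = C12 + msg_X1_to_C12[:, None]   # "sum" step over the k - 1 = 1 incoming message
msg_C12_to_X2 = summed.min(axis=0)      # "min" step: marginalize X1 out
msg_C12_to_X2 -= msg_C12_to_X2.min()    # assumed normalization (smallest entry becomes 0)
print(msg_C12_to_X2)                    # [0. 1.]
```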
  29. Original MSMP Algorithm: Example

    Chain X1 – C12 – X2 – C23 – X3. Constraint tables: (a) C12 over (X1, X2): (0, 0) = 2, (0, 1) = 3, (1, 0) = 1, (1, 1) = 2; (b) C23 over (X2, X3): (0, 0) = 1, (0, 1) = 4, (1, 0) = 2, (1, 1) = 2. All messages are initialized to (0, 0): νX1→C12, νX3→C23, ν̂C12→X2, νX2→C23, ν̂C23→X3, ν̂C23→X2, νX2→C12, ν̂C12→X1. 24
  30. Original MSMP Algorithm: Example

    Update ν̂C12→X2 = (0, 1); all other messages remain (0, 0). 24
  31. Original MSMP Algorithm: Example

    Update νX2→C23 = (0, 1). 24
  32. Original MSMP Algorithm: Example

    Update ν̂C23→X3 = (0, 2). 24
  33. Original MSMP Algorithm: Example

    Update ν̂C23→X2 = (0, 1). 24
  34. Original MSMP Algorithm: Example

    Update νX2→C12 = (0, 1). 24
  35. Original MSMP Algorithm: Example

    Update ν̂C12→X1 = (1, 0). 24
  36. Original MSMP Algorithm: Example

    Final messages: νX1→C12 = (0, 0), νX3→C23 = (0, 0), ν̂C12→X2 = (0, 1), νX2→C23 = (0, 1), ν̂C23→X3 = (0, 2), ν̂C23→X2 = (0, 1), νX2→C12 = (0, 1), ν̂C12→X1 = (1, 0). • X1 = 1 minimizes ν̂C12→X1(X1) • X2 = 0 minimizes ν̂C12→X2(X2) + ν̂C23→X2(X2) • X3 = 0 minimizes ν̂C23→X3(X3) • Optimal solution: X1 = 1, X2 = 0, X3 = 0 24
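The whole example can be replayed in a few lines of numpy. The update order and the zero-initialized, normalized messages below are assumptions chosen to match the values shown on slides 29–35; the final decoding reproduces the optimal solution X1 = 1, X2 = 0, X3 = 0.

```python
# Re-running the chain example X1 - C12 - X2 - C23 - X3 with zero-initialized,
# normalized messages (an assumption consistent with the values on the slides).
import numpy as np

C12 = np.array([[2, 3], [1, 2]])   # rows X1, columns X2
C23 = np.array([[1, 4], [2, 2]])   # rows X2, columns X3

def norm(m):
    return m - m.min()

nu_X1_C12 = np.zeros(2)
nu_X3_C23 = np.zeros(2)

nu_C12_X2 = norm((C12 + nu_X1_C12[:, None]).min(axis=0))   # (0, 1)
nu_X2_C23 = nu_C12_X2                                       # (0, 1)
nu_C23_X3 = norm((C23 + nu_X2_C23[:, None]).min(axis=0))    # (0, 2)
nu_C23_X2 = norm((C23 + nu_X3_C23[None, :]).min(axis=1))    # (0, 1)
nu_X2_C12 = nu_C23_X2                                       # (0, 1)
nu_C12_X1 = norm((C12 + nu_X2_C12[None, :]).min(axis=1))    # (1, 0)

x1 = int(np.argmin(nu_C12_X1))                 # 1
x2 = int(np.argmin(nu_C12_X2 + nu_C23_X2))     # 0
x3 = int(np.argmin(nu_C23_X3))                 # 0
print(x1, x2, x3)                              # 1 0 0
```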
  37. Lifted MSMP Algorithm: Finding an MWVC on the CCG •

    Treat MWVC problems on the CCG as WCSPs and apply the MSMP algorithm to them. • Messages are simplified and passed between adjacent vertices. 25
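A hedged sketch of how the lifted algorithm can view the MWVC instance as a WCSP so the same min-sum updates apply: each vertex gets a unary table (0 if it stays out of the cover, its weight if it goes in), and each edge gets a hard binary constraint forbidding both endpoints from staying out. The encoding below is illustrative, not the paper's implementation.

```python
# Hedged sketch: an MWVC instance encoded as a WCSP over Boolean variables
# ("1" means the vertex is in the cover).
import numpy as np

HARD = float("inf")   # stand-in for an infinite (forbidden) weight

def mwvc_as_wcsp(weights, edges):
    unary = {v: np.array([0.0, w]) for v, w in weights.items()}   # index 1 = in the cover
    binary = {e: np.array([[HARD, 0.0],                           # (0, 0) is forbidden
                           [0.0, 0.0]]) for e in edges}
    return unary, binary
```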
  38. Experimental Evaluation: Setup • Use the same benchmark instances as

    before. • Solutions are reported if the MSMP algorithms do not terminate in 5 min. • Optimal solutions are computed using toulbar2 (Hurley et al. 2016) or integer linear programming. • Experiments were performed on a GNU/Linux workstation with an Intel Xeon processor E3-1240 v3 (8 MB cache, 3.4 GHz) and 16 GB RAM. 26
  39. Experimental Evaluation: Results – Solution Quality

    (Figure: scatter plots of the lifted MSMP solution quality versus the original MSMP solution quality, both on logarithmic scales.) (a) Benchmark instances from the UAI 2014 Inference Competition: 126/9/18 above/below/close to the diagonal dashed line. (b) Benchmark instances from (Hurley et al. 2016): 222/68/19 above/below/close to the diagonal dashed line. 27
  40. Experimental Evaluation: Results – Solution Quality

    (Figure: histogram of the number of instances by relative gap (MSMP solution - optimal solution) / optimal solution, binned as 0, < 10%, ≥ 10% and < 20%, ≥ 20% and < 30%, > 30%, for the lifted and the original MSMP algorithms.) UAI 2014 Inference Competition: compare qualities of solutions with the optimal solutions. 28
  41. Experimental Evaluation: Results – Solution Quality

    (Figure: the same histogram of relative gaps for the lifted and the original MSMP algorithms.) Benchmark instances from (Hurley et al. 2016): compare qualities of solutions with the optimal solutions. 29
  42. Experimental Evaluation: Results – Convergence

    Benchmark Instance Set | Neither | Both | Original | Lifted
    UAI 2014 Inference Competition | 25 | 4 | 124 | 0
    (Hurley et al. 2016) | 258 | 7 | 44 | 0
    (Xu et al. 2017, Tab. 1) • Neither: Neither of the MSMP algorithms terminates in 5 min. • Both: Both of the MSMP algorithms terminate in 5 min. • Original: Only the original MSMP algorithm terminates in 5 min. • Lifted: Only the lifted MSMP algorithm terminates in 5 min. 30
  43. Agenda The Weighted Constraint Satisfaction Problem (WCSP) The Constraint Composite

    Graph (CCG) Computational Techniques Facilitated by the CCG The Nemhauser-Trotter (NT) Reduction Min-Sum Message Passing (MSMP) Conclusion
  44. Conclusion • NT reduction on the CCG is effective for

    many benchmark instances. • The NT reduction could determine the optimal values of all variables for about 1/8 of the benchmark instances without search. • We revived the MSMP algorithm for solving the WCSP by applying it on its CCG instead of its original form. • The lifted MSMP algorithm produced solutions that are significantly better than the original MSMP algorithm in general. • The lifted MSMP algorithm produced solutions that are close to optimal for a large fraction of benchmark instances. • However, the lifted MSMP algorithm is less advantageous in terms of convergence. • (Future work) Both MSMP algorithms can be easily adapted to distributed settings. 31
  45. References I Barry Hurley, Barry O'Sullivan, David Allouche, George Katsirelos,

    Thomas Schiex, Matthias Zytnicki, and Simon de Givry. "Multi-language evaluation of exact solvers in graphical model discrete optimization". In: Constraints 21.3 (2016), pp. 413–434. Vladimir Kolmogorov. Primal-dual Algorithm for Convex Markov Random Fields. Tech. rep. MSR-TR-2005-117. Microsoft Research, 2005. T. K. Satish Kumar. "A framework for hybrid tractability results in Boolean weighted constraint satisfaction problems". In: the International Conference on Principles and Practice of Constraint Programming. Springer, 2008, pp. 282–297. Isabel Milho, Ana Fred, Jorge Albano, Nuno Baptista, and Paulo Sena. "An Auxiliary System for Medical Diagnosis Based on Bayesian Belief Networks". In: Portuguese Conference on Pattern Recognition. 2000. Nicola Muscettola, P. Pandurang Nayak, Barney Pell, and Brian C. Williams. "Remote Agent: to boldly go where no AI system has gone before". In: Artificial Intelligence 103.1–2 (1998), pp. 5–47.
  46. References II Hong Xu, T. K. Satish Kumar, and Sven

    Koenig. "The Nemhauser-Trotter Reduction and Lifted Message Passing for the Weighted CSP". In: the 14th International Conference on Integration of Artificial Intelligence and Operations Research Techniques in Constraint Programming (CPAIOR). 2017. Jonathan S. Yedidia, William T. Freeman, and Yair Weiss. "Understanding belief propagation and its generalizations". In: Exploring Artificial Intelligence in the New Millennium 8 (2003), pp. 236–239. Matthias Zytnicki, Christine Gaspin, and Thomas Schiex. "DARN! A Weighted Constraint Solver for RNA Motif Localization". In: Constraints 13.1 (2008), pp. 91–109.