A New Approximation Guarantee for Monotone Submodular Function Maximization via Discrete Convexity

ICALP 2018

Tasuku Soma

July 09, 2019

Transcript

  1. A New Approximation Guarantee for Monotone Submodular Function Maximization via
     Discrete Convexity
     Tasuku Soma (UTokyo), joint work with Yuichi Yoshida (NII)
     ICALP 2018 @ Prague
  2. Monotone submodular maximization
     f : 2^V → R+: monotone submod, f(∅) = 0
     max f(X) sub. to |X| ≤ k
     Previous work
     • NP-hard
     • (1 − 1/e)-approx by greedy [Nemhauser–Wolsey '78]
     • (1 − 1/e)-approx is tight [Feige '98]
     • many applications in ML and marketing
     • extensions to matroid, knapsack, etc.
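
For reference, a minimal sketch of the greedy rule behind the (1 − 1/e) bound above, assuming only value-oracle access to f; the function name and interface are illustrative, not from the slides.

```python
def greedy_max(f, V, k):
    """Classical greedy for max f(X) s.t. |X| <= k.

    f: value oracle for a monotone submodular set function (takes a frozenset),
    V: iterable of elements, k: cardinality bound.
    This is the (1 - 1/e)-approximation rule of [Nemhauser-Wolsey '78].
    """
    X = frozenset()
    for _ in range(k):
        best_i, best_gain = None, 0.0
        for i in set(V) - X:
            gain = f(X | {i}) - f(X)     # marginal gain f(i | X)
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_i is None:               # no element with positive gain left
            break
        X = X | {best_i}
    return X
```
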
  3. Curvature
     curvature [Conforti–Cornuéjols '84]
     c = 1 − min_{i∈V} f(i | V − i) / f(i),  where f(i | V − i) = f(V) − f(V − i)
     Note: c = 0 ⇐⇒ f is linear
  4. Curvature
     curvature [Conforti–Cornuéjols '84]
     c = 1 − min_{i∈V} f(i | V − i) / f(i),  where f(i | V − i) = f(V) − f(V − i)
     Note: c = 0 ⇐⇒ f is linear
     Previous work
     • (1 − e^{−c})/c-approx [Conforti–Cornuéjols '84]
     • (1 − c/e)-approx [Sviridenko–Vondrák–Ward]
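
The curvature defined above can be computed directly from a value oracle on small instances; a minimal sketch (the helper name and interface are illustrative):

```python
def curvature(f, V):
    """Curvature c = 1 - min_{i in V} f(i | V - i) / f(i).

    f: value oracle taking a frozenset; V: ground set (iterable).
    Assumes f({i}) > 0 for every i.
    """
    V = frozenset(V)
    fV = f(V)
    # f(i | V - i) = f(V) - f(V - i), compared against the singleton value f(i)
    ratios = [(fV - f(V - {i})) / f(frozenset({i})) for i in V]
    return 1.0 - min(ratios)
```
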
  5. curvature is unsatisfactory?
     e.g. f(X) = √|X| ... c = 1 − O(1/√n)
     e.g. f(X) = min{|X|, n − 1} ... c = 1
  6. curvature is unsatisfactory?
     e.g. f(X) = √|X| ... c = 1 − O(1/√n)
     e.g. f(X) = min{|X|, n − 1} ... c = 1
     curvature predicts only a (1 − 1/e)-approx, but greedy finds an optimal solution!
     curvature is pessimistic
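
Using the illustrative `curvature` helper sketched above, the two examples are easy to reproduce numerically (the function names here are made up for the illustration):

```python
import math

n = 100
V = range(n)

def sqrt_card(X):            # f(X) = sqrt(|X|)
    return math.sqrt(len(X))

def capped_card(X):          # f(X) = min(|X|, n - 1)
    return min(len(X), n - 1)

print(curvature(sqrt_card, V))    # about 0.95, yet greedy is optimal here
print(curvature(capped_card, V))  # exactly 1.0, yet greedy is optimal here
```
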
  7. M♮-concave function [Murota–Shioura]
     f : 2^V → R is M♮-concave
     def ⇐⇒ ∀X, Y ⊆ V, i ∈ X − Y, either
       f(X) + f(Y) ≤ f(X − i) + f(Y + i), or
       ∃j ∈ Y − X s.t. f(X) + f(Y) ≤ f(X − i + j) + f(Y + i − j)
  8. M♮-concave function [Murota–Shioura]
     f : 2^V → R is M♮-concave
     def ⇐⇒ ∀X, Y ⊆ V, i ∈ X − Y, either
       f(X) + f(Y) ≤ f(X − i) + f(Y + i), or
       ∃j ∈ Y − X s.t. f(X) + f(Y) ≤ f(X − i + j) + f(Y + i − j)
     • Tractable subclass of submod func
     • Greedy finds a maximizer
     • (weighted) matroid rank func, gross substitutes, etc.
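
To make the exchange axiom concrete, here is a brute-force checker of the property above; it is exponential in |V| and purely illustrative (the function name is made up):

```python
from itertools import combinations

def is_m_natural_concave(f, V):
    """Brute-force test of the M-natural exchange property:
    for all X, Y and i in X - Y, either
      f(X) + f(Y) <= f(X - i) + f(Y + i), or
      some j in Y - X satisfies f(X) + f(Y) <= f(X - i + j) + f(Y + i - j).
    Exponential in |V|; only meant to illustrate the definition."""
    V = frozenset(V)
    subsets = [frozenset(s) for r in range(len(V) + 1)
               for s in combinations(V, r)]
    for X in subsets:
        for Y in subsets:
            for i in X - Y:
                if f(X) + f(Y) <= f(X - {i}) + f(Y | {i}):
                    continue
                if any(f(X) + f(Y) <= f((X - {i}) | {j}) + f((Y | {i}) - {j})
                       for j in Y - X):
                    continue
                return False
    return True
```
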
  9. Our result
     • Define a new quantity γ for closeness to M♮-concave func
     • γ ≤ c (always better than curvature)
     • (under some cond) (1 − γ/e)-approx algorithm
  10. curvature and decomposition
     ℓ(X) := Σ_{i∈X} f(i | V − i)
     =⇒ f = g + ℓ (g: monotone nonnegative submod)
     g: hard part, ℓ: easy part
  11. curvature and decomposition
     ℓ(X) := Σ_{i∈X} f(i | V − i)
     =⇒ f = g + ℓ (g: monotone nonnegative submod)
     g: hard part, ℓ: easy part
     c: curvature =⇒ 1 − c ≤ ℓ(X)/f(X) ≤ 1 for all X
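
This particular decomposition is easy to write down from a value oracle for f; a small sketch (the helper name is made up):

```python
def svw_decomposition(f, V):
    """Split f into f = g + ell with ell(X) = sum_{i in X} f(i | V - i).

    ell is nonnegative linear (hence M-natural-concave) and g = f - ell
    stays monotone nonnegative submodular when f is; this is the
    decomposition behind the curvature-based (1 - c/e) guarantee.
    """
    V = frozenset(V)
    top_marginal = {i: f(V) - f(V - {i}) for i in V}   # f(i | V - i)

    def ell(X):
        return sum(top_marginal[i] for i in X)

    def g(X):
        return f(frozenset(X)) - ell(X)

    return g, ell
```
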
  12. The M♮-concave curvature
     Definition: Given a decomposition f = g + h, where
       g: monotone nonnegative submod (hard part),
       h: nonnegative M♮-concave (easy part),
     γ(g, h) := 1 − min_{X⊆V} h(X)/f(X)
  13. The M♮-concave curvature
     Definition: Given a decomposition f = g + h, where
       g: monotone nonnegative submod (hard part),
       h: nonnegative M♮-concave (easy part),
     γ(g, h) := 1 − min_{X⊆V} h(X)/f(X)
     Remarks
     • γ(g, h) depends on the decomposition
     • γ(g, h) may be difficult to compute
     • No obvious way to find a nontrivial decomposition
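
On small ground sets, the quantity γ(g, h) defined above can be evaluated by brute force; an illustrative (exponential-time) sketch:

```python
from itertools import combinations

def gamma(f, h, V):
    """gamma(g, h) = 1 - min_X h(X) / f(X) for the decomposition f = g + h.

    f, h: value oracles taking frozensets.  The minimum is taken over all
    X with f(X) > 0.  Exponential in |V|; for small examples only.
    """
    V = frozenset(V)
    best_ratio = 1.0
    for r in range(1, len(V) + 1):
        for S in combinations(V, r):
            X = frozenset(S)
            if f(X) > 0:
                best_ratio = min(best_ratio, h(X) / f(X))
    return 1.0 - best_ratio
```
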
  14. Algorithm
     Theorem: g: nonnegative monotone submod, h: nonnegative M♮-concave, f = g + h.
     Then, ∃ polytime alg s.t. for any ε > 0, it finds a (1 − γ/e − ε)-approx solution.
     Remarks
     • Generalization of the (1 − c/e)-approx of [Sviridenko–Vondrák–Ward],
       where h(X) = Σ_{i∈X} f(i | V − i)
     • continuous greedy + ellipsoid
     • requires value oracles for g, h
  15. Algorithm overview
     original:  max f(X) : |X| ≤ k
       ↓ relaxation
     cont relax:  max {F(x) : x ∈ P(k)}
       ↓ improved cont greedy [SVW]
     x ∈ P(k) s.t. G(x) ≥ (1 − 1/e)g(O), h̄(x) ≥ h(O)   (O: opt)
  16. Algorithm overview
     original:  max f(X) : |X| ≤ k
       ↓ relaxation
     cont relax:  max {F(x) : x ∈ P(k)}
       ↓ improved cont greedy [SVW]
     x ∈ P(k) s.t. G(x) ≥ (1 − 1/e)g(O), h̄(x) ≥ h(O)   (O: opt)
       ↓ rounding
     X : |X| ≤ k s.t. g(X) ≥ (1 − 1/e)g(O), h(X) ≥ h(O)
  17. Two extensions of set functions
     f = g + h
     hard part: use the multilinear extension G(x) = E_{X∼D(x)}[g(X)]
     • g: submod =⇒ G: concave along nonnegative directions
  18. Two extensions of set functions
     f = g + h
     hard part: use the multilinear extension G(x) = E_{X∼D(x)}[g(X)]
     • g: submod =⇒ G: concave along nonnegative directions
     easy part: use the concave closure h̄(x) = max_{µ: E_{X∼µ}[1_X]=x} E_{X∼µ}[h(X)]
     • h̄ is polyhedral concave
     • h: M♮-concave =⇒ h̄ is polytime computable, with a polytime separation
       oracle for h̄(x) ≥ µ [Shioura]
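
G(x) is defined as an expectation, so in practice it is usually estimated by sampling from D(x). A minimal Monte Carlo sketch, assuming x is given as a dict of inclusion probabilities (the estimator is generic, not something specific to this paper):

```python
import random

def multilinear_estimate(g, x, num_samples=1000, rng=random):
    """Monte Carlo estimate of the multilinear extension
    G(x) = E_{X ~ D(x)}[g(X)], where D(x) includes each element i
    independently with probability x[i].

    g: value oracle taking a frozenset; x: dict mapping element -> probability.
    """
    total = 0.0
    for _ in range(num_samples):
        X = frozenset(i for i, p in x.items() if rng.random() < p)
        total += g(X)
    return total / num_samples
```
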
  19. Algorithm (cont. form)
     Guess α = g(O), β = h(O).
     dx(t)/dt = v(t),  x(0) = 0,
     where v(t) is a solution of
       v⊤∇G(x(t)) ≥ α,  h̄(v) ≥ β,  Σ_{i∈V} v(i) ≤ k
  20. Algorithm (cont. form)
     Guess α = g(O), β = h(O).
     dx(t)/dt = v(t),  x(0) = 0,
     where v(t) is a solution of
       v⊤∇G(x(t)) ≥ α,  h̄(v) ≥ β,  Σ_{i∈V} v(i) ≤ k
     Remark
     • v must exist (v = 1_O is always a solution)
     • Can solve in polytime (we have a separation oracle for h̄(v) ≥ β)
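
A heavily simplified, discretized sketch of this continuous-greedy loop. Everything here is an illustrative assumption rather than the paper's code: `grad_G` stands for a gradient estimate of G (e.g. by sampling), and `find_direction` stands in for the direction-finding step that the paper solves with the ellipsoid method via the separation oracle for h̄, with the guesses α and β assumed to live inside that callback.

```python
def continuous_greedy(grad_G, find_direction, V, num_steps=100):
    """Discretization of dx(t)/dt = v(t) with x(0) = 0 over t in [0, 1].

    grad_G(x): returns an estimate of the gradient of the multilinear
        extension G at x, as a dict element -> partial derivative.
    find_direction(grad, x): returns a direction v in [0, 1]^V with
        sum_i v[i] <= k, intended to satisfy v . grad >= alpha and
        bar_h(v) >= beta (alpha, beta and the LP/ellipsoid machinery
        are assumed to be handled inside this callback).
    """
    x = {i: 0.0 for i in V}
    dt = 1.0 / num_steps
    for _ in range(num_steps):
        v = find_direction(grad_G(x), x)
        for i in V:
            x[i] = min(1.0, x[i] + dt * v[i])   # follow v, stay in [0, 1]^V
    return x
```
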
  21. Algorithm overview
     original:  max f(X) : |X| ≤ k
       ↓ relaxation
     cont relax:  max {F(x) : x ∈ P(k)}
       ↓ improved cont greedy [SVW]
     x ∈ P(k) s.t. G(x) ≥ (1 − 1/e)g(O), h̄(x) ≥ h(O)   (O: opt)
       ↓ rounding
     X : |X| ≤ k s.t. g(X) ≥ (1 − 1/e)g(O), h(X) ≥ h(O)
  22. Rounding algorithm
     Goal: Round x ∈ [0, 1]^V to X ⊆ V (|X| ≤ k) s.t.
     • g(X) ≥ G(x) (preserving the multilinear extension)
     • h(X) ≥ h̄(x) (preserving the concave extension)
  23. Rounding algorithm
     Goal: Round x ∈ [0, 1]^V to X ⊆ V (|X| ≤ k) s.t.
     • g(X) ≥ G(x) (preserving the multilinear extension)
     • h(X) ≥ h̄(x) (preserving the concave extension)
     Extend swap rounding [Chekuri–Vondrák–Zenklusen] to the M♮-concave constraint:
     1: Find a convex decomposition h̄(x) = Σ_{i=1}^n λ_i h(Y_i) (|Y_i| = k)
     2: Apply "swap operations" to Y_1, ..., Y_n and obtain X
  24. Proof of main theorem
     We obtain X s.t. |X| ≤ k, E[g(X)] ≥ (1 − 1/e)g(O), h(X) ≥ h(O) by rounding.
     E[f(X)] ≥ (1 − 1/e)g(O) + h(O)
             = (1 − 1/e)f(O) + h(O)/e              (using g(O) = f(O) − h(O))
             ≥ (1 − 1/e)f(O) + (1 − γ)/e · f(O)    (using h(O) ≥ (1 − γ)f(O))
             = (1 − γ/e)f(O).
  25. Proof of main theorem
     We obtain X s.t. |X| ≤ k, E[g(X)] ≥ (1 − 1/e)g(O), h(X) ≥ h(O) by rounding.
     E[f(X)] ≥ (1 − 1/e)g(O) + h(O)
             = (1 − 1/e)f(O) + h(O)/e              (using g(O) = f(O) − h(O))
             ≥ (1 − 1/e)f(O) + (1 − γ)/e · f(O)    (using h(O) ≥ (1 − γ)f(O))
             = (1 − γ/e)f(O).
     Note
     • the cont greedy part applies to solvable + down-closed polytopes as well
     • h can be nonmonotone
  26. Other results
     Require a nontrivial decomposition f = g + h to beat the curvature guarantee.
     Unfortunately, we cannot say much in the general case ...
     • Some heuristics for finding a decomposition of a quadratic submod func
       f(X) = 1_X⊤ A 1_X + b⊤ 1_X using ultrametric fitting.
     • Some examples with nontrivial decompositions.
  27. Our result
     • Define a new quantity γ for closeness to M♮-concave func
     • γ ≤ c (always better than curvature)
     • (under some cond) (1 − γ/e)-approx algorithm
     Future work
     • Matroid constraint
     • Interesting applications in which M♮-concavity significantly improves the approximation
  28. Lemma
     x := x(1) satisfies G(x) ≥ (1 − 1/e)g(O), h̄(x) ≥ h(O).
  29. Lemma
     x := x(1) satisfies G(x) ≥ (1 − 1/e)g(O), h̄(x) ≥ h(O).
     Proof.
     Analysis of cont greedy: G(x) ≥ (1 − 1/e)α = (1 − 1/e)g(O).
     Jensen's inequality: h̄(x(1)) = h̄(∫_0^1 v(t) dt) ≥ ∫_0^1 h̄(v(t)) dt ≥ β = h(O).
  30. Swap operation
     Let h̄(x) = λ_1 h(Y_1) + λ_2 h(Y_2). For i ∈ Y_1 − Y_2, ∃j s.t.
     h(Y_1) + h(Y_2) ≤ h(Y_1 − i + j) + h(Y_2 + i − j)
  31. Swap operation
     Let h̄(x) = λ_1 h(Y_1) + λ_2 h(Y_2). For i ∈ Y_1 − Y_2, ∃j s.t.
     h(Y_1) + h(Y_2) ≤ h(Y_1 − i + j) + h(Y_2 + i − j)
     p = λ_1/(λ_1 + λ_2)
     W.p. p:     Y_2 ← Y_2 + i − j
     W.p. 1 − p: Y_1 ← Y_1 − i + j
  32. Swap operation
     Let h̄(x) = λ_1 h(Y_1) + λ_2 h(Y_2). For i ∈ Y_1 − Y_2, ∃j s.t.
     h(Y_1) + h(Y_2) ≤ h(Y_1 − i + j) + h(Y_2 + i − j)
     p = λ_1/(λ_1 + λ_2)
     W.p. p:     Y_2 ← Y_2 + i − j
     W.p. 1 − p: Y_1 ← Y_1 − i + j
     h̄ preserved:
     E[h̄(x′)] − h̄(x) = λ_1 λ_2/(λ_1 + λ_2) · [h(Y_1 − i + j) − h(Y_1) + h(Y_2 + i − j) − h(Y_2)] ≥ 0.
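
A toy sketch of one such swap step between two weighted sets, assuming h is an M♮-concave value oracle and |Y_1| = |Y_2|; the partner j is chosen greedily here, which is one valid way to realize the exchange guaranteed by M♮-concavity (names and interface are illustrative, not from the paper):

```python
import random

def swap_step(h, Y1, lam1, Y2, lam2, rng=random):
    """One swap operation on two equal-size frozensets Y1, Y2 with weights
    lam1, lam2.  Picks i in Y1 - Y2, chooses an exchange partner j in
    Y2 - Y1, and applies one of the two random updates so that
    lam1*1_{Y1} + lam2*1_{Y2} is preserved in expectation.
    Returns the updated (Y1, Y2)."""
    diff = Y1 - Y2
    if not diff:
        return Y1, Y2                        # already identical
    i = next(iter(diff))
    # greedy choice of j; when h is M-natural-concave, some j makes
    # h(Y1 - i + j) + h(Y2 + i - j) >= h(Y1) + h(Y2), so the max does too
    j = max(Y2 - Y1,
            key=lambda cand: h((Y1 - {i}) | {cand}) + h((Y2 | {i}) - {cand}))
    p = lam1 / (lam1 + lam2)
    if rng.random() < p:
        Y2 = (Y2 | {i}) - {j}                # w.p. p:      Y2 <- Y2 + i - j
    else:
        Y1 = (Y1 - {i}) | {j}                # w.p. 1 - p:  Y1 <- Y1 - i + j
    return Y1, Y2
```
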
  33. Swap rounding
     G is also preserved!
     z_t = Σ_i λ_i 1_{Y_i}: vector after t swaps (t = 0, 1, ...).
     z_0 = x, z_{t+1} − z_t has at most 1 positive coordinate and at most 1 negative,
     E[z_{t+1} | z_t] = z_t.
  34. Swap rounding
     G is also preserved!
     z_t = Σ_i λ_i 1_{Y_i}: vector after t swaps (t = 0, 1, ...).
     z_0 = x, z_{t+1} − z_t has at most 1 positive coordinate and at most 1 negative,
     E[z_{t+1} | z_t] = z_t.
     Lemma ([Chekuri–Vondrák–Zenklusen])
     z_t: vector random process satisfying the above conditions,
     G: multilinear extension of a monotone submod func
     =⇒ E[G(z_t)] ≥ G(x) (t = 0, 1, ...).
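
Building on the hypothetical `swap_step` sketched above, the overall rounding loop could look like the following: merge the terms of the convex decomposition pairwise until a single size-k set remains. This is only an illustration of the idea, not the paper's implementation.

```python
def swap_rounding(h, decomposition):
    """Toy swap rounding.  decomposition: list of (lambda_i, Y_i) pairs
    with equal-size frozensets Y_i and weights summing to 1 (the convex
    decomposition attaining bar_h(x)).

    Repeatedly merges the first two terms with swap operations until a
    single set remains; that set is returned as the rounded solution X.
    """
    terms = list(decomposition)
    while len(terms) > 1:
        (lam1, Y1), (lam2, Y2) = terms[0], terms[1]
        while Y1 != Y2:                      # each swap shrinks the symmetric difference
            Y1, Y2 = swap_step(h, Y1, lam1, Y2, lam2)
        terms = [(lam1 + lam2, Y1)] + terms[2:]
    return terms[0][1]
```
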