Jean-Christophe Pesquet

(CVN, CentraleSupélec)

https://s3-seminar.github.io/seminars/jean-christophe-pesquet

Title — Stochastic proximal algorithms with applications to online image recovery

Abstract — Stochastic approximation techniques have been used in various contexts in machine learning and adaptive filtering. We investigate the asymptotic behavior of a stochastic version of the forward-backward splitting algorithm for finding a zero of the sum of a maximally monotone set-valued operator and a cocoercive operator in a Hilbert space. In our general setting, stochastic approximations of the cocoercive operator and perturbations in the evaluation of the resolvents of the set-valued operator are possible. In addition, relaxations and not necessarily vanishing proximal parameters are allowed. Weak almost sure convergence properties of the iterates are established under mild conditions on the underlying stochastic processes. Leveraging these results, we propose a stochastic version of a popular primal-dual proximal optimization algorithm and establish its convergence. Finally, we demonstrate the relevance of these results on an online image restoration problem.

S³ Seminar

March 24, 2017

Transcript

  1. Introduction Stochastic Forward-Backward Monotone Inclusion Problems Primal-Dual Extension Application Conclusion

1/24 Stochastic Proximal Algorithms with Applications to Online Image Recovery
Patrick Louis Combettes (1) and Jean-Christophe Pesquet (2)
(1) Mathematics Department, North Carolina State University, Raleigh, USA
(2) Center for Visual Computing, CentraleSupélec, University Paris-Saclay, Grande Voie des Vignes, 92295 Châtenay-Malabry, France
S³ Seminar, 24 March 2017

2/24 Outline
1. Introduction
2. Stochastic Forward-Backward
3. Monotone Inclusion Problems
4. Primal-Dual Extension
5. Application
6. Conclusion

3/24 Context
Need for fast optimization methods over the last decade. Why?
• Interest in nonsmooth cost functions (sparsity)
• Need for optimal processing of massive datasets (big data): large number of variables (inverse problems), large number of observations (machine learning)
• Use of more sophisticated data structures (graph signal processing)

4/24 Variational formulation
GOAL: minimize over x ∈ H the sum f(x) + h(x), where
• H: signal space (real Hilbert space)
• f ∈ Γ0(H): class of convex lower-semicontinuous functions from H to ]−∞, +∞] with a nonempty domain
• h: H → R: differentiable convex function such that ∇h is ϑ⁻¹-Lipschitz continuous with ϑ ∈ ]0, +∞[
• F = Argmin(f + h), assumed to be nonempty.

5/24 Algorithm
CLASSICAL SOLUTION [Combettes and Wajs, 2005]: the FORWARD-BACKWARD ALGORITHM
(∀n ∈ N) x_{n+1} = x_n + λ_n ( prox_{γ_n f}(x_n − γ_n ∇h(x_n)) − x_n ),
where λ_n ∈ ]0, 1], γ_n ∈ ]0, 2ϑ[, and prox_{γ_n f} is the proximity operator of γ_n f [Moreau, 1965]:
prox_{γ_n f}: x ↦ argmin_{y ∈ H} f(y) + (1/(2γ_n)) ‖x − y‖².
SPECIAL CASES: projected gradient method, iterative soft thresholding, Landweber algorithm, ...
In the context of online processing and machine learning, what can be done if ∇h and f are not known exactly?
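
As a concrete instance of the forward-backward iteration above, here is a minimal NumPy sketch for f = μ‖·‖₁ (so prox_{γf} is soft thresholding) and h(x) = ½‖Ax − b‖² (so ∇h is Lipschitz with constant ‖AᵀA‖, i.e. ϑ = 1/‖AᵀA‖). The matrix A, data b, and parameter values are illustrative, not from the talk.

```python
import numpy as np

def soft_threshold(x, t):
    # prox of t * ||.||_1 (componentwise soft thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, mu, n_iter=500, lam=1.0):
    # Minimize f(x) + h(x) with f = mu * ||x||_1 and h(x) = 0.5 * ||Ax - b||^2.
    # grad h is Lipschitz with constant L = ||A^T A||, so theta = 1/L and the
    # proximal parameter gamma must lie in ]0, 2/L[.
    L = np.linalg.norm(A.T @ A, 2)
    gamma = 1.0 / L                 # safe constant step size in ]0, 2/L[
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # forward (gradient) step
        y = soft_threshold(x - gamma * grad, gamma * mu)  # backward (prox) step
        x = x + lam * (y - x)                             # relaxation lam in ]0, 1]
    return x
```

With μ = 0 this reduces to the gradient (Landweber) method; with f an indicator function it reduces to the projected gradient method, matching the special cases listed above.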

6/24 Proposed Solution
STOCHASTIC FB ALGORITHM:
(∀n ∈ N) x_{n+1} = x_n + λ_n ( prox_{γ_n f_n}(x_n − γ_n u_n) + a_n − x_n ),
where
• λ_n ∈ ]0, 1] and γ_n ∈ ]0, 2ϑ[
• f_n ∈ Γ0(H): approximation to f
• u_n, a second-order random variable: approximation to ∇h(x_n)
• a_n, a second-order random variable: possible additional error term.
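
A minimal sketch of this stochastic FB iteration on the same ℓ₁-regularized least-squares instance: the exact gradient is replaced by an estimate u_n built from a growing random sample of rows (so its variance vanishes, in the spirit of the assumptions that follow), while the proximal parameter γ stays constant and the relaxation λ_n decreases slowly. The batch-growth rule and the λ_n schedule are illustrative choices, not prescriptions from the talk.

```python
import numpy as np

def stochastic_forward_backward(A, b, mu, n_iter=3000, seed=0):
    # Stochastic FB for f = mu * ||x||_1 and h(x) = (1/(2m)) * ||Ax - b||^2.
    # u_n is a minibatch gradient estimate over a growing sample of rows;
    # gamma is a NON-vanishing proximal parameter; lam_n mimics the talk's
    # relaxation schedule lam_n = (1 + (n/500)^0.95)^(-1).
    rng = np.random.default_rng(seed)
    m, d = A.shape
    L = np.linalg.norm(A.T @ A, 2) / m       # Lipschitz constant of grad h
    gamma = 1.0 / L                          # constant step in ]0, 2/L[
    x = np.zeros(d)
    for n in range(n_iter):
        k = min(m, 1 + n // 10)              # growing batch => vanishing variance
        idx = rng.choice(m, size=k, replace=False)
        u = A[idx].T @ (A[idx] @ x - b[idx]) / k      # u_n ~ grad h(x_n)
        lam = (1.0 + (n / 500.0) ** 0.95) ** -1       # lam_n in ]0, 1]
        step = x - gamma * u
        y = np.sign(step) * np.maximum(np.abs(step) - gamma * mu, 0.0)
        x = x + lam * (y - x)                # relaxed stochastic FB update
    return x
```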

7/24 Assumptions
Let X = (X_n)_{n∈N} be a sequence of sigma-algebras such that
(∀n ∈ N) σ(x_0, ..., x_n) ⊂ X_n ⊂ X_{n+1},
where σ(x_0, ..., x_n) is the smallest σ-algebra generated by x_0, ..., x_n.
ℓ_+(X): set of sequences of [0, +∞[-valued random variables (ξ_n)_{n∈N} such that, for every n ∈ N, ξ_n is X_n-measurable, and
ℓ¹_+(X) = { (ξ_n)_{n∈N} ∈ ℓ_+(X) | Σ_{n∈N} ξ_n < +∞ P-a.s. },
ℓ^∞_+(X) = { (ξ_n)_{n∈N} ∈ ℓ_+(X) | sup_{n∈N} ξ_n < +∞ P-a.s. }.
Assumptions on the gradient approximation:
• Σ_{n∈N} √λ_n ‖E(u_n | X_n) − ∇h(x_n)‖ < +∞.
• For every z ∈ F, there exist sequences (τ_n)_{n∈N} ∈ ℓ_+ and (ζ_n(z))_{n∈N} ∈ ℓ^∞_+(X) such that Σ_{n∈N} λ_n ζ_n(z) < +∞ and
(∀n ∈ N) E(‖u_n − E(u_n | X_n)‖² | X_n) ≤ τ_n ‖∇h(x_n) − ∇h(z)‖² + ζ_n(z).
Assumptions on the prox approximation:
• There exist sequences (α_n)_{n∈N} and (β_n)_{n∈N} in [0, +∞[ such that Σ_{n∈N} √λ_n α_n < +∞, Σ_{n∈N} λ_n β_n < +∞, and
(∀n ∈ N)(∀x ∈ H) ‖prox_{γ_n f_n} x − prox_{γ_n f} x‖ ≤ α_n ‖x‖ + β_n.
• Σ_{n∈N} λ_n E(‖a_n‖² | X_n) < +∞.
Assumptions on the algorithm parameters:
• inf_{n∈N} γ_n > 0, sup_{n∈N} τ_n < +∞, and sup_{n∈N} (1 + τ_n) γ_n < 2ϑ.
• Either inf_{n∈N} λ_n > 0, or (γ_n ≡ γ, Σ_{n∈N} τ_n < +∞, and Σ_{n∈N} λ_n = +∞).

8/24 Convergence Result
Under the previous assumptions, the sequence (x_n)_{n∈N} generated by the algorithm converges weakly a.s. to an F-valued random variable.
REMARKS:
• Related works: [Rosasco et al., 2014; Atchadé et al., 2016].
• Result valid for nonvanishing step sizes (γ_n)_{n∈N}.
• We do not need to assume that (∀n ∈ N) E(u_n | X_n) = ∇h(x_n).
• Proof based on properties of stochastic quasi-Fejér sequences [Combettes and Pesquet, 2015, 2016].

9/24 Stochastic Quasi-Fejér Sequences
Let φ: [0, +∞[ → [0, +∞[ with φ(t) ↑ +∞ as t → +∞.
Deterministic definition: a sequence (x_n)_{n∈N} in H is Fejér monotone with respect to F if, for every z ∈ F,
(∀n ∈ N) φ(‖x_{n+1} − z‖) ≤ φ(‖x_n − z‖).
Stochastic definition 1: a sequence (x_n)_{n∈N} of H-valued random variables is stochastically Fejér monotone with respect to F if, for every z ∈ F,
(∀n ∈ N) E(φ(‖x_{n+1} − z‖) | X_n) ≤ φ(‖x_n − z‖).
Stochastic definition 2: a sequence (x_n)_{n∈N} of H-valued random variables is stochastically quasi-Fejér monotone with respect to F if, for every z ∈ F, there exist (χ_n(z))_{n∈N} ∈ ℓ¹_+(X), (ϑ_n(z))_{n∈N} ∈ ℓ_+(X), and (η_n(z))_{n∈N} ∈ ℓ¹_+(X) such that
(∀n ∈ N) E(φ(‖x_{n+1} − z‖) | X_n) + ϑ_n(z) ≤ (1 + χ_n(z)) φ(‖x_n − z‖) + η_n(z).
Suppose (x_n)_{n∈N} is stochastically quasi-Fejér monotone w.r.t. F. Then:
• (∀z ∈ F) Σ_{n∈N} ϑ_n(z) < +∞ P-a.s.
• [W(x_n)_{n∈N} ⊂ F P-a.s.] ⇔ [(x_n)_{n∈N} converges weakly P-a.s. to an F-valued random variable],
where W(x_n)_{n∈N} denotes the set of weak sequential cluster points of (x_n)_{n∈N}.

10/24 More General Problem
GOAL: find x ∈ H such that 0 ∈ Ax + Bx, where
• A: H → 2^H: maximally monotone operator, i.e.
(x, u) ∈ gra A ⇔ (∀(y, v) ∈ gra A) ⟨x − y | u − v⟩ ≥ 0.
If A is maximally monotone, then its resolvent J_A = (Id + A)⁻¹ is a firmly nonexpansive operator from H to H.
• B: H → H: ϑ-cocoercive operator, with ϑ ∈ ]0, +∞[, i.e.
(∀x ∈ H)(∀y ∈ H) ⟨x − y | Bx − By⟩ ≥ ϑ ‖Bx − By‖².
• F = zer(A + B), assumed to be nonempty.
EXAMPLE: A = ∂f with f ∈ Γ0(H), and B = ∇h with h convex with a ϑ⁻¹-Lipschitzian gradient.
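
The example above can be made tangible in one dimension: for A = ∂f with f = |·|, the resolvent J_{γA} = (Id + γA)⁻¹ is exactly prox_{γf}, i.e. scalar soft thresholding, and firm nonexpansiveness can be checked numerically on random points. This is only an illustrative sanity check of the two properties stated above.

```python
import numpy as np

def resolvent_abs(x, gamma):
    # J_{gamma A} for A = subdifferential of |.| on the real line,
    # i.e. prox_{gamma |.|}: componentwise soft thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

rng = np.random.default_rng(1)
gamma = 0.5
x, y = rng.standard_normal(1000), rng.standard_normal(1000)
Jx, Jy = resolvent_abs(x, gamma), resolvent_abs(y, gamma)
# Firm nonexpansiveness, pointwise: <x - y | Jx - Jy> >= |Jx - Jy|^2
assert np.all((x - y) * (Jx - Jy) >= (Jx - Jy) ** 2 - 1e-12)
```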

11/24 Proposed Solution
STOCHASTIC FB ALGORITHM:
(∀n ∈ N) x_{n+1} = x_n + λ_n ( J_{γ_n A_n}(x_n − γ_n u_n) + a_n − x_n ),
where
• λ_n ∈ ]0, 1] and γ_n ∈ ]0, 2ϑ[
• J_{γ_n A_n}: resolvent of a maximally monotone operator γ_n A_n: H → 2^H approximating γ_n A
• u_n, a second-order random variable: approximation to Bx_n
• a_n, a second-order random variable: possible additional error term.

12/24 Convergence Conditions
Let X = (X_n)_{n∈N} be a sequence of sigma-algebras such that (∀n ∈ N) σ(x_0, ..., x_n) ⊂ X_n ⊂ X_{n+1}, where σ(x_0, ..., x_n) is the smallest σ-algebra generated by x_0, ..., x_n.
Assumptions on the approximation to the cocoercive operator:
• Σ_{n∈N} √λ_n ‖E(u_n | X_n) − Bx_n‖ < +∞.
• For every z ∈ F, there exist sequences (τ_n)_{n∈N} ∈ ℓ_+ and (ζ_n(z))_{n∈N} ∈ ℓ^∞_+(X) such that Σ_{n∈N} λ_n ζ_n(z) < +∞ and
(∀n ∈ N) E(‖u_n − E(u_n | X_n)‖² | X_n) ≤ τ_n ‖Bx_n − Bz‖² + ζ_n(z).
Assumptions on the resolvent approximation:
• There exist sequences (α_n)_{n∈N} and (β_n)_{n∈N} in [0, +∞[ such that Σ_{n∈N} √λ_n α_n < +∞, Σ_{n∈N} λ_n β_n < +∞, and
(∀n ∈ N)(∀x ∈ H) ‖J_{γ_n A_n} x − J_{γ_n A} x‖ ≤ α_n ‖x‖ + β_n.
• Σ_{n∈N} λ_n E(‖a_n‖² | X_n) < +∞.
Assumptions on the algorithm parameters:
• inf_{n∈N} γ_n > 0, sup_{n∈N} τ_n < +∞, and sup_{n∈N} (1 + τ_n) γ_n < 2ϑ.
• Either inf_{n∈N} λ_n > 0, or (γ_n ≡ γ, Σ_{n∈N} τ_n < +∞, and Σ_{n∈N} λ_n = +∞).

13/24 Convergence Result
Under the previous assumptions, the sequence (x_n)_{n∈N} generated by the algorithm converges weakly a.s. to an F-valued random variable.
In addition, if A or B is demiregular at every z ∈ F, then the sequence (x_n)_{n∈N} generated by the algorithm converges strongly a.s. to an F-valued random variable.
A is demiregular at x ∈ dom A if, for every sequence (x_n, u_n)_{n∈N} in gra A and every u ∈ Ax such that x_n ⇀ x and u_n → u, we have x_n → x.
Example: A strongly monotone, i.e. there exists α ∈ ]0, +∞[ such that A − α Id is monotone.

14/24 Primal-Dual Splitting
GOAL: minimize over x ∈ H: f(x) + Σ_{k=1}^{q} g_k(L_k x) + h(x), where
• H: real Hilbert space
• f ∈ Γ0(H)
• h: H → R: differentiable convex function with ϑ⁻¹-Lipschitz continuous gradient
• g_k ∈ Γ0(G_k) with G_k a real Hilbert space
• L_k: bounded linear operator from H to G_k
• there exists x ∈ H such that 0 ∈ ∂f(x) + Σ_{k=1}^{q} L_k^* ∂g_k(L_k x) + ∇h(x).

15/24 Reformulation
Let K = H ⊕ G with G = G_1 ⊕ · · · ⊕ G_q, and define
• g: G → ]−∞, +∞]: v ↦ Σ_{k=1}^{q} g_k(v_k)
• L: H → G: x ↦ (L_k x)_{1≤k≤q}
• A: K → 2^K: (x, v) ↦ (∂f(x) + L^* v) × (−Lx + ∂g^*(v))
• B: K → K: (x, v) ↦ (∇h(x), 0)
• V: K → K: (x, v) ↦ (ρ⁻¹ x − L^* v, −Lx + U⁻¹ v) with U = Diag(σ_1 Id, ..., σ_q Id),
where (ρ, σ_1, ..., σ_q) ∈ ]0, +∞[^{q+1} and ρ Σ_{k=1}^{q} σ_k ‖L_k‖² < 1.
In the renormed space (K, ‖·‖_V), V⁻¹A is maximally monotone and V⁻¹B is cocoercive. In addition, finding a zero of the sum of these operators is equivalent to finding a pair of primal-dual solutions.

16/24 Resulting Algorithm
STOCHASTIC PRIMAL-DUAL ALGORITHM:
for n = 0, 1, ...
  y_n = prox_{ρ f_n}( x_n − ρ (Σ_{k=1}^{q} L_k^* v_{k,n} + u_n) ) + b_n
  x_{n+1} = x_n + λ_n (y_n − x_n)
  for k = 1, ..., q
    w_{k,n} = prox_{σ_k g_k^*}( v_{k,n} + σ_k L_k (2 y_n − x_n) ) + c_{k,n}
    v_{k,n+1} = v_{k,n} + λ_n (w_{k,n} − v_{k,n})
where
• λ_n ∈ ]0, 1] with Σ_{n∈N} λ_n = +∞ and (ρ⁻¹ − Σ_{k=1}^{q} σ_k ‖L_k‖²) ϑ > 1/2
• f_n ∈ Γ0(H): approximation to f
• u_n, a second-order random variable: approximation to ∇h(x_n)
• b_n and c_n, second-order random variables: possible additional error terms.

17/24 Resulting Algorithm
REMARKS on the stochastic primal-dual algorithm:
• Extension of the deterministic algorithms in [Esser et al., 2010], [Chambolle and Pock, 2011], [Vũ, 2013], [Condat, 2013].
• Parallel structure.
• No inversion of operators related to (L_k)_{1≤k≤q} required.
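
A minimal NumPy sketch of the stochastic primal-dual iteration for a toy 1-D instance with q = 1: f is the indicator of a box (prox = projection), g_1 = μ‖·‖₁ composed with a finite-difference operator L_1 = D (so prox_{σ g_1^*} is projection onto the ℓ∞ ball of radius μ), and ∇h(x) = x − z is observed with optional noise. The problem instance, step sizes, and noise model are illustrative choices; only the iteration structure comes from the slide. Note that ρ = 0.2, σ = 1 satisfy (ρ⁻¹ − σ‖D‖²)ϑ = (5 − 4)·1 > 1/2 since ‖D‖² ≤ 4 and ϑ = 1 here.

```python
import numpy as np

def stochastic_primal_dual(z, mu, n_iter=2000, noise=0.0, seed=0):
    # Sketch of the stochastic primal-dual iteration (q = 1) for
    #   minimize  iota_{[0,1]^d}(x) + mu * ||D x||_1 + 0.5 * ||x - z||^2,
    # where D is the 1-D finite-difference operator (L_1 = D) and
    # grad h(x_n) = x_n - z is observed up to additive noise (u_n).
    rng = np.random.default_rng(seed)
    d = z.size
    rho, sigma, lam = 0.2, 1.0, 1.0
    D = lambda x: np.diff(x)                                           # L_1
    Dt = lambda v: np.concatenate(([-v[0]], v[:-1] - v[1:], [v[-1]]))  # L_1^*
    x, v = np.zeros(d), np.zeros(d - 1)
    for n in range(n_iter):
        u = (x - z) + noise * rng.standard_normal(d)    # u_n ~ grad h(x_n)
        y = np.clip(x - rho * (Dt(v) + u), 0.0, 1.0)    # prox_{rho f} = box projection
        x_new = x + lam * (y - x)
        w = np.clip(v + sigma * D(2 * y - x), -mu, mu)  # prox_{sigma g*} = linf-ball proj.
        v = v + lam * (w - v)
        x = x_new
    return x
```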

18/24 Assumptions
Let X = (X_n)_{n∈N} be a sequence of sigma-algebras such that
(∀n ∈ N) σ(x_0, ..., x_n, v_0, ..., v_n) ⊂ X_n ⊂ X_{n+1}.
Assumptions on the gradient approximation:
• Σ_{n∈N} √λ_n ‖E(u_n | X_n) − ∇h(x_n)‖ < +∞.
• For every z ∈ F, there exists (ζ_n(z))_{n∈N} ∈ ℓ^∞_+(X) such that Σ_{n∈N} λ_n ζ_n(z) < +∞ and
(∀n ∈ N) E(‖u_n − E(u_n | X_n)‖² | X_n) ≤ τ_n ‖∇h(x_n) − ∇h(z)‖² + ζ_n(z).
Assumptions on the prox approximations:
• There exist sequences (α_n)_{n∈N} and (β_n)_{n∈N} in [0, +∞[ such that Σ_{n∈N} √λ_n α_n < +∞, Σ_{n∈N} λ_n β_n < +∞, and
(∀n ∈ N)(∀x ∈ H) ‖prox_{ρ f_n} x − prox_{ρ f} x‖ ≤ α_n ‖x‖ + β_n.
• Σ_{n∈N} λ_n E(‖b_n‖² | X_n) < +∞ and Σ_{n∈N} λ_n E(‖c_n‖² | X_n) < +∞.

19/24 Convergence Result
F: set of solutions to the primal problem; F*: set of solutions to the dual problem.
Under the previous assumptions, the sequence (x_n)_{n∈N} converges weakly a.s. to an F-valued random variable, and the sequence (v_n)_{n∈N} converges weakly a.s. to an F*-valued random variable.

20/24 Online Image Recovery
OBSERVATION MODEL: (∀n ∈ N) z_n = K_n x + e_n, where
• x ∈ H = R^N: unknown image
• K_n: R^{M×N}-valued random matrix
• e_n: R^M-valued random noise vector.
OBJECTIVE: recover x from (K_n, z_n)_{n∈N}.
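
A minimal simulator for one draw from this observation model. Here K_n is a hypothetical random row-subsampling operator chosen only for concreteness; the talk's experiment uses a random Fourier-domain blur instead.

```python
import numpy as np

def observe(x, M, sigma, rng):
    # One draw (K_n, z_n) from the model z_n = K_n x + e_n, with K_n a
    # random row-subsampling matrix (illustrative stand-in for the talk's
    # stochastic blur) and e_n white Gaussian noise of std sigma.
    N = x.size
    rows = rng.choice(N, size=M, replace=False)
    K = np.zeros((M, N))
    K[np.arange(M), rows] = 1.0           # random subsampling operator
    e = sigma * rng.standard_normal(M)    # e_n ~ N(0, sigma^2 I)
    return K, K @ x + e
```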

21/24 Application of Primal-Dual Algorithm
FORMULATION:
• Mean square error criterion: (∀x ∈ R^N) h(x) = ½ E‖K_0 x − z_0‖², assuming that (K_n, z_n)_{n∈N} are identically distributed.
• Statistics of (K_n, z_n)_{n∈N} learnt online ⇒ approximation to ∇h(x_n):
u_n = (1/m_{n+1}) Σ_{n'=0}^{m_{n+1}−1} K_{n'}^T (K_{n'} x_n − z_{n'}),
where (m_n)_{n∈N} is a strictly increasing sequence in N.
• f and g_1 ∘ L_1 (q = 1): regularization terms.
⇒ recursive computation: u_n = R_n x_n − c_n with
R_n = (1/m_{n+1}) Σ_{n'=0}^{m_{n+1}−1} K_{n'}^T K_{n'} = (m_n/m_{n+1}) R_{n−1} + (1/m_{n+1}) Σ_{n'=m_n}^{m_{n+1}−1} K_{n'}^T K_{n'}.
CONDITIONS FOR CONVERGENCE:
• (K_n, e_n)_{n∈N} is an i.i.d. sequence such that E‖K_0‖⁴ < +∞ and E‖e_0‖⁴ < +∞.
• m_n = O(n^{1+δ}) with δ ∈ ]0, +∞[.
• λ_n = O(n^{−κ}), where κ ∈ ]1 − δ, 1] ∩ [0, 1].
• f_n ≡ f, and the domain of f is bounded.
• b_n ≡ 0 and c_n ≡ 0.
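
The recursive computation of R_n above can be sketched as an incremental running-average update; the matching update for c_n (the K^T z average implied by u_n = R_n x_n − c_n) is included by analogy, and the function/argument names are hypothetical.

```python
import numpy as np

def update_statistics(R_prev, c_prev, K_batch, z_batch, m_prev, m_new):
    # Running averages over all observations seen so far:
    #   R_n = (1/m_new) * sum_{n'<m_new} K^T K
    #       = (m_prev/m_new) * R_{n-1} + (1/m_new) * sum over the new batch,
    # and analogously c_n for K^T z, so that u_n = R_n x_n - c_n.
    # K_batch, z_batch hold the m_new - m_prev newly arrived observations.
    R = (m_prev * R_prev + sum(K.T @ K for K in K_batch)) / m_new
    c = (m_prev * c_prev + sum(K.T @ z for K, z in zip(K_batch, z_batch))) / m_new
    return R, c
```

This keeps the per-iteration cost proportional to the number of new observations rather than to the total number seen.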

22/24 Simulation example
• Grayscale image of size 256 × 256 with pixel values in [0, 255].
• Stochastic blur (uniform i.i.d. subsampling of a uniform 5 × 5 blur performed in the discrete Fourier domain, with 70% of the frequency bins set to zero).
• Additive white N(0, 5²) noise.
• f = ι_{[0,255]^N} and g_1 ∘ L_1 = isotropic total variation.
• Parameter choice: (∀n ∈ N) m_n = n^{1.1}, λ_n = (1 + (n/500)^{0.95})^{−1}.

22/24 Simulation example
[Images] Original image x; Degraded image 1 (SNR = 0.14 dB); Degraded image 2 (SNR = 12.0 dB); Restored image (SNR = 28.1 dB).

22/24 Simulation example
[Plot] ‖x_n − x_∞‖ versus the iteration number n.

23/24 Conclusion
• Investigation of stochastic variants of forward-backward and primal-dual proximal algorithms.
• Stochastic approximations to both smooth and nonsmooth convex functions.
• Extension to monotone inclusion problems.
• Theoretical guarantees of convergence.
• Novel application to online image recovery.

24/24 Some references
• P. L. Combettes and V. R. Wajs, "Signal recovery by proximal forward-backward splitting," Multiscale Model. Simul., vol. 4, pp. 1168–1200, 2005.
• P. L. Combettes and J.-C. Pesquet, "Proximal splitting methods in signal processing," in Fixed-Point Algorithms for Inverse Problems in Science and Engineering, H. H. Bauschke, R. Burachik, P. L. Combettes, V. Elser, D. R. Luke, and H. Wolkowicz, eds., Springer-Verlag, New York, pp. 185–212, 2011.
• P. L. Combettes and J.-C. Pesquet, "Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping," SIAM Journal on Optimization, vol. 25, no. 2, pp. 1221–1248, July 2015.
• N. Komodakis and J.-C. Pesquet, "Playing with duality: An overview of recent primal-dual approaches for solving large-scale optimization problems," IEEE Signal Processing Magazine, vol. 32, no. 6, pp. 31–54, Nov. 2015.
• P. L. Combettes and J.-C. Pesquet, "Stochastic approximations and perturbations in forward-backward splitting for monotone operators," Pure and Applied Functional Analysis, vol. 1, no. 1, pp. 13–37, Jan. 2016.