
Audrey Repetti

(Heriot-Watt University, Edinburgh)

https://s3-seminar.github.io/seminars/audrey-repetti

Title — Solving large-scale inverse problems using forward-backward based methods

Abstract — Recent developments in imaging and data analysis techniques have come with an increasing need for fast convex optimization methods for solving large-scale problems. A simple optimization strategy to minimize the sum of a Lipschitz-differentiable function and a nonsmooth function is the forward-backward algorithm. In this presentation, several approaches to accelerate the convergence speed of this algorithm and to reduce its complexity will be proposed. More precisely, in a first part, preconditioning methods adapted to nonconvex minimization problems will be presented, and in a second part, stochastic optimization techniques will be described in the context of convex optimization. The proposed methods will be used to solve several inverse problems in signal and image processing.

Biography — Audrey Repetti is a post-doctoral researcher at Heriot-Watt University, in Scotland. She received her M.Sc. degree in applied mathematics from the Université Pierre et Marie Curie (Paris VI), and her Ph.D. degree in signal and image processing from the Université Paris-Est Marne-la-Vallée. Her research interests include convex and nonconvex optimization, and signal and image processing.

S³ Seminar

March 11, 2016

Transcript

  1. Solving large-scale inverse problems using forward-backward based methods. Audrey Repetti⋆, Emilie Chouzenoux†, and Jean-Christophe Pesquet†. ⋆ Heriot-Watt University, Edinburgh, UK. † Université Paris-Est Marne-la-Vallée, France. 11 March 2016, L2S.

  2. In collaboration with E. Chouzenoux and J.-C. Pesquet.

  3. Motivation: inverse problems. Observation model: z = H x + w, where H is the observation matrix, x the original signal, and w an additive noise. Objective: find an estimate x̂ ∈ ℝ^N of x from z.

  4. Motivation: inverse problems (continued). Same observation model, illustrated on images. [Figure: original image x, observation z, and estimate x̂.]

  6. Motivation: variational formulation. Define the estimate x̂ as a solution to the minimization problem Argmin_{x∈ℝ^N} f₁(x) + f₂(x), where ⋆ f₁ is a data fidelity term related to the observation model, and ⋆ f₂ is a regularization term related to some a priori assumptions on the target solution: • e.g. an a priori on the smoothness of an image, • e.g. an a priori on the sparsity of a signal, • e.g. a support constraint, • etc.

  7. Motivation: variational formulation (continued). In the context of large-scale problems, how can we find an optimization algorithm able to deliver a reliable numerical solution in a reasonable time, with low memory requirements?

  8. Minimization problem. Find x̂ ∈ Argmin_{x∈ℝ^N} f(x) = h(x) + g(x), where ◮ g: ℝ^N → ]−∞,+∞] is proper, lsc, bounded from below by an affine function, and its restriction to its domain is continuous; ◮ h: ℝ^N → ]−∞,+∞[ is β-Lipschitz differentiable; ◮ f is coercive.

  9. Forward-backward algorithm. Let x₀ ∈ dom g and, for every k ∈ ℕ, let γ_k ∈ ]0,+∞[. For k = 0, 1, ...: x_{k+1} ∈ prox_{γ_k g}( x_k − γ_k ∇h(x_k) ). Here, for g: ℝ^N → ]−∞,+∞] proper, lsc, and bounded from below by an affine function, the proximity operator of g at x ∈ ℝ^N is defined by prox_g(x) = Argmin_{y∈ℝ^N} g(y) + ½ ‖y − x‖².

  10. Forward-backward algorithm (continued). ⋆ When g is convex, prox_g(x) reduces to a singleton.

  11. Forward-backward algorithm (continued). ⋆ When g = ι_C is the indicator function of a nonempty closed convex set C ⊂ ℝ^N, then prox_{ι_C}(x) = Π_C(x) = argmin_{y∈C} ‖y − x‖².

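For readers of this transcript, a minimal NumPy sketch (added here as an illustration; not part of the deck) of the two special cases above: the prox of γ‖·‖₁ is componentwise soft-thresholding, and the prox of the indicator of a box C is simply the projection Π_C.

```python
import numpy as np

def prox_l1(x, gamma):
    # prox of gamma * ||.||_1: componentwise soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def prox_box(x, lo, hi):
    # prox of the indicator of C = [lo, hi]^N: the projection onto C
    return np.clip(x, lo, hi)

x = np.array([-1.5, 0.2, 3.0])
print(prox_l1(x, 1.0))        # [-0.5  0.   2. ]
print(prox_box(x, 0.0, 1.0))  # [0.   0.2  1. ]
```
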
  12. Forward-backward algorithm (continued). Let U ∈ ℝ^{N×N} be a symmetric positive definite (SPD) matrix. The proximity operator of g at x ∈ ℝ^N relative to the metric induced by U is defined by prox_{U,g}(x) = Argmin_{y∈ℝ^N} g(y) + ½ ‖y − x‖²_U, where ‖x‖²_U = ⟨x | Ux⟩.

  13. Forward-backward algorithm: convergence results. ⋆ Convergence of (x_k)_{k∈ℕ} to a minimizer of f is ensured when h and g are convex and 0 < inf_{k∈ℕ} γ_k ≤ sup_{k∈ℕ} γ_k < 2β⁻¹. [Combettes & Wajs, 2005]

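As a concrete illustration of the scheme (a sketch added to this transcript, with h(x) = ½‖Hx − z‖², g = λ‖·‖₁, and arbitrarily chosen problem data), the iteration alternates a gradient step on h and the soft-thresholding prox of g; the constant step γ = β⁻¹ satisfies the condition γ < 2β⁻¹ above.

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0                      # sparse ground truth
z = H @ x_true + 0.01 * rng.standard_normal(50)

lam = 0.1
beta = np.linalg.norm(H, 2) ** 2      # Lipschitz constant of grad h
gamma = 1.0 / beta                    # constant step, below 2 / beta

def soft(v, t):
    # prox of t * ||.||_1 (cf. the previous sketch)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(100)
for _ in range(500):
    grad = H.T @ (H @ x - z)                 # forward (gradient) step
    x = soft(x - gamma * grad, gamma * lam)  # backward (proximal) step
```
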
  14. Forward-backward algorithm: convergence results (continued). ⋆ Convergence of (x_k)_{k∈ℕ} to a critical point of f is ensured when h and/or g are nonconvex and 0 < inf_{k∈ℕ} γ_k ≤ sup_{k∈ℕ} γ_k < β⁻¹. [Attouch, Bolte & Svaiter, 2011] The proof is based on the Kurdyka-Łojasiewicz inequality.

  15. Kurdyka-Łojasiewicz inequality. f satisfies the Kurdyka-Łojasiewicz (KL) inequality if, for every ξ ∈ ℝ and every bounded subset E of ℝ^N, there exist three constants κ > 0, ζ > 0 and θ ∈ [0,1) such that ‖t‖ ≥ κ |f(x) − ξ|^θ for every t ∈ ∂f(x) and every x ∈ E such that |f(x) − ξ| ≤ ζ. ⋆ Note that other forms of the KL inequality can be found in the literature [Bolte et al., 2007][Bolte et al., 2010]. ⋆ It is satisfied by a wide class of functions: • real analytic functions, • semi-algebraic functions, • ...

  16. How can the forward-backward algorithm be made efficient for large-scale optimization in the nonconvex case? ⋆ Variable metric strategy.

  17. Variable metric forward-backward (VMFB) algorithm. ⋆ Introduce preconditioning symmetric positive definite (SPD) matrices. Let x₀ ∈ dom g and, for every k ∈ ℕ, let γ_k ∈ ]0,+∞[ and let A_k(x_k) ∈ ℝ^{N×N} be an SPD matrix. For k = 0, 1, ...: x_{k+1} ∈ prox_{γ_k⁻¹ A_k(x_k), g}( x_k − γ_k A_k(x_k)⁻¹ ∇h(x_k) ).

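For a separable g and a diagonal metric A = Diag(a), the prox relative to the metric γ⁻¹A stays explicit, which is what keeps the variable metric step cheap. A small sketch of one VMFB step under these assumptions (an illustration added to the transcript, with g = λ‖·‖₁):

```python
import numpy as np

def prox_l1_metric(x, lam, gamma, a):
    # prox of g = lam * ||.||_1 in the metric U = (1/gamma) Diag(a):
    # argmin_y lam*||y||_1 + (1/(2*gamma)) * ||y - x||_{Diag(a)}^2,
    # i.e. componentwise soft-thresholding with threshold lam*gamma/a
    t = lam * gamma / a
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def vmfb_step(x, grad_h_x, lam, gamma, a):
    # one VMFB step with diagonal preconditioner A(x_k) = Diag(a):
    # x_{k+1} = prox_{gamma^{-1} A, g}( x_k - gamma * A^{-1} grad_h(x_k) )
    return prox_l1_metric(x - gamma * grad_h_x / a, lam, gamma, a)
```
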
  18. Variable metric forward-backward algorithm: convergence results. ⋆ Convergence of (x_k)_{k∈ℕ} to a minimizer of f when h and g are convex [Combettes & Vũ, 2014]; no automatic method for the choice of the preconditioning matrices.

  19. Variable metric forward-backward algorithm: convergence results (continued). ⋆ Convergence of (x_k)_{k∈ℕ} to a critical point of f when h and/or g are nonconvex [Chouzenoux et al., 2014]; automatic method for the choice of the preconditioning matrices.

  20. Majorize-Minimize (MM) strategy [Jacobson & Fessler, 2007]. For every k ∈ ℕ, there exists an SPD matrix A_k(x_k) ∈ ℝ^{N×N} such that, for every x ∈ ℝ^N, q(x, x_k) = h(x_k) + ⟨x − x_k | ∇h(x_k)⟩ + ½ ‖x − x_k‖²_{A_k(x_k)} is a majorant function of h at x_k on dom g, i.e. h(x_k) = q(x_k, x_k) and h(x) ≤ q(x, x_k) for every x ∈ dom g. [Figure: h and its quadratic majorant q(·, x_k) at x_k.]

  21. Majorize-Minimize strategy (continued). If h is differentiable with a β-Lipschitzian gradient on a convex subset of ℝ^N, then A_k(x_k) ≡ β I_N satisfies the majorization condition [Bertsekas, 1999].

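For a quadratic term h(x) = ½‖Hx − z‖², one classical diagonal majorant in this spirit (a sketch added to the transcript; the deck itself does not spell out this formula, cf. the Jensen-inequality choice mentioned on a later slide) is A = Diag(|H|ᵀ|H| 1), which dominates the Hessian HᵀH by diagonal dominance:

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((30, 20))

# Diagonal majorant of the Hessian H^T H: A = Diag(|H|^T |H| 1)
# satisfies A >= H^T H in the SPD order (Gershgorin / diagonal
# dominance argument), so the quadratic q(., x_k) built with A
# majorizes h on the whole space.
A = np.diag(np.abs(H).T @ (np.abs(H) @ np.ones(20)))

# sanity check: A - H^T H is positive semidefinite
print(np.linalg.eigvalsh(A - H.T @ H).min() >= -1e-10)  # True
```
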
  22. How can the forward-backward algorithm be made efficient for large-scale optimization? ⋆ Variable metric strategy. ⋆ Block-coordinate method.

  23. Block separable structure. ◮ g is an additively block separable function.

  24. Block separable structure (continued). The variable x ∈ ℝ^N is split into blocks x^{(1)} ∈ ℝ^{N₁}, x^{(2)} ∈ ℝ^{N₂}, ..., x^{(J)} ∈ ℝ^{N_J}, with N = Σ_{j=1}^J N_j. For every j ∈ {1,...,J}, g_j: ℝ^{N_j} → ]−∞,+∞] is a proper, lsc function, continuous on its domain and bounded from below by an affine function.

  25. Block separable structure (continued). g(x) = Σ_{j=1}^J g_j(x^{(j)}).

  26. Block-coordinate approach. Find x̂ ∈ Argmin_{x∈ℝ^N} f(x) = h(x) + Σ_{j=1}^J g_j(x^{(j)}). ⋆ Principle: at each iteration k ∈ ℕ, update only a subset of the components (in the spirit of Gauss-Seidel methods). ⋆ Advantages: • more flexibility, • reduced computational cost at each iteration, • reduced memory requirements.

  27. Block-coordinate VMFB (BC-VMFB) algorithm. Let x₀ ∈ dom g. For k = 0, 1, ...: choose j_k ∈ {1,...,J}, an SPD matrix A_{j_k}(x_k) ∈ ℝ^{N_{j_k}×N_{j_k}}, and γ_k ∈ ]0,+∞[; then x^{(j_k)}_{k+1} ∈ prox_{γ_k⁻¹ A_{j_k}(x_k), g_{j_k}}( x^{(j_k)}_k − γ_k A_{j_k}(x_k)⁻¹ ∇_{j_k} h(x_k) ) and x^{(ȷ̄_k)}_{k+1} = x^{(ȷ̄_k)}_k, where ȷ̄_k gathers the remaining blocks: x^{(ȷ̄_k)} = (x^{(1)}, ..., x^{(j_k−1)}, x^{(j_k+1)}, ..., x^{(J)}).

  28. BC-VMFB: convergence results. ⋆ Blocks updated according to a random rule in the convex case when A_{j_k}(x_k) ≡ I_{N_{j_k}} [Combettes & Pesquet, 2014].

  29. BC-VMFB: convergence results (continued). ⋆ Blocks updated according to a cyclic rule in the nonconvex case when A_{j_k}(x_k) ≡ I_{N_{j_k}} [Bolte et al., 2013] and when A_{j_k}(x_k) is a general SPD matrix [Frankel et al., 2014].

  30. BC-VMFB: convergence results (continued). ⋆ Blocks updated according to a quasi-cyclic rule in the nonconvex case when A_{j_k}(x_k) is a general SPD matrix [Chouzenoux et al., 2014].

  31. Updating rules. ⋆ Cyclic rule: update blocks 1, 2, ..., J sequentially. ⋆ Quasi-cyclic rule: there exists K ≥ J such that, for every k ∈ ℕ, {1,...,J} ⊂ {j_k, ..., j_{k+K−1}}.

  32. Example: J = 3 blocks denoted {1, 2, 3}. • K = 3: cyclic updating order {1, 2, 3, 1, 2, 3, ...}; example of a quasi-cyclic updating order: {1, 3, 2, 2, 1, 3, ...}.

  33. Example (continued). • K = 4: possibility to update some blocks more than once every K iterations, e.g. {1, 3, 2, 2, 2, 2, 1, 3, ...}. A code sketch of such a schedule follows below.

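The forward reference on slide 33 points to this sketch (added to the transcript, not from the deck): one simple way to produce a quasi-cyclic order is to build it in segments of length K ≥ J, each segment containing every block at least once, the K − J spare slots repeating arbitrarily chosen blocks (e.g. blocks that are harder to optimize can be visited more often).

```python
import numpy as np

rng = np.random.default_rng(2)

def quasi_cyclic_order(J, K, n_segments):
    # Build an update order in segments of length K >= J, each segment
    # containing every block of {0, ..., J-1} at least once; the K - J
    # spare slots repeat randomly chosen blocks.
    order = []
    for _ in range(n_segments):
        segment = list(range(J)) + list(rng.integers(0, J, K - J))
        rng.shuffle(segment)
        order.extend(segment)
    return order

print(quasi_cyclic_order(3, 4, 2))  # e.g. [2, 0, 1, 1, 0, 2, 2, 1]
```
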
  34. Nonconvex BC-VMFB algorithm: convergence results. Assumptions: ◮ there exist (ν̲, ν̄) ∈ ]0,+∞[² such that, for every k ∈ ℕ, ν̲ I ≼ A_k(x_k) ≼ ν̄ I, and, for every j_k ∈ {1,...,J}, A_{j_k}(x_k) satisfies the MM assumption at x^{(j_k)}_k for the restriction of h to block j_k, i.e. for y ∈ ℝ^{N_{j_k}} ↦ h( x^{(1)}_k, ..., x^{(j_k−1)}_k, y, x^{(j_k+1)}_k, ..., x^{(J)}_k ); ◮ the blocks (j_k)_{k∈ℕ} are updated according to a quasi-cyclic rule; ◮ f satisfies the KL inequality; ◮ the step size is chosen such that either • there exist (γ̲, γ̄) ∈ ]0,+∞[² with γ̲ ≤ γ_k ≤ 1 − γ̄ for every k ∈ ℕ, or • g is convex and there exist (γ̲, γ̄) ∈ ]0,+∞[² with γ̲ ≤ γ_k ≤ 2 − γ̄.

  35. Nonconvex BC-VMFB algorithm: convergence results (continued). ◮ Global convergence: ⋆ (x_k)_{k∈ℕ} converges to a critical point x̂ of f; ⋆ (f(x_k))_{k∈ℕ} is a nonincreasing sequence converging to f(x̂). ◮ Local convergence: if there exists υ > 0 such that f(x₀) ≤ inf_{x∈ℝ^N} f(x) + υ, then (x_k)_{k∈ℕ} converges to a solution x̂ of the minimization problem.

  36. Illustrating examples. ⋆ Phase retrieval problem.

  37. Phase retrieval problem. Observation model: z = |H y| + w (measurements of the modulus of complex observations), where y ∈ ℝ^M_+ is the original image and w ∈ [0,+∞[^S an additive noise. Observation matrix: H ∈ ℂ^{S×M} is the composition of ⋆ a matrix modeling S = 23400 Radon projections from • 128 parallel acquisition lines and • 180 angles regularly distributed on [0, π[, and ⋆ a complex-valued blur operator. This is an approximate version of the tomography model proposed in [Davidoiu et al., 2012].

  38. Phase retrieval problem (continued). The original image is complex-valued: y = (y_R, y_I) ∈ ℝ^{2M}_+, with real part y_R ∈ ℝ^M and imaginary part y_I ∈ ℝ^M, so that H ∈ ℂ^{S×2M}.

  39. Phase retrieval problem (continued). Synthesis approach: z = |H W x| + w, with W ∈ ℝ^{2M×(4M+4M)} and x = (x_R, x_I) ∈ ℝ^{4M+4M}_+, where x_R (resp. x_I) is an overcomplete Haar decomposition of y_R (resp. y_I). [Figure: the image y_R and its Haar coefficients x_R.]

  40. Proposed criterion. Observation model: z = |HWx| + w. Objective: find an estimate x̂ of x from z, and deduce an estimate ŷ = W x̂ of the original image y.

  41. Proposed criterion (continued). minimize_{x∈ℝ^N} f(x) = h(x) + g(x), with ⋆ h(x) = Σ_{s=1}^S ϕ_s([HWx]^{(s)}), where, for every s ∈ {1,...,S} and u ∈ ℂ, ϕ_s(u) = ½ ( (z^{(s)})² + |u|² ) − z^{(s)} ( |u|² + δ² )^{1/2}, with δ > 0. • A smooth approximation of the usual least-squares criterion.

  42. Proposed criterion (continued). The data term splits as ϕ_s(u) = ϕ_{s,1}(|u|) + ϕ_{s,2}(|u|), where ϕ_{s,1}(|u|) = ½ ( (z^{(s)})² + |u|² ) is convex and ϕ_{s,2}(|u|) = −z^{(s)} ( |u|² + δ² )^{1/2} is concave.

  43. Proposed criterion (continued). ⋆ g(x) = Σ_{p=1}^{4M} g_p( x_R^{(p)}, x_I^{(p)} ), where, for every u ∈ ℝ², g_p(u) = ι_{{(0,0)}}(u) if p ∈ E; g_p(u) = ϑ_p ‖u‖ if p ∉ E and p ∈ {M+1, ..., 4M}; g_p(u) = ϑ_p ‖u − ω^{(p)}‖² if p ∉ E and p ∈ {1, ..., M}; with ϑ_p ∈ ]0,+∞[ and ω^{(p)} ∈ ℝ².

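A quick numerical check of the smooth data term (a sketch added to this transcript): as δ → 0, ϕ_s(u) tends to the least-squares term ½(|u| − z^{(s)})², which is what "smooth approximation of the usual least-squares criterion" means here.

```python
import numpy as np

def phi(u, z, delta):
    # phi_s(u) = 0.5*(z**2 + |u|**2) - z * sqrt(|u|**2 + delta**2)
    m = np.abs(u)
    return 0.5 * (z**2 + m**2) - z * np.sqrt(m**2 + delta**2)

u, z = 3.0 + 4.0j, 4.0                 # |u| = 5
for delta in (1.0, 0.1, 1e-3):
    print(delta, phi(u, z, delta))     # tends to 0.5*(5 - 4)**2 = 0.5
```
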
  44. Algorithm implementation. ⋆ Choice of the preconditioning matrices: split h = h₁ + h₂, where, for every x ∈ ℝ^N, • h₁(x) = Σ_{s=1}^S ϕ_{s,1}([HWx]^{(s)}) is majorized at x̄ ∈ ℝ^N by q₁(x, x̄) = h₁(x̄) + ⟨x − x̄ | ∇h₁(x̄)⟩ + ½ ‖x − x̄‖²_B, where B is a diagonal matrix chosen using the Jensen inequality; • h₂(x) = Σ_{s=1}^S ϕ_{s,2}([HWx]^{(s)}) is concave and is majorized at x̄ by its tangent function.

  45. Algorithm implementation (continued). ⋆ Choice of the blocks: driven by the Haar decomposition. [Figure: Haar decomposition and the corresponding block indices.]

  46. Numerical results: reconstruction of the real part. • Original unknown image y_R. Reconstructed images obtained with • our method, SNR = 21.27 dB, and • an ℓ0-regularized version of the Fienup algorithm [Mukherjee et al., 2012], SNR = 14.45 dB.

  47. Numerical results: reconstruction of the imaginary part. • Original unknown image y_I. Reconstructed images obtained with • our method, SNR = 21.27 dB, and • an ℓ0-regularized version of the Fienup algorithm [Mukherjee et al., 2012], SNR = 14.45 dB.

  48. Numerical results (continued). ◮ Block-coordinate variable metric FB algorithm (BC-VMFB) versus ◮ Proximal Alternating Linearized Minimization (PALM) [Bolte et al., 2014], a non-preconditioned version of BC-VMFB. [Figure: relative error ‖x_k − x̂‖/‖x̂‖ versus time (s), on a logarithmic scale, over 0 to 3600 s.]

  49. Illustrating examples. ⋆ Phase retrieval problem. ⋆ Seismic blind deconvolution problem [collaboration with L. Duval from IFPEN and M. Q. Pham].

  50. Seismic blind deconvolution problem. Observation model: y = k ∗ s + w, where ⋆ y ∈ ℝ^{N₁} is the observed signal (N₁ = 784), ⋆ s ∈ ℝ^{N₁} is the unknown sparse original seismic signal, ⋆ k ∈ ℝ^{N₂} is the unknown original blur kernel (N₂ = 41), and ⋆ w ∈ ℝ^{N₁} is an additive noise, a realization of a zero-mean white Gaussian noise with variance σ². [Figure: the signals y, k ∗ s and w.]

  51. Sparsity in blind deconvolution problems. Problem: choose a sparsity measure leading to a good estimate of s.

  52. Sparsity in blind deconvolution problems (continued). ℓ0: • nonsmooth and nonconvex, • difficult to manage.

  53. Sparsity in blind deconvolution problems (continued). ℓ1: • nonsmooth and convex, • does not lead to a good estimate of s in the context of blind deconvolution problems [Benichoux et al., 2013].

  54. Sparsity in blind deconvolution problems (continued). ℓ1/ℓ2: • nonsmooth and nonconvex, • efficient in the context of blind deconvolution problems [Benichoux et al., 2013], • difficult to manage.

  55. Sparsity in blind deconvolution problems (continued). Solution: use a smooth approximation ℓ_{1,α}/ℓ_{2,η} of the ℓ1/ℓ2 penalization function, where, for every s ∈ ℝ^{N₁}: • ℓ_{1,α}(s) = Σ_{n=1}^{N₁} ( ((s^{(n)})² + α²)^{1/2} − α ), with α ∈ ]0,+∞[; • ℓ_{2,η}(s) = ( Σ_{n=1}^{N₁} (s^{(n)})² + η² )^{1/2}, with η ∈ ]0,+∞[.

  56. Sparsity in blind deconvolution problems (continued). The retained smooth measure is log( (ℓ_{1,α} + β) / ℓ_{2,η} ).

  57. Proposed criterion. Observation model: y = k ∗ s + w. minimize_{s∈ℝ^{N₁}, k∈ℝ^{N₂}} f(s, k) = h(s, k) + g₁(s) + g₂(k), where ⋆ h(s, k) = ½ ‖k ∗ s − y‖² (data fidelity term) + λ log( (ℓ_{1,α}(s) + β) / ℓ_{2,η}(s) ) (smooth regularization term), for (α, β, η, λ) ∈ ]0,+∞[⁴; ⋆ g₁(s) = ι_{[s_min, s_max]^{N₁}}(s), with (s_min, s_max) ∈ ]0,+∞[²; ⋆ g₂(k) = ι_C(k), with C = { k ∈ [k_min, k_max]^{N₂} | ‖k‖ ≤ δ }, for (k_min, k_max, δ) ∈ ]0,+∞[³.

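A sketch of the smooth part h(s, k) of this criterion in NumPy (added to the transcript; the "same"-mode boundary handling of the convolution is an assumption of this illustration, not something specified on the slide):

```python
import numpy as np

def l1_alpha(s, alpha):
    # smoothed l1 norm: sum_n ( sqrt(s_n**2 + alpha**2) - alpha )
    return np.sum(np.sqrt(s**2 + alpha**2) - alpha)

def l2_eta(s, eta):
    # smoothed l2 norm: sqrt( sum_n s_n**2 + eta**2 )
    return np.sqrt(np.sum(s**2) + eta**2)

def h(s, k, y, lam, alpha, beta, eta):
    # data fidelity + smoothed log(l1/l2) regularization
    resid = np.convolve(s, k, mode="same") - y
    return 0.5 * np.sum(resid**2) + lam * np.log(
        (l1_alpha(s, alpha) + beta) / l2_eta(s, eta))

rng = np.random.default_rng(3)
s = rng.standard_normal(784)           # N1 = 784 as on slide 50
k = rng.standard_normal(41)            # N2 = 41
y = rng.standard_normal(784)
print(h(s, k, y, lam=1.0, alpha=1e-2, beta=1e-2, eta=1e-2))
```
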
  58. Numerical results. [Figures: estimated signal ŝ (dashed black) against the true s (continuous red); estimated kernel k̂ (dashed black) against the true k (continuous red); reconstruction time (s) as a function of K_s, the number of iterations on s performed for each iteration on k.]

  59. How can the forward-backward algorithm be made efficient for large-scale optimization in the convex case? ⋆ Introduce stochasticity.

  60. Minimization problem. Let F be the set of solutions to the optimization problem minimize_{x^{(1)}∈H₁, ..., x^{(p)}∈H_p} Σ_{j=1}^p f_j(x^{(j)}) + Σ_{l=1}^q g_l( Σ_{j=1}^p L_{l,j} x^{(j)} ), where, for every j ∈ {1,...,p} and l ∈ {1,...,q}: ◮ H_j and G_l are real separable Hilbert spaces; ◮ f_j ∈ Γ₀(H_j); ◮ g_l: G_l → ℝ is convex and μ_l⁻¹-Lipschitz differentiable, with μ_l ∈ ]0,+∞[; ◮ L_{l,j}: H_j → G_l is linear and bounded. We assume that F ≠ ∅ and min_{1≤l≤q} Σ_{j=1}^p ‖L_{l,j}‖² > 0.

  61. Random block-coordinate forward-backward splitting [Combettes & Pesquet, 2015]. For k = 0, 1, ...: for j = 1, ..., p: y^{(j)}_k = ε^{(j)}_k ( x^{(j)}_k − γ_k Σ_{l=1}^q L*_{l,j} ∇g_l( Σ_{j′=1}^p L_{l,j′} x^{(j′)}_k ) + b^{(j)}_k ), x^{(j)}_{k+1} = x^{(j)}_k + ε^{(j)}_k λ_k ( prox_{γ_k f_j}(y^{(j)}_k) + a^{(j)}_k − x^{(j)}_k ).

  62. Random block-coordinate forward-backward splitting (continued). ◮ (ε_k)_{k∈ℕ} are identically distributed D-valued random variables, with D = {0,1}^p \ {0}: binary variables signaling the blocks to be activated.

  63. Random block-coordinate forward-backward splitting (continued). ◮ x₀, (a_k)_{k∈ℕ} and (b_k)_{k∈ℕ} are H-valued random variables, with H = H₁ × ... × H_p; (a_k)_{k∈ℕ} and (b_k)_{k∈ℕ} are error terms.

  64. Random block-coordinate forward-backward splitting (continued). ◮ (γ_k)_{k∈ℕ} is a stepsize sequence in ]0, 2ϑ[ such that inf_{k∈ℕ} γ_k > 0 and sup_{k∈ℕ} γ_k < 2ϑ, with ϑ = ( Σ_{l=1}^q μ_l⁻¹ ‖ Σ_{j=1}^p L_{l,j} L*_{l,j} ‖ )⁻¹.

  65. Random block-coordinate forward-backward splitting (continued). ◮ For every k ∈ ℕ, λ_k ∈ ]0,1], with inf_{k∈ℕ} λ_k > 0.

  66. Random block-coordinate forward-backward splitting: convergence. Let (Ω, F, P) be the underlying probability space and set, for every k ∈ ℕ, X_k = σ(x_{k′})_{0≤k′≤k}. Assume that ◮ Σ_{k∈ℕ} E(‖a_k‖² | X_k) < +∞ and Σ_{k∈ℕ} E(‖b_k‖² | X_k) < +∞; ◮ for every k ∈ ℕ, ε_k and X_k are independent; ◮ the ε_k are identically distributed with P[ε^{(j)}_k = 1] > 0 for every j ∈ {1,...,p}. Then (x_k)_{k∈ℕ} converges weakly a.s. to an F-valued random variable. Proof: based on properties of stochastic quasi-Fejér sequences.

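A structural sketch of one iteration (added to this transcript): activating each block independently with a fixed probability is one admissible way to guarantee P[ε^{(j)}_k = 1] > 0 for every j (up to resampling the all-zero draw, which D excludes); the gradients, proxes and coupling operators L_{l,j} are abstracted into callables.

```python
import numpy as np

rng = np.random.default_rng(4)

def random_bcfb_step(x_blocks, partial_grads, proxes, gamma, lam, p_act=0.5):
    # One iteration sketch: eps_j ~ Bernoulli(p_act) independently, so
    # every block has a positive activation probability; only the
    # activated blocks perform their forward-backward update, the
    # others are kept unchanged. (The all-zero draw could be resampled.)
    eps = rng.random(len(x_blocks)) < p_act
    for j, active in enumerate(eps):
        if active:
            y_j = x_blocks[j] - gamma * partial_grads[j](x_blocks)
            x_blocks[j] += lam * (proxes[j](y_j, gamma) - x_blocks[j])
    return x_blocks
```
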
  67. How can the forward-backward algorithm be made efficient for large-scale optimization in the convex case? ⋆ Introduce stochasticity. ⋆ Use a primal-dual approach.

  68. Primal-dual problem. Let F be the set of solutions to the primal problem minimize_{x^{(1)}∈H₁, ..., x^{(p)}∈H_p} Σ_{j=1}^p ( f_j(x^{(j)}) + h_j(x^{(j)}) ) + Σ_{l=1}^q (g_l □ l_l)( Σ_{j=1}^p L_{l,j} x^{(j)} ), where, for every j ∈ {1,...,p} and l ∈ {1,...,q}: ◮ f_j ∈ Γ₀(H_j) and g_l ∈ Γ₀(G_l); ◮ h_j: H_j → ℝ is convex and μ_j⁻¹-Lipschitz differentiable, with μ_j ∈ ]0,+∞[; ◮ l_l ∈ Γ₀(G_l) is ν_l⁻¹-strongly convex, with ν_l ∈ ]0,+∞[; ◮ L_{l,j}: H_j → G_l is linear and bounded; ◮ 𝕃_l = { j ∈ {1,...,p} | L_{l,j} ≠ 0 } ≠ ∅ and 𝕃*_j = { l ∈ {1,...,q} | L_{l,j} ≠ 0 } ≠ ∅. Inf-convolution: g_l □ l_l: G_l → [−∞,+∞]: u ↦ inf_{v∈G_l} g_l(v) + l_l(u − v). ⋆ If l_l = ι_{{0}}, then g_l □ ι_{{0}} = g_l.

  69. Primal-dual problem (continued). Let F* be the set of solutions to the dual problem minimize_{v^{(1)}∈G₁, ..., v^{(q)}∈G_q} Σ_{j=1}^p (f*_j □ h*_j)( − Σ_{l=1}^q L*_{l,j} v^{(l)} ) + Σ_{l=1}^q ( g*_l(v^{(l)}) + l*_l(v^{(l)}) ). Conjugate function: f*_j: H_j → [−∞,+∞]: u ↦ sup_{x∈H_j} ⟨x | u⟩ − f_j(x). ◮ Assume that there exists (x̄^{(1)}, ..., x̄^{(p)}) ∈ H₁ × ... × H_p such that, for every j ∈ {1,...,p}, 0 ∈ ∂f_j(x̄^{(j)}) + ∇h_j(x̄^{(j)}) + Σ_{l=1}^q L*_{l,j} (∂g_l □ ∂l_l)( Σ_{j′=1}^p L_{l,j′} x̄^{(j′)} ). Objective: find an F × F*-valued random variable (x̂, v̂).

  70. Random block-coordinate proximal primal-dual algorithm. For k = 0, 1, ...: for l = 1, ..., q: u^{(l)}_k = ε^{(p+l)}_k ( prox_{U_l, g*_l}( v^{(l)}_k + U_l( Σ_{j∈𝕃_l} L_{l,j} x^{(j)}_k − ∇l*_l(v^{(l)}_k) + d^{(l)}_k ) ) + b^{(l)}_k ), v^{(l)}_{k+1} = v^{(l)}_k + λ_k ε^{(p+l)}_k ( u^{(l)}_k − v^{(l)}_k ); for j = 1, ..., p: y^{(j)}_k = ε^{(j)}_k ( prox_{W_j, f_j}( x^{(j)}_k − W_j( Σ_{l∈𝕃*_j} L*_{l,j}(2u^{(l)}_k − v^{(l)}_k) + ∇h_j(x^{(j)}_k) + c^{(j)}_k ) ) + a^{(j)}_k ), x^{(j)}_{k+1} = x^{(j)}_k + λ_k ε^{(j)}_k ( y^{(j)}_k − x^{(j)}_k ).

  71. Random block-coordinate proximal primal-dual algorithm (continued). ◮ (ε_k)_{k∈ℕ} are identically distributed D-valued random variables, with D = {0,1}^{p+q} \ {0}: binary variables signaling the blocks to be activated.

  72. Random block-coordinate proximal primal-dual algorithm (continued). ◮ x₀, (a_k)_{k∈ℕ} and (c_k)_{k∈ℕ} are H-valued random variables, and v₀, (b_k)_{k∈ℕ} and (d_k)_{k∈ℕ} are G-valued random variables, with H = H₁ × ... × H_p and G = G₁ × ... × G_q; (a_k), (b_k), (c_k) and (d_k) are error terms.

  73. Random block-coordinate proximal primal-dual algorithm (continued). ◮ For every j ∈ {1,...,p}, W_j: H_j → H_j and, for every l ∈ {1,...,q}, U_l: G_l → G_l are strongly positive self-adjoint preconditioning linear operators such that 1 − ( Σ_{j=1}^p Σ_{l=1}^q ‖U_l^{1/2} L_{l,j} W_j^{1/2}‖² )^{1/2} > ½ max{ (‖W_j‖/μ_j)_{1≤j≤p}, (‖U_l‖/ν_l)_{1≤l≤q} }.

  74. Random block-coordinate proximal primal-dual algorithm (continued). ◮ For every k ∈ ℕ, λ_k ∈ ]0,1], with inf_{k∈ℕ} λ_k > 0.

  75. Random block-coordinate proximal primal-dual algorithm (continued). Remark: ⋆ when p = 1, a randomized version of the primal-dual algorithm proposed in [Condat, 2013][Vũ, 2013] is obtained; ⋆ when p = 1 and ε^{(p+l)}_k ≡ 1 for every k ∈ ℕ, the (deterministic) primal-dual algorithm of [Condat, 2013][Vũ, 2013] is recovered.

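For reference, the deterministic special case recovered here is the Condat-Vũ iteration; a sketch added to this transcript, with the step sizes τ, σ left to the user (they must satisfy the usual condition involving ‖L‖ and the Lipschitz constant of ∇h):

```python
import numpy as np

def condat_vu(x0, v0, grad_h, prox_tau_f, prox_sigma_gstar, L, Lt,
              tau, sigma, n_iter):
    # Deterministic primal-dual iteration of [Condat, 2013][Vu, 2013]:
    #   x_{k+1} = prox_{tau f}( x_k - tau * (grad h(x_k) + L* v_k) )
    #   v_{k+1} = prox_{sigma g*}( v_k + sigma * L(2 x_{k+1} - x_k) )
    x, v = x0, v0
    for _ in range(n_iter):
        x_new = prox_tau_f(x - tau * (grad_h(x) + Lt(v)))
        v = prox_sigma_gstar(v + sigma * L(2 * x_new - x))
        x = x_new
    return x, v
```
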
  76. Random block-coordinate primal-dual algorithm: convergence. Let (Ω, F, P) be the underlying probability space and set, for every k ∈ ℕ, X_k = σ(x_{k′}, v_{k′})_{0≤k′≤k}. Assume that ◮ Σ_{k∈ℕ} E(‖a_k‖² | X_k) < +∞, Σ_{k∈ℕ} E(‖b_k‖² | X_k) < +∞, Σ_{k∈ℕ} E(‖c_k‖² | X_k) < +∞, and Σ_{k∈ℕ} E(‖d_k‖² | X_k) < +∞ a.s.; ◮ the variables (ε_k)_{k∈ℕ} are identically distributed with P[ε^{(j)}_0 = 1] > 0 for every j ∈ {1,...,p}; ◮ for every k ∈ ℕ, ε_k and X_k are independent; ◮ for every l ∈ {1,...,q} and k ∈ ℕ, ∪_{j∈𝕃_l} { ω ∈ Ω | ε^{(j)}_k(ω) = 1 } ⊂ { ω ∈ Ω | ε^{(p+l)}_k(ω) = 1 }. Then ◮ (x_k)_{k∈ℕ} converges weakly a.s. to an F-valued random variable, and ◮ (v_k)_{k∈ℕ} converges weakly a.s. to an F*-valued random variable.

  77. Random block-coordinate primal-dual algorithm: convergence (continued). The last assumption can be enforced by setting, for every l ∈ {1,...,q} and k ∈ ℕ, ε^{(p+l)}_k = max{ ε^{(j)}_k | 1 ≤ j ≤ p, l ∈ 𝕃*_j }; the same conclusions then hold.

  78. Random block-coordinate primal-dual algorithm (continued). ⋆ In the case when f_j ≡ 0 for every j ∈ {1,...,p}: For k = 0, 1, ...: for j = 1, ..., p: η^{(j)}_k = max{ ε^{(p+l)}_k | l ∈ 𝕃*_j }, s^{(j)}_k = η^{(j)}_k ( x^{(j)}_k − W_j ∇h_j(x^{(j)}_k) + a^{(j)}_k ), y^{(j)}_k = η^{(j)}_k ( s^{(j)}_k − W_j Σ_{l∈𝕃*_j} L*_{l,j} v^{(l)}_k ); for l = 1, ..., q: v^{(l)}_{k+1} = ε^{(p+l)}_k ( prox_{U_l⁻¹, g*_l}( v^{(l)}_k + U_l Σ_{j∈𝕃_l} L_{l,j} y^{(j)}_k − U_l ∇l*_l(v^{(l)}_k) + c^{(l)}_k ) + b^{(l)}_k ); for j = 1, ..., p: x^{(j)}_{k+1} = ε^{(j)}_k ( s^{(j)}_k − W_j Σ_{l∈𝕃*_j} L*_{l,j} v^{(l)}_{k+1} ). Weak a.s. convergence is then ensured under less restrictive conditions on the preconditioning operators (W_j)_{1≤j≤p} and (U_l)_{1≤l≤q}.

  79. Illustration of the random sampling strategy. Variable selection: for every k ∈ ℕ and j ∈ {1,...,6}, the block x^{(j)}_k is activated when ε^{(j)}_k = 1. How should the variable ε_k = (ε^{(1)}_k, ..., ε^{(6)}_k) be chosen for every k ∈ ℕ?

  80. Illustration of the random sampling strategy (continued). One possible distribution: P[ε_k = (1, 1, 0, 0, 0, 0)] = 0.1.

  81. Illustration of the random sampling strategy (continued). P[ε_k = (1, 0, 1, 0, 0, 0)] = 0.2.

  82. Illustration of the random sampling strategy (continued). P[ε_k = (1, 0, 0, 1, 1, 0)] = 0.2.

  83. Illustration of the random sampling strategy (continued). P[ε_k = (0, 1, 1, 1, 1, 1)] = 0.5. A sketch of how to sample from this distribution follows below.

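The distribution built up on slides 80-83 (the probabilities sum to 1, and each of the six blocks has a positive marginal activation probability) can be sampled directly; a sketch added to this transcript:

```python
import numpy as np

rng = np.random.default_rng(5)

# the four activation patterns of slides 80-83 and their probabilities
patterns = np.array([[1, 1, 0, 0, 0, 0],
                     [1, 0, 1, 0, 0, 0],
                     [1, 0, 0, 1, 1, 0],
                     [0, 1, 1, 1, 1, 1]])
probs = np.array([0.1, 0.2, 0.2, 0.5])

def draw_eps():
    # draw eps_k from the discrete distribution over the four patterns
    return patterns[rng.choice(len(patterns), p=probs)]

print(draw_eps())            # e.g. [0 1 1 1 1 1]
print(patterns.T @ probs)    # marginal P[eps_j = 1]: all positive
```
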
  84. Illustrating example. ⋆ 3D mesh denoising problem [ANR Graphsip].

  85. Valued graph. [Figure: a valued graph with vertices v^{(1)}, ..., v^{(8)} and edges e^{(1,2)}, e^{(2,3)}, e^{(2,4)}, e^{(3,6)}, e^{(5,6)}, e^{(6,7)}, e^{(7,8)}.] ⋆ V = { v^{(i)} | i ∈ {1,...,M} }: set of vertices ≡ objects, with v^{(i)} ∈ V ↔ i ∈ {1,...,M}. ⋆ E = { e^{(i,j)} | (i,j) ∈ 𝔼 }: set of edges ≡ object relationships, with e^{(i,j)} ∈ E ↔ (i,j) ∈ 𝔼.

  86. Valued graph (continued). ⋆ (x_i)_{1≤i≤M}: weights on the vertices (scalars or vectors). ⋆ The weights on a vertex ≡ a block. ⋆ M = p.

  87. 3D mesh denoising problem. [Figures: original mesh x and observed mesh z.] Undirected nonreflexive graph with V the set of vertices of the mesh and E its set of edges. Objective: estimate x = (x^{(i)})_{1≤i≤M} from noisy observations z = (z^{(i)})_{1≤i≤M}, where, for every i ∈ {1,...,M}, x^{(i)} ∈ ℝ³ is the vector of 3D coordinates of the i-th vertex of the mesh. ⋆ H = ℝ^{3M}.

  88. 3D mesh denoising problem (continued). Cost function: Φ(x) = Σ_{j=1}^M ( ψ_j(x^{(j)} − z^{(j)}) + ι_{C_j}(x^{(j)}) + η_j ‖(x^{(j)} − x^{(i)})_{i∈N_j}‖_{1,2} ), where, for every j ∈ {1,...,M}: ⋆ ψ_j: ℝ³ → ℝ is an ℓ2-ℓ1 Huber function, a robust data fidelity measure that is convex and Lipschitz differentiable; ⋆ C_j is a nonempty convex subset of ℝ³; ⋆ N_j is the neighborhood of the j-th vertex; ⋆ (η_j)_{1≤j≤M} are nonnegative regularization constants.

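A sketch of the two ingredients of this cost (added to the transcript; the Huber threshold ρ and the exact scaling are assumptions of the illustration, the slide only states "ℓ2-ℓ1 Huber function"):

```python
import numpy as np

def huber(t, rho=1.0):
    # l2-l1 Huber function of a 3D vector: quadratic near the origin,
    # linear in the tails; convex and Lipschitz differentiable, hence
    # a smooth and robust data fidelity measure (rho is assumed here)
    a = np.linalg.norm(t)
    return 0.5 * a**2 if a <= rho else rho * (a - 0.5 * rho)

def mixed_l12(diffs):
    # ||(d_i)_i||_{1,2}: the sum over the neighbours of the Euclidean
    # norms of the 3D difference vectors d_i = x^{(j)} - x^{(i)}
    return np.sum(np.linalg.norm(diffs, axis=1))

x_j, z_j = np.array([0.1, 0.2, 0.0]), np.zeros(3)
neigh_diffs = np.array([[0.1, 0.0, 0.0], [0.0, 0.2, 0.1]])
print(huber(x_j - z_j), mixed_l12(neigh_diffs))
```
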
  89. 3D mesh denoising problem (continued). Implementation details: a block ≡ a vertex, so p = M = |V|. Case 1: q = M; for every j ∈ {1,...,M}: ⋆ h_j = ψ_j(· − z^{(j)}), ⋆ f_j = ι_{C_j}; and, for every l ∈ {1,...,M} and x ∈ ℝ^{3M}: ⋆ g_l(L_l x) = ‖(x^{(l)} − x^{(i)})_{i∈N_l}‖_{1,2}, ⋆ l_l = ι_{{0}}. Case 2: q = 2M (f_j ≡ 0); for every j ∈ {1,...,M}: ⋆ h_j = ψ_j(· − z^{(j)}); and, for every l ∈ {1,...,M} and x ∈ ℝ^{3M}: ⋆ g_l(L_l x) = ‖(x^{(l)} − x^{(i)})_{i∈N_l}‖_{1,2}, ⋆ g_{M+l}(L_{M+l} x) = ι_{C_l}(x^{(l)}), ⋆ l_l = ι_{{0}}.

  90. Simulation results (algorithm type I). ⋆ V = V₁ ∪ V₂ with |V₁| ≫ |V₂|. ⋆ Additive independent noise: N(0, σ₁²) on V₁, and the mixture π N(0, σ₂²) + (1 − π) N(0, (σ₂′)²) on V₂, with π ∈ (0, 1). ⋆ Variable activation: for every j ∈ {1,...,M} and k ∈ ℕ, P(ε^{(j)}_k = 1) = p if j ∈ V₁, and 1 otherwise. [Figures: original mesh, M = 22998, with |V₁| = 18492 and |V₂| = 4506; noisy mesh, MSE = 1.08 × 10⁻⁶.]

  91. Simulation results (algorithm type I, continued). [Figures: proposed reconstruction, MSE = 2.19 × 10⁻⁷; Laplacian smoothing, MSE = 2.95 × 10⁻⁷.]

  92. Complexity (algorithm type I). [Figure: computational cost C(p) as a function of the activation probability p ∈ [0.1, 1].]

  93. Simulation results (algorithm type II). ⋆ The positions of the original mesh are corrupted by an i.i.d. zero-mean Gaussian mixture noise model. ⋆ A limited number r of variables can be handled at each iteration: Σ_{j=1}^p ε^{(j)}_k = r ≤ p. ⋆ The mesh is decomposed into p/r non-overlapping blocks. [Figures: original mesh, M = 100250; noisy mesh, MSE = 2.89 × 10⁻⁶.]

  94. Simulation results (algorithm type II, continued). [Figures: proposed reconstruction, MSE = 8.09 × 10⁻⁸; Laplacian smoothing, MSE = 5.23 × 10⁻⁷.]

  95. Complexity behavior (algorithm type II). [Figure: reconstruction time in seconds (continuous line) and required memory in Mb (dashed line) as functions of the number of blocks p/r.]

  96. Conclusion. Variable metric forward-backward algorithm: ⋆ choice of the preconditioning matrices based on an MM strategy; ⋆ convergence results in the nonconvex case. Extension to a block-coordinate version: ⋆ choice of the blocks according to a noncyclic updating rule.

  97. Conclusion (continued). Stochastic primal-dual algorithms: ⋆ variable splitting (block-coordinate algorithms); ⋆ function splitting; ⋆ hypergraph structure for distributed optimization; ⋆ almost sure convergence results.

  98. Conclusion (continued). Applications: ⋆ phase reconstruction in the context of tomography; ⋆ blind deconvolution of seismic data [collaboration with L. Duval from IFPEN and M. Q. Pham]; ⋆ 3D mesh denoising [ANR Graphsip].

  100. References. ◮ E. Chouzenoux, J.-C. Pesquet, and A. Repetti. Variable metric forward-backward algorithm for minimizing the sum of a differentiable function and a convex function. Journal of Optimization Theory and Applications, vol. 162, no. 1, pp. 107-132, July 2014. ◮ E. Chouzenoux, J.-C. Pesquet, and A. Repetti. A block coordinate variable metric forward-backward algorithm. Accepted for publication in Journal of Global Optimization, 2015. ◮ A. Repetti, M. Q. Pham, L. Duval, E. Chouzenoux, and J.-C. Pesquet. Euclid in a taxicab: sparse blind deconvolution with smoothed ℓ1/ℓ2 regularization. IEEE Signal Processing Letters, vol. 22, no. 5, pp. 539-543, May 2015. ◮ A. Repetti, E. Chouzenoux, and J.-C. Pesquet. A nonconvex regularized approach for phase retrieval. In Proceedings of ICIP, pp. 1753-1757, Paris, France, 27-30 Oct. 2014.

  101. References (continued). ◮ J.-C. Pesquet and A. Repetti. A class of randomized primal-dual algorithms for distributed optimization. Journal of Nonlinear and Convex Analysis, vol. 16, no. 12, Dec. 2015. ◮ A. Repetti, E. Chouzenoux, and J.-C. Pesquet. A random block-coordinate primal-dual proximal algorithm with application to 3D mesh denoising. In Proceedings of ICASSP, Brisbane, Australia, 21-25 Apr. 2015. ◮ A. Repetti, E. Chouzenoux, and J.-C. Pesquet. A parallel block-coordinate approach for primal-dual splitting with arbitrary random block selection. In Proceedings of EUSIPCO, Nice, France, Sept. 2015.

  102. Open questions. ⋆ How can stochastic approaches be improved so as to avoid useless communications between nodes/machines? ⋆ Can the MM approach be adapted to choose the preconditioning matrices in primal-dual methods? ⋆ Can the convergence of primal-dual algorithms be proved in a nonconvex context? ⋆ How can the convergence of nonconvex block-coordinate methods be proved under random updating rules?

  103. Thank you for your attention.