
GRAPHADON 2024

Course on graph signal processing at the GRAPHADON summer school

Olivier Lézoray

September 15, 2024

Transcript

  1. GRAPH SIGNAL PROCESSING: FROM IMAGES TO ARBITRARY GRAPHS Olivier

    Lézoray Normandie Univ, UNICAEN, ENSICAEN, CNRS, GREYC, Caen, FRANCE [email protected] https://lezoray.users.greyc.fr
  2. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 2 / 96
  3. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 3 / 96
  4. The data deluge - Graphs everywhere With the data deluge,

    graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 4 / 96
  5. The data deluge - Graphs everywhere With the data deluge,

    graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data. Images (grid graphs), Image partitions (superpixel graphs) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 4 / 96
  6. The data deluge - Graphs everywhere With the data deluge,

    graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data. Meshes, 3D point clouds Olivier Lézoray Graph signal processing: from images to arbitrary graphs 4 / 96
  7. The data deluge - Graphs everywhere With the data deluge,

    graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data. Social Networks: Facebook, LinkedIn Olivier Lézoray Graph signal processing: from images to arbitrary graphs 4 / 96
  8. The data deluge - Graphs everywhere With the data deluge,

    graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data. Biological Networks, Brain Graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 4 / 96
  9. The data deluge - Graphs everywhere With the data deluge,

    graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data. Mobility networks: NYC Taxi, Velo'V Lyon Olivier Lézoray Graph signal processing: from images to arbitrary graphs 4 / 96
  10. Graph signals ▶ We consider discrete domains Ω (2D: images,

    3D: meshes, nD: manifolds) represented by graphs G = (V, E) carrying multivariate signals f : G → R^n ▶ Graphs can be directed or undirected, and carry weights on edges. Their topology is arbitrary. f1 : G1 → R^{n=3}, f2 : G2 → R^{n=3}, f3 : G3 → R^{n=21×21}, f4 : G4 → R^{n=*} Olivier Lézoray Graph signal processing: from images to arbitrary graphs 5 / 96
  11. Scientific issues Usual ways of processing data from graphs ▶

    Graph theory ➲ spectral analysis (for data processing: proximity graphs built from data) ▶ Variational and morphological methods (for signal and image processing: Euclidean graphs imposed by the domain) ▶ Emergence of a new research field called Graph Signal Processing (GSP) ▶ Objective: development of algorithms to process data that reside on the vertices (or edges) of a graph: signals on graphs ▶ Problem: how to process general (non-Euclidean) graphs with signal processing techniques? ▶ Many recent works aim at extending signal and image processing tools to graph-based signal processing: the same algorithm for any kind of graph signal! Olivier Lézoray Graph signal processing: from images to arbitrary graphs 6 / 96
  12. Graph signal processing: a very active field References

    ▶ David I. Shuman, Sunil K. Narang, Pascal Frossard, Antonio Ortega, Pierre Vandergheynst, The Emerging Field of Signal Processing on Graphs: Extending High-Dimensional Data Analysis to Networks and Other Irregular Domains. IEEE Signal Process. Mag. 30(3): 83-98, 2013. ▶ A. Ortega, P. Frossard, J. Kovačević, J. M. F. Moura and P. Vandergheynst, Graph Signal Processing: Overview, Challenges, and Applications, Proceedings of the IEEE, 106(5): 808-828, 2018. ▶ W. Hu, J. Pang, X. Liu, D. Tian, C.-W. Lin and A. Vetro, Graph Signal Processing for Geometric Data and Beyond: Theory and Applications, IEEE Transactions on Multimedia, 2021. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 7 / 96
  13. What is it for? Problems : from low to high

    levels ▶ Compression: ➲ Wavelets for signals on graphs ▶ Completion: ➲ Inpainting of signals on graphs ▶ Denoising: ➲ Filtering of signals on graphs ▶ Manipulation: ➲ Enhancement of signals on graphs ▶ Segmentation: ➲ Partitioning of signals on graphs ▶ Classification: ➲ Recognize graph signal types Olivier Lézoray Graph signal processing: from images to arbitrary graphs 8 / 96
  14. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 9 / 96
  15. Image filtering The basic ingredient of image filtering: convolution. [Figure: the 3 × 3 neighborhood of pixel (i, j).]

    I′(i, j) = Σ_{di=−v}^{v} Σ_{dj=−v}^{v} I(i + di, j + dj) ⊗ filter(di, dj) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 10 / 96
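A minimal NumPy sketch of the windowed filtering formula above, assuming zero padding at the image borders; the names I, filt, and filter_image are illustrative and not from the slides.

```python
# Minimal NumPy sketch of the windowed filtering formula above:
# I'(i, j) = sum_{di,dj in [-v, v]} I(i+di, j+dj) * filt(di, dj).
# Names (I, filt, filter_image) are illustrative, not from the original slides.
import numpy as np

def filter_image(I, filt):
    """Apply a (2v+1) x (2v+1) filter to a 2D image (zero padding at borders)."""
    v = filt.shape[0] // 2
    padded = np.pad(I, v, mode="constant")
    out = np.zeros_like(I, dtype=float)
    for di in range(-v, v + 1):
        for dj in range(-v, v + 1):
            # Shifted copy of the image weighted by the filter coefficient filt(di, dj).
            out += filt[di + v, dj + v] * padded[v + di:v + di + I.shape[0],
                                                 v + dj:v + dj + I.shape[1]]
    return out

# Example: 3 x 3 box (mean) filter, i.e. the "same weights everywhere" case.
I = np.random.rand(64, 64)
box = np.ones((3, 3)) / 9.0
I_filtered = filter_image(I, box)
```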
  16. Same weights everywhere Apply a filter with fixed weights Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 11 / 96
  17. Data-adaptive weights The weights depend on the image content Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 12 / 96
  18. How to define these weights ? Classical Gaussian filters: Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 13 / 96
  19. How to define these weights ? Data-dependent weights: Olivier L´

    ezoray Graph signal processing : from images to arbitrary graphs 14 / 96
  20. Examples of data-dependent filters The bilateral and the non-local means

    Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 16 / 96
  21. The image as a graph ▶ The matrix W is

    a representation of the image as a graph ▶ The construction of the graph implies local or non-local filtering Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 18 / 96
  22. The image as a graph ▶ The matrix W is

    a representation of the image as a graph ▶ The construction of the graph implies local or non-local filtering Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 18 / 96
  23. The image as a graph ▶ The matrix W is

    a representation of the image as a graph ▶ The construction of the graph implies local or non-local filtering Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 18 / 96
  24. Image filtering ▶ Filtering a noisy image : ▶ The

    clean image can be recovered by kernel regression (Takeda, Farsiu, Milanfar, ’07): ẑ_i = arg min_{z_i} Σ_{j=1}^{n} (y_j − z_i)² K_ij ▶ Depending on the kernel K, a local or non-local filtering is performed ▶ The solution is obtained by a Nadaraya-Watson kernel regression: ẑ_i = ( Σ_j K_ij y_j ) / ( Σ_j K_ij ) = w_i^T y Olivier Lézoray Graph signal processing: from images to arbitrary graphs 19 / 96
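A small sketch of the Nadaraya-Watson estimate ẑ_i = (Σ_j K_ij y_j)/(Σ_j K_ij) with a bilateral-like kernel; the kernel choice, the parameter values, and the names (bilateral_weights, nadaraya_watson) are illustrative assumptions, not the slides' exact filters.

```python
# Sketch of the Nadaraya-Watson estimate with a bilateral-like kernel
# (spatial Gaussian x radiometric Gaussian). Dense kernel matrix for clarity.
import numpy as np

def bilateral_weights(coords, y, sigma_s=2.0, sigma_r=0.1):
    """K_ij = exp(-|x_i - x_j|^2 / sigma_s^2) * exp(-|y_i - y_j|^2 / sigma_r^2)."""
    d_spatial = np.sum((coords[:, None, :] - coords[None, :, :]) ** 2, axis=-1)
    d_range = (y[:, None] - y[None, :]) ** 2
    return np.exp(-d_spatial / sigma_s**2) * np.exp(-d_range / sigma_r**2)

def nadaraya_watson(K, y):
    """Row-normalized kernel smoothing: z_i = (sum_j K_ij y_j) / (sum_j K_ij)."""
    return (K @ y) / K.sum(axis=1)

# 1D toy signal sampled on a line.
coords = np.arange(100, dtype=float)[:, None]
y = np.sin(coords[:, 0] / 10.0) + 0.1 * np.random.randn(100)
z_hat = nadaraya_watson(bilateral_weights(coords, y), y)
```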
  25. Link with the Graph Laplacian ▶ The graph Laplacian at

    a pixel z_i is defined as: L(z_i) = Σ_j K_ij (z_i − z_j). This measures the image smoothness at z_i ▶ And can be rewritten as: L(z_i) = z_i Σ_j K_ij − Σ_j K_ij z_j ▶ Enforce smoothness: L(z_i) = 0 → z_i = ( Σ_j K_ij z_j ) / ( Σ_j K_ij ) = w_i^T z. P. Milanfar, "A Tour of Modern Image Filtering", 2013 Olivier Lézoray Graph signal processing: from images to arbitrary graphs 20 / 96
  26. Image Processing & Graphs ▶ Image filtering can be seen

    as a data-dependent kernel regression on a graph ▶ The topology of the graph implies local or non-local processing ▶ The weights of the graph imply a data-dependent processing ▶ However, images are very specific data organized in a grid on a Euclidean domain ▶ Can we generalize this to signals defined on non-Euclidean domains ? ▶ Can we also generalize other image processing tasks (segmentation, interpolation) to signals on arbitrary graphs ? Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 21 / 96
  27. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 22 / 96
  28. Weighted graphs Basics ▶ A weighted graph G = (V,

    E, w) consists of a finite set V = {v_1, . . . , v_N} of N vertices Olivier Lézoray Graph signal processing: from images to arbitrary graphs 23 / 96
  29. Weighted graphs Basics ▶ A weighted graph G = (V,

    E, w) consists of a finite set V = {v_1, . . . , v_N} of N vertices ▶ and a finite set E = {e_1, . . . , e_{N′}} ⊂ V × V of N′ weighted edges. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 23 / 96
  30. Weighted graphs Basics ▶ A weighted graph G = (V,

    E, w) consists of a finite set V = {v_1, . . . , v_N} of N vertices ▶ and a finite set E = {e_1, . . . , e_{N′}} ⊂ V × V of N′ weighted edges. ▶ e_ij = (v_i, v_j) is the edge of E that connects vertices v_i and v_j of V. Its weight, denoted by w_ij = w(v_i, v_j), represents the similarity between its vertices. ▶ Similarities are usually computed by using a positive symmetric function w : V × V → R+ satisfying w(v_i, v_j) = 0 if (v_i, v_j) ∉ E. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 23 / 96
  31. Weighted graphs Basics ▶ A weighted graph G = (V,

    E, w) consists of a finite set V = {v_1, . . . , v_N} of N vertices ▶ and a finite set E = {e_1, . . . , e_{N′}} ⊂ V × V of N′ weighted edges. ▶ e_ij = (v_i, v_j) is the edge of E that connects vertices v_i and v_j of V. Its weight, denoted by w_ij = w(v_i, v_j), represents the similarity between its vertices. ▶ Similarities are usually computed by using a positive symmetric function w : V × V → R+ satisfying w(v_i, v_j) = 0 if (v_i, v_j) ∉ E. ▶ The notation v_i ∼ v_j is used to denote two adjacent vertices. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 23 / 96
  32. Space of functions on Graphs ▶ H(V) and H(E) are

    the Hilbert spaces of graph signals: real-valued functions defined on the vertices or the edges of a graph G. ▶ A function f : V → R of H(V) assigns a real value x_i = f(v_i) to v_i ∈ V. ▶ By analogy with functional analysis on continuous spaces, the integral of a function f ∈ H(V), over the set of vertices V, is defined as ∫_V f = Σ_{v_i∈V} f(v_i) ▶ Both spaces H(V) and H(E) are endowed with the usual inner products: ⟨f, h⟩_{H(V)} = Σ_{v_i∈V} f(v_i) h(v_i), where f, h : V → R, and ⟨F, H⟩_{H(E)} = Σ_{v_i∈V} Σ_{v_j∼v_i} F(v_i, v_j) H(v_i, v_j), where F, H : E → R Olivier Lézoray Graph signal processing: from images to arbitrary graphs 24 / 96
  33. Difference operators on weighted graphs ➲ Discrete analogue on graphs

    of classical continuous differential geometry. The difference operator of f, d_w : H(V) → H(E), is defined on an edge e_ij = (v_i, v_j) ∈ E by: (d_w f)(e_ij) = (d_w f)(v_i, v_j) = w(v_i, v_j)^{1/2} (f(v_j) − f(v_i)). (1) The adjoint of the difference operator, d*_w : H(E) → H(V), is a linear operator defined by ⟨d_w f, H⟩_{H(E)} = ⟨f, d*_w H⟩_{H(V)} and expressed by (d*_w H)(v_i) = −div_w(H)(v_i) = Σ_{v_j∼v_i} w(v_i, v_j)^{1/2} (H(v_j, v_i) − H(v_i, v_j)). (2) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 25 / 96
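A dense-matrix sketch of the difference operator (1) and its adjoint (2), with a numerical check of the adjunction ⟨d_w f, H⟩ = ⟨f, d*_w H⟩; the matrix storage and all names are illustrative assumptions.

```python
# Sketch of the weighted difference operator (1) and its adjoint (2) on a graph
# given by a symmetric weight matrix W (W_ij = 0 when there is no edge).
import numpy as np

def difference(W, f):
    """(d_w f)(v_i, v_j) = w_ij^{1/2} (f(v_j) - f(v_i)), returned as an N x N array."""
    return np.sqrt(W) * (f[None, :] - f[:, None])

def adjoint(W, H):
    """(d_w* H)(v_i) = sum_j w_ij^{1/2} (H(v_j, v_i) - H(v_i, v_j))."""
    return np.sum(np.sqrt(W) * (H.T - H), axis=1)

# Check the adjunction <d_w f, H> = <f, d_w* H> numerically on a random graph.
rng = np.random.default_rng(0)
W = rng.random((5, 5)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
f, H = rng.random(5), rng.random((5, 5))
lhs = np.sum(difference(W, f) * H)
rhs = np.sum(f * adjoint(W, H))
assert np.isclose(lhs, rhs)
```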
  34. Difference operators on weighted graphs The directional derivative (or edge

    derivative) of f, at a vertex v_i ∈ V, along an edge e_ij = (v_i, v_j), is defined as ∂f/∂e_ij |_{v_i} = ∂_{v_j} f(v_i) = (d_w f)(v_i, v_j) = w(v_i, v_j)^{1/2} (f(v_j) − f(v_i)). Weighted finite differences correspond to forward differences on grid-graph images with w(v_i, v_j) = 1/h², where h is the discretization step: ∂f/∂x (v_i) = ( f(v_i + h) − f(v_i) ) / h Olivier Lézoray Graph signal processing: from images to arbitrary graphs 26 / 96
  35. Weighted gradient operator The weighted gradient operator of a function

    f ∈ H(V), at a vertex v_i ∈ V, is the vector operator defined by (∇_w f)(v_i) = [(d_w f)(v_i, v_j) : v_j ∈ V]^T. (3) ➲ The gradient considers all vertices v_j ∈ V and not only v_j ∼ v_i. The L_p norm of this vector represents the local variation of the function f at a vertex of the graph (it is a semi-norm for p ≥ 1): ∥(∇_w f)(v_i)∥_p = ( Σ_{v_j∼v_i} w_ij^{p/2} |f(v_j) − f(v_i)|^p )^{1/p}. (4) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 27 / 96
  36. Isotropic p-Laplacian The weighted p-Laplace isotropic operator of a function

    f ∈ H(V), noted Δ^i_{w,p} : H(V) → H(V), is defined by: (Δ^i_{w,p} f)(v_i) = (1/2) d*_w ( ∥(∇_w f)(v_i)∥_2^{p−2} (d_w f)(v_i, v_j) ). (5) The isotropic p-Laplace operator of f ∈ H(V), at a vertex v_i ∈ V, can be computed by: (Δ^i_{w,p} f)(v_i) = (1/2) Σ_{v_j∼v_i} (γ^i_{w,p} f)(v_i, v_j) (f(v_i) − f(v_j)), (6) with (γ^i_{w,p} f)(v_i, v_j) = w_ij ( ∥(∇_w f)(v_j)∥_2^{p−2} + ∥(∇_w f)(v_i)∥_2^{p−2} ). (7) The isotropic p-Laplace operator is nonlinear, except for p = 2 (it corresponds to the combinatorial Laplacian). For p = 1, it corresponds to the weighted curvature of the function f on the graph. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 28 / 96
  37. Anisotropic p-Laplacian The weighted p-Laplace anisotropic operator of a function

    f ∈ H(V), noted Δ^a_{w,p} : H(V) → H(V), is defined by: (Δ^a_{w,p} f)(v_i) = (1/2) d*_w ( |(d_w f)(v_i, v_j)|^{p−2} (d_w f)(v_i, v_j) ). (8) The anisotropic p-Laplace operator of f ∈ H(V), at a vertex v_i ∈ V, can be computed by: (Δ^a_{w,p} f)(v_i) = Σ_{v_j∼v_i} (γ^a_{w,p} f)(v_i, v_j) (f(v_i) − f(v_j)), (9) with (γ^a_{w,p} f)(v_i, v_j) = w_ij^{p/2} |f(v_i) − f(v_j)|^{p−2}. (10) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 29 / 96
  38. Constructing graphs Any discrete domain can be modeled by a

    weighted graph where each data point is represented by a vertex v_i ∈ V. Unorganized data An unorganized set of points V ⊂ R^n can be seen as a function f0 : V → R^m. The set of edges is defined by modeling the neighborhood of each vertex based on similarity relationships between feature vectors. Typical graphs: k-nearest neighbor graphs and ϵ-neighborhood graphs (see the sketch below). Organized data Typical cases of organized data are signals, gray-scale or color images (in 2D or 3D). The set of edges is defined by spatial relationships. Such data can be seen as functions f0 : V ⊂ Z^n → R^m. Typical graphs: image grid graphs, region graphs, 3D meshes. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 30 / 96
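A sketch of one typical construction mentioned above, a k-nearest-neighbor graph with Gaussian edge weights; the use of scipy's cKDTree and the parameter values are assumptions for illustration.

```python
# Sketch of a k-nearest-neighbor graph on unorganized data: each point becomes a
# vertex and is connected to its k nearest neighbors in feature space, with
# Gaussian edge weights. Names and parameter values are illustrative.
import numpy as np
from scipy.spatial import cKDTree

def knn_graph(X, k=10, sigma=1.0):
    """Return a symmetric weight matrix W for the k-NN graph of the rows of X."""
    n = X.shape[0]
    tree = cKDTree(X)
    dist, idx = tree.query(X, k=k + 1)          # first neighbor is the point itself
    W = np.zeros((n, n))
    for i in range(n):
        for d, j in zip(dist[i, 1:], idx[i, 1:]):
            w = np.exp(-(d ** 2) / (sigma ** 2))
            W[i, j] = max(W[i, j], w)           # symmetrize: keep the edge both ways
            W[j, i] = max(W[j, i], w)
    return W

X = np.random.rand(200, 3)                      # e.g. a 3D point cloud
W = knn_graph(X, k=10, sigma=0.2)
```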
  39. Weighting graphs For an initial function f0 : V →

    R^m, similarity relationships between data can be incorporated within edge weights according to a measure of similarity g : E → [0, 1] with w(e_ij) = g(e_ij), ∀e_ij ∈ E. Each vertex v_i is associated with a feature vector F^{f0}_τ : V → R^{m×q} where q corresponds to this vector size: F^{f0}_τ(v_i) = [ f0(v_j) : v_j ∈ N_τ(v_i) ∪ {v_i} ]^T (11) with N_τ(v_i) = { v_j ∈ V \ {v_i} : µ(v_i, v_j) ≤ τ }. For an edge e_ij and a distance measure ρ : R^{m×q} × R^{m×q} → R associated to F^{f0}_τ, we can have: g1(e_ij) = 1 (unweighted case), g2(e_ij) = exp( −ρ(F^{f0}_τ(v_i), F^{f0}_τ(v_j))² / σ² ) with σ > 0, g3(e_ij) = 1 / ( 1 + ρ(F^{f0}_τ(v_i), F^{f0}_τ(v_j)) ) (12) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 31 / 96
  40. Graph topology Digital Image 8-neighborhood: 3 × 3 24-neighborhood: 5

    × 5 A value is associated with the vertices. A patch is the vector of values in a given neighborhood of a vertex. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 32 / 96
  41. With Graphs With Graphs ▶ Nonlocal behavior is directly expressed

    by the graph topology. ▶ Patches are used to measure similarity between vertices. Consequences ▶ Nonlocal processing of images becomes local processing on similarity graphs. Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 33 / 96
  42. With Graphs Examples of graphs for an image. From left

    to right: original image, a symmetric 8-grid graph, 10-nearest neighbor graphs (inside an 11 × 11 window with color-based or 3 × 3 patch-based distances). Olivier Lézoray Graph signal processing: from images to arbitrary graphs 34 / 96
  43. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 35 / 96
  44. p-Laplacian nonlocal regularization on graphs Let f0 : V →

    R be the noisy version of a clean graph signal g : V → R defined on the vertices of a weighted graph G = (V, E, w). To recover g, we seek a function f : V → R that is regular enough on G and close enough to f0, with the following variational problem: g ≈ min_{f:V→R} E*_{w,p}(f, f0, λ) = R*_{w,p}(f) + (λ/2) ∥f − f0∥²_2, (13) where the regularization functional R*_{w,p} : H(V) → R can correspond to an isotropic R^i_{w,p} or an anisotropic R^a_{w,p} functional. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 36 / 96
  45. Isotropic and anisotropic regularization terms The isotropic regularization functional

    R^i_{w,p} is defined by the L_2 norm of the gradient and is the discrete p-Dirichlet form of the function f ∈ H(V): R^i_{w,p}(f) = (1/p) Σ_{v_i∈V} ∥(∇_w f)(v_i)∥^p_2 = (1/p) ⟨f, Δ^i_{w,p} f⟩_{H(V)} = (1/p) Σ_{v_i∈V} ( Σ_{v_j∼v_i} w_ij (f(v_j) − f(v_i))² )^{p/2}. (14) The anisotropic regularization functional R^a_{w,p} is defined by the L_p norm of the gradient: R^a_{w,p}(f) = (1/p) Σ_{v_i∈V} ∥(∇_w f)(v_i)∥^p_p = (1/p) ⟨f, Δ^a_{w,p} f⟩_{H(V)} = (1/p) Σ_{v_i∈V} Σ_{v_j∼v_i} w_ij^{p/2} |f(v_j) − f(v_i)|^p. (15) When p ≥ 1, the energy E*_{w,p} is a convex functional of functions of H(V). Olivier Lézoray Graph signal processing: from images to arbitrary graphs 37 / 96
  46. Isotropic/Anisotropic diffusion processes To get the solution of the minimizer,

    we consider the following system of equations: ∂E*_{w,p}(f, f0, λ) / ∂f(v_i) = 0, ∀v_i ∈ V (16) which is rewritten as: ∂R*_{w,p}(f) / ∂f(v_i) + λ(f(v_i) − f0(v_i)) = 0, ∀v_i ∈ V. (17) Moreover, we can prove that ∂R^i_{w,p}(f) / ∂f(v_i) = 2(Δ^i_{w,p} f)(v_i) and ∂R^a_{w,p}(f) / ∂f(v_i) = (Δ^a_{w,p} f)(v_i). (18) The system of equations is then rewritten as ( λ + Σ_{v_j∼v_i} (γ*_{w,p} f)(v_i, v_j) ) f(v_i) − Σ_{v_j∼v_i} (γ*_{w,p} f)(v_i, v_j) f(v_j) = λ f0(v_i). (19) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 38 / 96
  47. Isotropic/Anisotropic diffusion processes We can use the linearized Gauss-Jacobi iterative

    method to solve the previous systems. Let n be an iteration step, and let f^{(n)} be the solution at step n. Then, the method is given by the following algorithm: f^{(0)} = f0, f^{(n+1)}(v_i) = ( λ f0(v_i) + Σ_{v_j∼v_i} (γ*_{w,p} f^{(n)})(v_i, v_j) f^{(n)}(v_j) ) / ( λ + Σ_{v_j∼v_i} (γ*_{w,p} f^{(n)})(v_i, v_j) ), ∀v_i ∈ V, (20) with (γ^i_{w,p} f)(v_i, v_j) = w_ij ( ∥(∇_w f)(v_j)∥_2^{p−2} + ∥(∇_w f)(v_i)∥_2^{p−2} ), (21) and (γ^a_{w,p} f)(v_i, v_j) = w_ij^{p/2} |f(v_i) − f(v_j)|^{p−2}. (22) It describes a family of discrete diffusion processes parameterized by the structure of the graph (topology and weight function), the parameter p, and the parameter λ. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 39 / 96
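A sketch of the Gauss-Jacobi diffusion (20) with the anisotropic coefficient (22); the small eps that regularizes |f(v_i) − f(v_j)|^{p−2} when p < 2 is an implementation convenience not stated on the slide, and all names are illustrative.

```python
# Sketch of the linearized Gauss-Jacobi diffusion (20) with the anisotropic
# coefficient (22): gamma_ij = w_ij^{p/2} |f(v_i) - f(v_j)|^{p-2}.
# Dense matrices and illustrative names.
import numpy as np

def anisotropic_diffusion(W, f0, lam=1.0, p=1.0, n_iter=100, eps=1e-6):
    f = f0.copy().astype(float)
    for _ in range(n_iter):
        diff = np.abs(f[:, None] - f[None, :])
        # eps regularizes the p-2 power when p < 2 (implementation detail).
        gamma = np.where(W > 0, W ** (p / 2.0) * (diff + eps) ** (p - 2.0), 0.0)
        num = lam * f0 + gamma @ f
        den = lam + gamma.sum(axis=1)
        f = num / den
    return f

# Toy example: denoise a piecewise-constant signal on a path graph.
n = 100
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0
clean = np.repeat([0.0, 1.0], n // 2)
f0 = clean + 0.2 * np.random.randn(n)
f_denoised = anisotropic_diffusion(W, f0, lam=0.5, p=1.0, n_iter=200)
```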
  48. Different kinds of regularizers Depending on p we can obtain

    different kinds of regularizers. Let's start with p = 2 ▶ R^i_{w,2}(f) = (1/2) Σ_{v_i∈V} Σ_{v_j∼v_i} w_ij (f(v_j) − f(v_i))² = f^T L f. This is the Graph Laplacian Regularizer (GLR) with L = D − W. ▶ The signal is smooth if the GLR is small: for large edge weights, f(v_j) and f(v_i) are similar; for small edge weights, they can differ significantly. ▶ It also possesses an interpretation in the frequency domain: f^T L f = Σ_{k=1}^{N} λ_k f̂_k², where λ_k is the k-th eigenvalue and f̂_k = (V^T f)_k the k-th GFT (Graph Fourier Transform) coefficient (L = VΣV^T). The GLR is small if the signal energy is occupied by low-frequency components. ▶ The regularization is linear and has a closed-form solution: f* = (λI + L)^{−1} λ f0 (Zhou, Schölkopf, ’06); it performs adaptive low-pass filtering. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 40 / 96
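Since the p = 2 case reduces to a sparse linear system, here is a sketch of GLR denoising as a direct solve; the constant c in front of L reflects the normalization ambiguity of the closed form quoted above and is an assumption, as are the names.

```python
# Sketch of Graph Laplacian Regularization as a sparse linear solve: with
# L = D - W, minimizing f^T L f + (lambda/2) ||f - f0||^2 leads to a symmetric
# positive-definite system of the form (lambda I + c L) f = lambda f0, where the
# constant c depends on the chosen normalization.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def glr_denoise(W, f0, lam=1.0, c=2.0):
    W = sp.csr_matrix(W)
    d = np.asarray(W.sum(axis=1)).ravel()
    L = sp.diags(d) - W                          # combinatorial Laplacian
    A = lam * sp.identity(W.shape[0]) + c * L
    return spsolve(A.tocsc(), lam * f0)          # adaptive low-pass filtering of f0

# Usage on the path-graph example above:
# f_glr = glr_denoise(W, f0, lam=0.5)
```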
  49. Different kinds of regularizers Now p = 1 ! ▶

    R^i_{w,1}(f) = Σ_{v_i∈V} ( Σ_{v_j∼v_i} w_ij (f(v_j) − f(v_i))² )^{1/2}, called the isotropic Graph Total Variation (GTV) ▶ R^a_{w,1}(f) = Σ_{v_i∈V} Σ_{v_j∼v_i} w_ij^{1/2} |f(v_j) − f(v_i)|, called the anisotropic Graph Total Variation (GTV) ▶ The GTV is a stronger piecewise-smooth prior than the GLR ▶ No closed-form solution since the GTV is non-differentiable ▶ Can be solved more efficiently than with the iterative Gauss-Jacobi process using primal-dual algorithms (Chambolle-Pock, ADMM) or Cut Pursuit (to be presented) Note: In the GLR and GTV, the graph weights are fixed. There exist other priors where the weights are updated during the minimization (from works of Gene Cheung), called Reweighted GLR and GTV. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 41 / 96
  50. Examples: Image denoising Original image Noisy image (Gaussian noise with

    σ = 15) f0 : V → R3 PSNR=29.38dB Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 42 / 96
  51. Examples: Image denoising. Columns: Isotropic G_1 (F^{f0}_0 = f0), Isotropic

    G_7 (F^{f0}_3), Anisotropic G_7 (F^{f0}_3). Row p = 2: PSNR = 28.52 dB, 31.79 dB, 31.79 dB. Row p = 1: PSNR = 31.25 dB, 34.74 dB, 31.81 dB. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 43 / 96
  52. Examples: Mesh simplification Original Mesh Isotropic, p = 2 Isotropic,

    p = 1, Anisotropic, p = 1 f0 : V → R3 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 44 / 96
  53. Examples: Colored Mesh simplification Original Colored Mesh λ = 1

    λ = 0.5 f0 : V → R3 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 45 / 96
  54. Examples: Point cloud denoising 2D Patches on 3D Point clouds

    Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 46 / 96
  55. Examples: Point Cloud denoising Initial Point cloud Noisy Spectral low-pass

    filter Nodal GTV filtering Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 47 / 96
  56. Examples: Colored Point Cloud denoising Initial Point cloud Noisy Local

    Graph Non Local Graph f0 : V → R3 4-NNG 200-NNG, Ff0 9 127039 points Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 48 / 96
  57. Examples: Mesh denoising The anisotropic GTV prior of surface normals

    is used over an 80-NN graph with Gaussian weights on vertex coordinates. Original, noisy, denoised Olivier Lézoray Graph signal processing: from images to arbitrary graphs 49 / 96
  58. Examples: Image Database denoising Initial data Noisy data 10-NNG f0

    : V → R16×16 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 50 / 96
  59. Examples: Image Database denoising. Columns: λ = 1, λ = 0.01,

    λ = 0. Row Isotropic p = 1: PSNR = 18.80 dB, 13.54 dB, 10.52 dB. Row Anisotropic p = 1: PSNR = 18.96 dB, 15.19 dB, 14.41 dB. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 51 / 96
  60. Hierarchical decomposition of graph signals ▶ The graph signal is

    decomposed into a base layer and several detail layers, each capturing a given level of detail: f = Σ_{i=0}^{l−2} f_i + d_{l−1} ▶ Each layer is obtained by f_i = GTVR(d_{i−1}), with d_{−1} = f, and d_i = d_{i−1} − f_i ▶ The sequence of scales is decreasing: λ_0 < λ_1 < · · · < λ_{l−2} ▶ The signal can be reconstructed from the hierarchical decomposition with a linear manipulation of the layers: f̂(v_k) = f0(v_k) + Σ_{i=1}^{l−1} α_i f_i(v_k) with f_{l−1} = d_{l−1} ▶ Each layer is boosted if α_i > 1. ▶ This is an extension to graph signals on arbitrary graphs of the hierarchical framework proposed by (Tadmor, Nezzar, Vese, ’04) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 52 / 96
  61. Example of decomposition Nonlocal decomposition with a 10-NNG, F^{f0}_2.

    Olivier Lézoray Graph signal processing: from images to arbitrary graphs 53 / 96
  62. Hierarchical decomposition by iterative regularization Original Image Removing layers u1

    to u3 and u6 to u9: removes acne, removes freckles Olivier Lézoray Graph signal processing: from images to arbitrary graphs 54 / 96
  63. Mesh enhancement Original Mesh Coarse Mesh Intermediate Mesh Enhanced Mesh

    Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 55 / 96
  64. High Quality Colored Mesh enhancement 553053 vertices, 1105611 faces Original

    scan Enhanced scan Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 56 / 96
  65. Interpolation of missing data on graphs Let f0 : V0

    → R be a function, with V0 ⊂ V the subset of vertices of the whole graph with known values. The interpolation consists in recovering the values of f for the vertices of V \ V0 given the values for the vertices of V0, formulated by: min_{f:V→R} R*_{w,p}(f) + λ(v_i) ∥f(v_i) − f0(v_i)∥²_2. (23) Since f0(v_i) is known only for vertices of V0, the Lagrange parameter is defined as λ : V → R: λ(v_i) = λ if v_i ∈ V0, 0 otherwise. (24) This amounts to considering Δ*_{w,p} f(v_i) = 0 on V \ V0. Isotropic and anisotropic diffusion processes can be directly used to perform the interpolation. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 57 / 96
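A sketch of the interpolation scheme: the same Gauss-Jacobi diffusion, but with the per-vertex fidelity λ(v_i) of (24); initializing the unknown vertices to 0 and the toy path-graph example are illustrative assumptions.

```python
# Sketch of graph-signal interpolation: diffusion with a per-vertex fidelity
# lambda(v_i) that is nonzero only on the known set V0 (equation (24)).
import numpy as np

def interpolate(W, f0, known_mask, lam=1.0, p=2.0, n_iter=200, eps=1e-6):
    lam_v = np.where(known_mask, lam, 0.0)        # lambda(v_i)
    f = np.where(known_mask, f0, 0.0).astype(float)
    for _ in range(n_iter):
        diff = np.abs(f[:, None] - f[None, :])
        gamma = np.where(W > 0, W ** (p / 2.0) * (diff + eps) ** (p - 2.0), 0.0)
        num = lam_v * f0 + gamma @ f
        den = lam_v + gamma.sum(axis=1)
        f = num / np.maximum(den, eps)            # guard against isolated vertices
    return f

# Example: recover a signal on a path graph from 20% known samples.
n = 100
W = np.zeros((n, n)); i = np.arange(n - 1); W[i, i + 1] = W[i + 1, i] = 1.0
truth = np.sin(np.linspace(0, 3 * np.pi, n))
mask = np.random.rand(n) < 0.2
f_rec = interpolate(W, truth * mask, mask, lam=10.0, p=2.0)
```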
  66. Examples: Image segmentation Solve Δ*_{w,p} f(v_i) = 0 on

    V \ V0. (a) 27 512 pixels (b) Original+Labels (c) t = 50 (11 seconds) (d) 639 zones (98% reduction) (e) Original+Labels (f) t = 5 (< 1 second) (g) 639 zones (98% reduction) (h) Original+Labels (i) t = 2 (< 1 second) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 58 / 96
  67. Examples: Database clustering (a) Initial data with the initial markers,

    (b) the associated 20-NN graph; last row: evolution of the classification as a function of the number of iterations n (n = 1, 5, 10). [Figure: label propagation on a point cloud for semi-supervised data classification.] Olivier Lézoray Graph signal processing: from images to arbitrary graphs 59 / 96
  68. Examples: Image colorization Gray level image Color scribbles Compute Weights

    from the gray-level image; interpolation is performed in a chrominance color space from the seeds: f^c(v_i) = [ f^s_1(v_i)/f^l(v_i), f^s_2(v_i)/f^l(v_i), f^s_3(v_i)/f^l(v_i) ]^T Olivier Lézoray Graph signal processing: from images to arbitrary graphs 60 / 96
  69. Examples: Image colorization p = 1, G1, Ff0 0 =

    f0 p = 1, G5, Ff0 2 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 61 / 96
  70. Examples: 3D Point Cloud colorization p = 1, G25, Ff0

    9 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 62 / 96
  71. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 63 / 96
  72. Introduction - Mathematical Morphology (Algebraic) Fundamental operators in Mathematical Morphology

    (MM) are dilation and erosion. Dilation δ of a function f0 : Ω ⊂ R² → R consists in replacing the function value by the maximum value within a structuring element B such that: δ_B f0(x, y) = max{ f0(x + x′, y + y′) | (x′, y′) ∈ B }. Erosion ϵ is computed by: ϵ_B f0(x, y) = min{ f0(x + x′, y + y′) | (x′, y′) ∈ B } Olivier Lézoray Graph signal processing: from images to arbitrary graphs 64 / 96
  73. Introduction - Complete Lattice ▶ MM needs an ordering relation

    within vectors: a complete lattice (T, ≤) ▶ MM is problematic for multivariate data since there is no natural ordering for vectors ▶ The framework of h-orderings can be considered for that: construct a mapping h from T to L where L is a complete lattice equipped with the conditional total ordering, h : T → L, v → h(v), with ∀(v_i, v_j) ∈ T × T : v_i ≤_h v_j ⇔ h(v_i) ≤ h(v_j). ▶ ≤_h denotes such an h-ordering; it is a dimensionality reduction operation h : R^n → R^p with p < n. ▶ Advantage: the learned lattice depends on the signal content and is more adaptive. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 65 / 96
  74. Manifold-based ordering × Problem : the projection operator h cannot

    be linear since a distortion of the space is inevitable ! Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 66 / 96
  75. Manifold-based ordering × Problem : the projection operator h cannot

    be linear since a distortion of the space is inevitable ! ✓ Solution : Consider non-linear dimensionality reduction with Laplacian Eigenmaps that corresponds to learn the manifold where the vectors live. Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 66 / 96
  76. Laplacian Eigenmaps ▶ Laplacian Eigenmaps intend to embed the data

    T in a lower dimensional space in such a way that close/similar points in T remain close in the low dimensional space: h : R^n → R^p ▶ The similarity between vectors in the original space is encoded by the graph weights. ▶ Build a k-NN neighborhood graph on T ▶ Assign weights to edges with a Gaussian kernel ▶ Find Y = {y_1, · · · , y_N} ∈ R^p that minimizes E(Y) = Σ_{i,j} ∥y_i − y_j∥²_2 w_ij = 2 Y^T L Y ▶ The result can be obtained from the eigenvectors of the Laplacian L = D − W (same as spectral clustering) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 67 / 96
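A sketch of Laplacian Eigenmaps on a k-NN weight matrix, dropping the constant eigenvector; the dense eigendecomposition is used for clarity (it is exactly the O(N³) step discussed on the next slides), and the names are illustrative.

```python
# Sketch of Laplacian Eigenmaps on a weight matrix W (e.g. a k-NN graph with
# Gaussian weights): embed each vertex with the p eigenvectors of L = D - W
# associated to the smallest nonzero eigenvalues.
import numpy as np

def laplacian_eigenmaps(W, p=2):
    W = np.asarray(W, dtype=float)
    L = np.diag(W.sum(axis=1)) - W          # combinatorial Laplacian
    vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    return vecs[:, 1:p + 1]                 # drop the constant eigenvector (eigenvalue ~ 0)

# Y = laplacian_eigenmaps(W, p=2)           # N x p embedding h(v_i) of the vertices
```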
  77. Laplacian Eigenmaps This maps a graph to a line for

    each eigenvector Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 68 / 96
  78. Manifold-based ordering × Problem : LE Non-linear dimensionality reduction directly

    on the set T of vectors is not tractable in reasonable time: the eigenvector decomposition of L is O(N³), with N the number of vertices. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 69 / 96
  79. Manifold-based ordering × Problem : LE Non-linear dimensionality reduction directly

    on the set T of vectors is not tractable in reasonable time: the eigenvector decomposition of L is O(N³), with N the number of vertices. ✓ Solution: Consider a more efficient strategy. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 69 / 96
  80. Manifold-based ordering × Problem : LE Non-linear dimensionality reduction directly

    on the set T of vectors is not tractable in reasonable time: the eigenvector decomposition of L is O(N³), with N the number of vertices. ✓ Solution: Consider a more efficient strategy. Three-Step Strategy ▶ Dictionary Learning to produce a set D from the set of initial vectors T ▶ Laplacian Eigenmaps Manifold Learning on the dictionary D to obtain a projection operator h_D ▶ Out-of-sample extension to extrapolate h_D to T and define h Olivier Lézoray Graph signal processing: from images to arbitrary graphs 69 / 96
  81. Graph signal representation Given the complete lattice (T , ≤h),

    a sorted permutation P of T is constructed: P = {v′_1, · · · , v′_m} with v′_i ≤_h v′_{i+1}, ∀i ∈ [1, (m − 1)]. From the ordering, an index signal I : Ω ⊂ Z² → [1, m] is defined as: I(p_i) = {k | v′_k = f(p_i) = v_i}. Image of 256 colors, Index Image, (T, ≤_h). The pair (I, P) provides a new graph signal representation (the index and the palette of ordered vectors). The original signal f can be directly recovered since f(p_i) = P[I(p_i)] = v_i. To process the graph signal: g(f(v_i)) = P[g(I(v_i))] with g an operation. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 70 / 96
  82. Algebraic MM for graphs signals The erosion and the dilation

    of a signal f on a graph at a node v_i ∈ G with a structuring element B_k ⊂ G are: ϵ_{B_k}(f)(v_i) = {P[∧ I(v_j)], v_j ∈ B_k(v_i)} and δ_{B_k}(f)(v_i) = {P[∨ I(v_j)], v_j ∈ B_k(v_i)}. A structuring element B_k(v_i) contains the k-hop nodes of v_i: B_k(v_i) = {v_j ∼ v_i} ∪ {v_i} if k = 1, and B_{k−1}(v_i) ∪ ( ∪_{∀v_l∈B_{k−1}(v_i)} B_1(v_l) ) if k ≥ 2 Olivier Lézoray Graph signal processing: from images to arbitrary graphs 71 / 96
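A sketch of the index/palette erosion and dilation above for k = 1 (1-hop structuring element read from the adjacency matrix); larger k can be obtained by composing the operator, and all names are illustrative.

```python
# Sketch of the algebraic graph erosion / dilation: each vertex takes the min
# (erosion) or max (dilation) of the index signal I over B_1(v_i), and the
# result is mapped back through the ordered palette P.
import numpy as np

def morpho(W, I, P, op="dilate"):
    """I: index signal (ints in [0, m-1]), P: palette, array of m ordered vectors."""
    n = I.shape[0]
    out_idx = np.empty(n, dtype=int)
    for i in range(n):
        ball = np.flatnonzero(W[i] > 0).tolist() + [i]   # B_1(v_i) = neighbors + itself
        vals = I[ball]
        out_idx[i] = vals.max() if op == "dilate" else vals.min()
    return P[out_idx], out_idx

# Opening = dilation of the erosion, applied to the index signal of the eroded result:
# eroded, I_e = morpho(W, I, P, op="erode")
# opened, _   = morpho(W, I_e, P, op="dilate")
```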
  83. Processing examples Original image f, ϵ_{B_k}(f), δ_{B_k}(f),

    γ_{B_k}(f) = δ_{B_k}(ϵ_{B_k}(f)), ϕ_{B_k}(f) = ϵ_{B_k}(δ_{B_k}(f)) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 73 / 96
  84. Processing examples Original colored mesh f ϵBk (f) δBk (f)

    γ_{B_k}(f) = δ_{B_k}(ϵ_{B_k}(f)), ϕ_{B_k}(f) = ϵ_{B_k}(δ_{B_k}(f)) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 73 / 96
  85. Image and Mesh abstraction Performed with an OCCO filter. Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 74 / 96
  86. Image and Mesh abstraction Performed with an OCCO filter. Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 74 / 96
  87. Morphological Tone Mapping Durand & Dorsey MM Tone Mapping Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 75 / 96
  88. Graph signal multi-layer decomposition We propose the following multi-layer morphological

    decomposition of a graph signal into l layers. The graph signal is decomposed into a base layer and several detail layers, each capturing a given scale of details. d_{−1} = f, i = 0; while i < l do: compute the graph signal representation at level i − 1: d_{i−1} = (I_{i−1}, P_{i−1}); morphological filtering of d_{i−1}: f_i = MF_{B_{l−i}}(d_{i−1}); compute the residual (detail layer): d_i = d_{i−1} − f_i; proceed to the next layer: i = i + 1; end while. A generic sketch of this loop is given below. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 76 / 96
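A generic sketch of the decomposition loop above: MF stands for any morphological filter of a graph signal (e.g. an OCCO filter built from the previous erosion/dilation sketch), and the per-level (I_i, P_i) representation is assumed to be handled inside MF; names are illustrative.

```python
# Generic sketch of the multi-layer morphological decomposition loop above.
def multilayer_decomposition(f, MF, sizes):
    """Return the layers [f_0, ..., f_{l-2}] and the final residual d_{l-1}.

    f:     graph signal (e.g. a NumPy array of vertex values)
    MF:    morphological filter, MF(signal, k) -> filtered signal
    sizes: decreasing structuring-element sizes B_{l-i}
    """
    layers, d = [], f
    for k in sizes:               # decreasing structuring-element sizes
        f_i = MF(d, k)            # morphological filtering of the current residual
        layers.append(f_i)
        d = d - f_i               # detail layer passed to the next level
    return layers, d

# Reconstruction check: f == sum(layers) + residual (up to numerical precision).
```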
  89. Graph signal multi-layer decomposition ▶ The graph signal can then

    be represented by f = Σ_{i=0}^{l−2} f_i + d_{l−1} ▶ To extract the successive layers in a coherent manner, the sequence of scales should be decreasing ▶ ➲ B_{l−i} is a sequence of structuring elements of decreasing sizes with i ∈ [0, l − 1] ▶ Each detail layer d_i is computed on a different set of vectors than the previous layer d_{i−1} ▶ ➲ The graph signal representation (I_i, P_i) is computed for the successive layers ▶ The considered morphological filter should be suitable for a multi-scale analysis ▶ ➲ Use of the OCCO filter: OCCO_{B_k}(f) = ( γ_{B_k}(ϕ_{B_k}(f)) + ϕ_{B_k}(γ_{B_k}(f)) ) / 2 Olivier Lézoray Graph signal processing: from images to arbitrary graphs 77 / 96
  90. Decomposition examples f f0 f1 f2 f3 d3 Olivier L´

    ezoray Graph signal processing : from images to arbitrary graphs 78 / 96
  91. Decomposition examples f f0 f1 f2 f3 d3 Olivier L´

    ezoray Graph signal processing : from images to arbitrary graphs 78 / 96
  92. Graph signal enhancement ▶ The graph signal can be enhanced

    by manipulating the different layers with specific coefficients and adding the modified layers altogether: f̂(v_k) = f0(v_k) + M(v_k) · Σ_{i=1}^{l−1} S_i(f_i(v_k)) with f_{l−1} = d_{l−1} (25) ▶ Each layer is manipulated by a nonlinear function S_i(x) = 1 / (1 + exp(−α_i x)) for detail enhancement and tone manipulation. ▶ The parameter α_i of the sigmoid is automatically determined and decreases as i increases: α_i = α / (i + 1) ▶ A structure mask M prevents boosting noise and artifacts while enhancing the main structures. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 79 / 96
  93. Image sharpening Original (13.69) LLF (25.09) MF with linear Our

    MF with mask coefficients (1, 1.25, 2.5) (α = 30) (24.21) and without mask (20.19) Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 81 / 96
  94. Mesh sharpening Original Unsharp Masking Our MF with mask (α

    = 20) (24.33) (30.69) (32.69) Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 82 / 96
  95. Mesh sharpening Original Unsharp Masking Our MF with mask (α

    = 20) (12.49) (15.30) (17.52) Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 82 / 96
  96. 1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs

    4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier Lézoray Graph signal processing: from images to arbitrary graphs 83 / 96
  97. Mathematical Morphology: Continuous formulation Continuous scale morphology defines flat erosion

    and dilation of a function f0 : Ω ⊂ R² → R by structuring sets B = {z ∈ R² : ∥z∥_p ≤ 1} with a general Partial Differential Equation that describes an evolution: ∂f/∂t = ∂_t f = ±∥∇f∥_p. The solution f(x, y, t) at time t > 0 provides the dilation (with the plus sign) or the erosion (with the minus sign) within a structuring element of size n∆t: δ(f) : ∂_t f = +∥∇f∥_p and ϵ(f) : ∂_t f = −∥∇f∥_p. Dilation of a single point with a size of 100∆t, ∆t = 0.25 and p = 1, p = 2, and p = ∞. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 84 / 96
  98. Morphological difference operators on graphs We introduce morphological (or upwind)

    difference operators (weighted directional operators): (d⁺_w f)(v_i, v_j) = w(v_i, v_j)^{1/2} ( max(f(v_i), f(v_j)) − f(v_i) ) and (d⁻_w f)(v_i, v_j) = w(v_i, v_j)^{1/2} ( f(v_i) − min(f(v_i), f(v_j)) ), (26) with the following properties (always positive): (d⁺_w f)(v_i, v_j) = max(0, (d_w f)(v_i, v_j)) and (d⁻_w f)(v_i, v_j) = −min(0, (d_w f)(v_i, v_j)), with the associated internal and external gradients: (∇^±_w f)(v_i) = [(d^±_w f)(v_i, v_j) : ∀v_j ∈ V]^T. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 85 / 96
  99. MM: Transcription on graphs Transcription on graphs Given G =

    (V, E, w), f0 : V → R, f(·, 0) = f0, ∀v_i ∈ V, we define: δ : ∂_t f(v_i, t) = +∥(∇⁺_w f)(v_i, t)∥_p and ϵ : ∂_t f(v_i, t) = −∥(∇⁻_w f)(v_i, t)∥_p Olivier Lézoray Graph signal processing: from images to arbitrary graphs 86 / 96
  100. MM: Transcription on graphs Transcription on graphs Given G =

    (V, E, w), f0 : V → R, f(·, 0) = f0, ∀v_i ∈ V, we define: δ : ∂_t f(v_i, t) = +∥(∇⁺_w f)(v_i, t)∥_p and ϵ : ∂_t f(v_i, t) = −∥(∇⁻_w f)(v_i, t)∥_p A ⊂ V Olivier Lézoray Graph signal processing: from images to arbitrary graphs 86 / 96
  101. MM: Transcription on graphs Transcription on graphs Given G =

    (V, E, w), f0 : V → R, f(·, 0) = f0, ∀v_i ∈ V, we define: δ : ∂_t f(v_i, t) = +∥(∇⁺_w f)(v_i, t)∥_p and ϵ : ∂_t f(v_i, t) = −∥(∇⁻_w f)(v_i, t)∥_p A ⊂ V, ∂⁺A = {v_i ∉ A : ∃v_j ∈ A with e_ij ∈ E}, ∂⁻A = {v_i ∈ A : ∃v_j ∉ A with e_ij ∈ E} Olivier Lézoray Graph signal processing: from images to arbitrary graphs 86 / 96
  102. MM: Transcription on graphs Transcription on graphs Given G =

    (V, E, w), f0 : V → R, f(·, 0) = f0, ∀v_i ∈ V, we define: δ : ∂_t f(v_i, t) = +∥(∇⁺_w f)(v_i, t)∥_p and ϵ : ∂_t f(v_i, t) = −∥(∇⁻_w f)(v_i, t)∥_p A ⊂ V, ∂⁺A = {v_i ∉ A : ∃v_j ∈ A with e_ij ∈ E}, ∂⁻A = {v_i ∈ A : ∃v_j ∉ A with e_ij ∈ E} Dilation: adding the vertices of ∂⁺A to A Olivier Lézoray Graph signal processing: from images to arbitrary graphs 86 / 96
  103. MM: Transcription on graphs Transcription on graphs Given G =

    (V, E, w), f0 : V → R, f(·, 0) = f0, ∀v_i ∈ V, we define: δ : ∂_t f(v_i, t) = +∥(∇⁺_w f)(v_i, t)∥_p and ϵ : ∂_t f(v_i, t) = −∥(∇⁻_w f)(v_i, t)∥_p A ⊂ V, ∂⁺A = {v_i ∉ A : ∃v_j ∈ A with e_ij ∈ E}, ∂⁻A = {v_i ∈ A : ∃v_j ∉ A with e_ij ∈ E} Dilation: adding the vertices of ∂⁺A to A. Erosion: removing the vertices of ∂⁻A from A Olivier Lézoray Graph signal processing: from images to arbitrary graphs 86 / 96
  104. MM: Transcription on graphs PDE MM: δ : ∂_t f(x, t)

    = +∥∇f(x, t)∥_p, ϵ : ∂_t f(x, t) = −∥∇f(x, t)∥_p. Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀v_i ∈ V, we define: δ : ∂_t f(v_i, t) = +∥(∇⁺_w f)(v_i, t)∥_p and ϵ : ∂_t f(v_i, t) = −∥(∇⁻_w f)(v_i, t)∥_p. Since we can prove that for any level f_l of f, we have: ∥(∇_w f_l)(v_i)∥_p = ∥(∇⁺_w f_l)(v_i)∥_p if v_i ∈ ∂⁺A_l, and ∥(∇_w f_l)(v_i)∥_p = ∥(∇⁻_w f_l)(v_i)∥_p if v_i ∈ ∂⁻A_l. (27) Iterative algorithms with discretization in time: f0 : V → R, f^{(n)}(v_i) ≈ f(v_i, n∆t), f^{(n+1)}(v_i) = f^{(n)}(v_i) ± ∆t ∥(∇^±_w f^{(n)})(v_i)∥_p, f^{(0)}(v_i) = f0(v_i) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 87 / 96
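A sketch of the time-discretized graph dilation/erosion written above; the dense matrices, default parameter values, and the function name are illustrative.

```python
# Sketch of the time-discretized dilation / erosion:
# f^{(n+1)}(v_i) = f^{(n)}(v_i) +/- dt * ||(grad^{+/-}_w f^{(n)})(v_i)||_p.
import numpy as np

def morpho_pde(W, f0, n_iter=10, dt=0.25, p=2.0, dilate=True):
    f = f0.astype(float).copy()
    sqrt_w = np.sqrt(W)
    for _ in range(n_iter):
        dwf = sqrt_w * (f[None, :] - f[:, None])          # (d_w f)(v_i, v_j)
        if dilate:
            g = np.maximum(dwf, 0.0)                      # external gradient d^+
        else:
            g = -np.minimum(dwf, 0.0)                     # internal gradient d^-
        norm = (g ** p).sum(axis=1) ** (1.0 / p)          # ||grad^{+/-} f||_p
        f = f + dt * norm if dilate else f - dt * norm
    return f
```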
  105. Example: adaptive image MM processing Dilation Closing Algebraic PDE Dilation

    Closing Weighted Non local patch Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 88 / 96
  106. Example: image database MM processing f0 : V → R^256

    Dilation Erosion Opening k-NNG Initial Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 89 / 96
  107. Front evolution on a graph ▶ An evolving front Γ

    evolving on a graph G is defined as a subset Ω0 ⊂ V ▶ It is implicitly represented at time t by a level set function f_t : V → {−1, +1} defined by f_t = χ_{Ω_t} − χ_{Ω̄_t} ▶ The evolution of the front Γ is influenced by a speed function F : V → R ▶ Depending on the sign of F it adds or removes nodes from Ω_t ▶ This is described by δf(v_i, t)/δt = F(v_i) ∥(∇_w f)(v_i, t)∥^p_p (28) ▶ This evolution equation can then be expressed as a combination of two morphological erosion and dilation processes as: δf(v_i, t)/δt = max(F(v_i, t), 0) ∥(∇⁺_w f)(v_i)∥^p_p + min(F(v_i, t), 0) ∥(∇⁻_w f)(v_i)∥^p_p (29) Olivier Lézoray Graph signal processing: from images to arbitrary graphs 90 / 96
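A sketch of the front evolution (29); splitting the speed F into its positive (dilation) and negative (erosion) parts follows the equation above, while clipping f to [−1, 1] is an implementation convenience not stated on the slide. Names are illustrative.

```python
# Sketch of the level-set front evolution (29) on a graph: F > 0 drives a
# dilation (external gradient), F < 0 an erosion (internal gradient).
import numpy as np

def evolve_front(W, f, F, n_iter=50, dt=0.5, p=2.0):
    f = f.astype(float).copy()                 # level-set function (+1 inside, -1 outside)
    sqrt_w = np.sqrt(W)
    for _ in range(n_iter):
        dwf = sqrt_w * (f[None, :] - f[:, None])
        grad_plus = (np.maximum(dwf, 0.0) ** p).sum(axis=1)      # ||grad^+ f||_p^p
        grad_minus = ((-np.minimum(dwf, 0.0)) ** p).sum(axis=1)  # ||grad^- f||_p^p
        f = f + dt * (np.maximum(F, 0.0) * grad_plus + np.minimum(F, 0.0) * grad_minus)
        f = np.clip(f, -1.0, 1.0)              # keep the level-set function bounded
    return f
```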
  108. Graph Signal Active contours Proposed adaptation on graphs of geometric

    active contours and active contours without edges. Our adaptation ▶ Given f a level set function f_t : V → {−1, +1} for the inside/outside of the evolving propagating front, the solution is obtained by f_{t+1}(v_i) = f_t(v_i) + ∆t · δf(v_i, t)/δt ▶ We express front propagation on graphs as δf(v_i, t)/δt = F(v_i, t) ∥(∇_w f)(v_i, t)∥^p_p with F(v_i, t) a speed function ▶ We propose a front propagation function that solves the considered active contours with discrete calculus: F(v_i, t) = ν g(v_i) + µ g(v_i)(κ_w f)(v_i, t) − λ1 d_{v_i} d²(F^{f0}_ρ(v_i), F^{c1}_ρ) + λ2 d_{v_i} d²(F^{f0}_ρ(v_i), F^{c2}_ρ) ▶ We consider local patches F^{f0}_ρ(v_j) on a ρ-hop subgraph to represent the regions (instead of a vertex-based signal average) ▶ The potential function g(v_i) differentiates the most salient structures of a graph using patch comparison, as in the MM enhancement Olivier Lézoray Graph signal processing: from images to arbitrary graphs 92 / 96
  109. Results Grid Graph Signals (a) (b) (c) (d) (e) (f)

    Figure: From left to right: (a) Original image, (b) Checkerboard initialization, (c) GSAC; g(vi ) = 1, ρ = 0, (d) g(vi ), (e) GSAC; g(vi ), ρ = 0, (f) GSAC; g(vi ), ρ = 1. ➠ Demo Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 93 / 96
  110. Results 3D Colored Meshes (a) (b) (c) (d) (e) (f)

    (g) (h) Figure: From top to bottom, left to right : (a) Original mesh, (b) g(vi) (inverted) (c) Checkerboard initialization, (d) GSAC; g(vi), ρ = 0, (e) GSAC; g(vi), ρ = 2, (f) manual initialization (g) extracted region with GSAC; g(vi), ρ = 2, (h) re-colorisation of the extracted region. Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 94 / 96
  111. Results Image Dataset Graph Figure: Classification of a subset (digits

    0 and 3) of the MNIST dataset. The colors around each image show the class it is assigned to. The top row shows the initialization and the bottom row the final classification. Table: Classification scores (%) for the 0 digit versus each other digit of the MNIST database: digit 1: 98.8, digit 2: 95.8, digit 3: 97, digit 4: 97.05, digit 5: 90.7, digit 6: 95.55, digit 7: 96.75, digit 8: 95.95, digit 9: 96.25. Olivier Lézoray Graph signal processing: from images to arbitrary graphs 95 / 96
  112. The end [thank you] Any Questions ? [email protected] https://lezoray.users.greyc.fr Olivier

    L´ ezoray Graph signal processing : from images to arbitrary graphs 96 / 96