Slide 1

Slide 1 text

GRAPH SIGNAL PROCESSING: FROM IMAGES TO ARBITRARY GRAPHS
Olivier Lézoray, Normandie Univ, UNICAEN, ENSICAEN, CNRS, GREYC, Caen, FRANCE
[email protected], https://lezoray.users.greyc.fr

Slide 2

Slide 2 text

1. Introduction
2. Image Filtering
3. Discrete Calculus on graphs
4. Graph signal p-Laplacian regularization
5. Mathematical Morphology
6. From continuous MM to Active contours on graphs
Olivier Lézoray, Graph signal processing: from images to arbitrary graphs, 2 / 96

Slide 4

Slide 4 text

The data deluge - Graphs everywhere
With the data deluge, graphs are everywhere: we are witnessing the rise of graphs in Big Data. Graphs occur as a very natural way of representing arbitrary data by modeling the neighborhood properties between these data.

Slide 5

Slide 5 text

The data deluge - Graphs everywhere: Images (grid graphs), image partitions (superpixel graphs)

Slide 6

Slide 6 text

The data deluge - Graphs everywhere: Meshes, 3D point clouds

Slide 7

Slide 7 text

The data deluge - Graphs everywhere: Social networks (Facebook, LinkedIn)

Slide 8

Slide 8 text

The data deluge - Graphs everywhere: Biological networks, brain graphs

Slide 9

Slide 9 text

The data deluge - Graphs everywhere: Mobility networks (NYC Taxi, Velo'V Lyon)

Slide 10

Slide 10 text

Graph signals
▶ We consider discrete domains Ω (2D: images, 3D: meshes, nD: manifolds) represented by graphs G = (V, E) carrying multivariate signals f : G → R^n.
▶ Graphs can be directed or undirected, and carry weights on their edges. Their topology is arbitrary.
Examples: f1 : G1 → R^(n=3), f2 : G2 → R^(n=3), f3 : G3 → R^(n=21×21), f4 : G4 → R^(n=∗).

Slide 11

Slide 11 text

Scientific issues
Usual ways of processing data from graphs:
▶ Graph theory ➲ spectral analysis (for data processing: proximity graphs built from the data)
▶ Variational and morphological methods (for signal and image processing: Euclidean graphs imposed by the domain)
▶ Emergence of a new research field called Graph Signal Processing (GSP)
▶ Objective: development of algorithms to process data that reside on the vertices (or edges) of a graph: signals on graphs
▶ Problem: how to process general (non-Euclidean) graphs with signal processing techniques?
▶ Many recent works aim at extending signal and image processing tools to graph-based signal processing: the same algorithm for any kind of graph signal!

Slide 12

Slide 12 text

Graph signal processing: a very active field
References:
▶ D. I. Shuman, S. K. Narang, P. Frossard, A. Ortega, P. Vandergheynst, The Emerging Field of Signal Processing on Graphs: Extending High-Dimensional Data Analysis to Networks and Other Irregular Domains, IEEE Signal Processing Magazine, 30(3): 83-98, 2013.
▶ A. Ortega, P. Frossard, J. Kovačević, J. M. F. Moura, P. Vandergheynst, Graph Signal Processing: Overview, Challenges, and Applications, Proceedings of the IEEE, 106(5): 808-828, 2018.
▶ W. Hu, J. Pang, X. Liu, D. Tian, C.-W. Lin, A. Vetro, Graph Signal Processing for Geometric Data and Beyond: Theory and Applications, IEEE Transactions on Multimedia, 2021.

Slide 13

Slide 13 text

What is it for? Problems: from low to high level
▶ Compression ➲ wavelets for signals on graphs
▶ Completion ➲ inpainting of signals on graphs
▶ Denoising ➲ filtering of signals on graphs
▶ Manipulation ➲ enhancement of signals on graphs
▶ Segmentation ➲ partitioning of signals on graphs
▶ Classification ➲ recognizing graph signal types

Slide 14

Slide 14 text

1. Introduction
2. Image Filtering
3. Discrete Calculus on graphs
4. Graph signal p-Laplacian regularization
5. Mathematical Morphology
6. From continuous MM to Active contours on graphs

Slide 15

Slide 15 text

Image filtering
The basic ingredient of image filtering: convolution. With a (2v+1) × (2v+1) filter mask centered on pixel (i, j) (neighbors (i−1, j−1), (i, j−1), …, (i+1, j+1) for v = 1):
I′(i, j) = ∑_{di=−v..v} ∑_{dj=−v..v} I(i + di, j + dj) · filter(di, dj)
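The double sum above can be sketched in a few lines of NumPy. This is a minimal illustration, not the slides' implementation: the function name `filter2d` and the edge-replication padding at the borders are my own choices.

```python
import numpy as np

def filter2d(I, K):
    """Filter image I with a (2v+1) x (2v+1) mask K of fixed weights."""
    v = K.shape[0] // 2
    Ip = np.pad(I, v, mode="edge")          # replicate border pixels
    out = np.zeros_like(I, dtype=float)
    for di in range(-v, v + 1):
        for dj in range(-v, v + 1):
            # accumulate K(di, dj) * I(i + di, j + dj) for every pixel at once
            out += K[di + v, dj + v] * Ip[v + di : v + di + I.shape[0],
                                          v + dj : v + dj + I.shape[1]]
    return out

# 3x3 box filter: every output pixel is the mean of its 3x3 neighborhood
I = np.arange(16, dtype=float).reshape(4, 4)
box = np.full((3, 3), 1.0 / 9.0)
J = filter2d(I, box)
```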

Slide 16

Slide 16 text

Same weights everywhere
Apply a filter with fixed weights.

Slide 17

Slide 17 text

Data-adaptive weights
The weights depend on the image content.

Slide 18

Slide 18 text

How to define these weights?
Classical Gaussian filters.

Slide 19

Slide 19 text

How to define these weights?
Data-dependent weights.

Slide 20

Slide 20 text

Bilateral filtering

Slide 21

Slide 21 text

Examples of data-dependent filters
The bilateral filter and non-local means.

Slide 22

Slide 22 text

Local or non-local filters

Slide 23

Slide 23 text

The image as a graph
▶ The matrix W is a representation of the image as a graph
▶ The construction of the graph implies local or non-local filtering

Slide 26

Slide 26 text

Image filtering
▶ Filtering a noisy image: the clean image can be recovered by kernel regression (Takeda, Farsiu, Milanfar, '07):
ẑ_i = arg min_{z_i} ∑_{j=1..n} (y_j − z_i)² K_ij
▶ Depending on the kernel K, a local or non-local filtering is performed
▶ The solution is obtained by a Nadaraya-Watson kernel regression:
ẑ_i = (∑_j K_ij y_j) / (∑_j K_ij) = w_iᵀ y
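The Nadaraya-Watson estimate is just a row-normalized kernel average; a minimal sketch (the Gaussian kernel on a 1-D grid and the bandwidth value are illustrative assumptions, not from the slides):

```python
import numpy as np

def nadaraya_watson(y, K):
    """z_hat_i = (sum_j K_ij y_j) / (sum_j K_ij): weighted average of the samples."""
    return (K @ y) / K.sum(axis=1)

# Gaussian kernel on a 1-D grid: each estimate averages nearby samples
x = np.linspace(0, 1, 20)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)
y = np.ones(20)
z = nadaraya_watson(y, K)
# a constant signal is reproduced exactly (the normalized weights sum to 1)
assert np.allclose(z, 1.0)
```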

Slide 27

Slide 27 text

Link with the Graph Laplacian
▶ The graph Laplacian at a pixel z_i is defined as: L(z_i) = ∑_j K_ij (z_i − z_j). This measures the image smoothness at z_i
▶ It can be rewritten as: L(z_i) = z_i ∑_j K_ij − ∑_j K_ij z_j
▶ Enforcing smoothness, L(z_i) = 0 → z_i = (∑_j K_ij z_j) / (∑_j K_ij) = w_iᵀ z
P. Milanfar, "A Tour of Modern Image Filtering", 2013
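The two expressions of L above can be checked numerically: per-vertex sums versus the matrix form L = D − K. A small sketch on a random symmetric kernel (the 5-node example is mine):

```python
import numpy as np

# random symmetric nonnegative kernel on 5 nodes, no self-loops
rng = np.random.default_rng(0)
K = rng.random((5, 5)); K = (K + K.T) / 2; np.fill_diagonal(K, 0)

z = rng.random(5)
# Laplacian at each node: L(z)_i = z_i * sum_j K_ij - sum_j K_ij z_j
Lz = z * K.sum(axis=1) - K @ z
# matrix form L = D - K gives the same values
L = np.diag(K.sum(axis=1)) - K
assert np.allclose(Lz, L @ z)
# a constant signal is maximally smooth: L applied to it is zero
assert np.allclose(L @ np.ones(5), 0)
```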

Slide 28

Slide 28 text

Image Processing & Graphs
▶ Image filtering can be seen as a data-dependent kernel regression on a graph
▶ The topology of the graph implies local or non-local processing
▶ The weights of the graph imply a data-dependent processing
▶ However, images are very specific data organized in a grid on a Euclidean domain
▶ Can we generalize this to signals defined on non-Euclidean domains?
▶ Can we also generalize other image processing tasks (segmentation, interpolation) to signals on arbitrary graphs?

Slide 29

Slide 29 text

1. Introduction
2. Image Filtering
3. Discrete Calculus on graphs
4. Graph signal p-Laplacian regularization
5. Mathematical Morphology
6. From continuous MM to Active contours on graphs

Slide 33

Slide 33 text

Weighted graphs: basics
▶ A weighted graph G = (V, E, w) consists of a finite set V = {v1, …, vN} of N vertices
▶ and a finite set E = {e1, …, eN′} ⊂ V × V of N′ weighted edges.
▶ eij = (vi, vj) is the edge of E that connects vertices vi and vj of V. Its weight, denoted wij = w(vi, vj), represents the similarity between its vertices.
▶ Similarities are usually computed using a positive symmetric function w : V × V → R⁺ satisfying w(vi, vj) = 0 if (vi, vj) ∉ E.
▶ The notation vi ∼ vj denotes two adjacent vertices.

Slide 34

Slide 34 text

Space of functions on graphs
▶ H(V) and H(E) are the Hilbert spaces of graph signals: real-valued functions defined on the vertices or the edges of a graph G.
▶ A function f : V → R of H(V) assigns a real value x_i = f(vi) to each vi ∈ V.
▶ By analogy with functional analysis on continuous spaces, the integral of a function f ∈ H(V) over the set of vertices V is defined as ∫_V f = ∑_{vi∈V} f(vi)
▶ Both spaces H(V) and H(E) are endowed with the usual inner products:
⟨f, h⟩_{H(V)} = ∑_{vi∈V} f(vi) h(vi), where f, h : V → R
⟨F, H⟩_{H(E)} = ∑_{vi∈V} ∑_{vj∼vi} F(vi, vj) H(vi, vj), where F, H : E → R

Slide 35

Slide 35 text

Difference operators on weighted graphs
➲ Discrete analogue on graphs of classical continuous differential geometry.
The difference operator of f, dw : H(V) → H(E), is defined on an edge eij = (vi, vj) ∈ E by:
(dw f)(eij) = (dw f)(vi, vj) = w(vi, vj)^(1/2) (f(vj) − f(vi)). (1)
The adjoint of the difference operator, d*w : H(E) → H(V), is the linear operator defined by ⟨dw f, H⟩_{H(E)} = ⟨f, d*w H⟩_{H(V)} and expressed by:
(d*w H)(vi) = −div_w(H)(vi) = ∑_{vj∼vi} w(vi, vj)^(1/2) (H(vj, vi) − H(vi, vj)). (2)
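The adjoint identity ⟨dw f, H⟩ = ⟨f, d*w H⟩ can be verified numerically on a dense random graph; a minimal sketch where edge functions are stored as full matrices (my own representation choice, with all vertex pairs treated as edges):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
sw = np.sqrt(W)                  # w(vi, vj)^(1/2)

f = rng.random(n)                # a function on vertices
H = rng.random((n, n))           # a function on edges, H[i, j] = H(vi, vj)

# (dw f)(vi, vj) = sqrt(w_ij) (f(vj) - f(vi))
df = sw * (f[None, :] - f[:, None])

# (d*w H)(vi) = sum_j sqrt(w_ij) (H(vj, vi) - H(vi, vj))
dstarH = (sw * (H.T - H)).sum(axis=1)

# adjoint identity <dw f, H>_{H(E)} = <f, d*w H>_{H(V)}
assert np.isclose((df * H).sum(), f @ dstarH)
```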

Slide 36

Slide 36 text

Difference operators on weighted graphs
The directional derivative (or edge derivative) of f at a vertex vi ∈ V, along an edge eij = (vi, vj), is defined as:
∂f/∂eij |_vi = ∂_vj f(vi) = (dw f)(vi, vj) = w(vi, vj)^(1/2) (f(vj) − f(vi))
Weighted finite differences correspond to forward differences on grid-graph images with w(vi, vj) = 1/h², where h is the discretization step:
∂f/∂x (vi) = (f(vi + h) − f(vi)) / h

Slide 37

Slide 37 text

Weighted gradient operator
The weighted gradient operator of a function f ∈ H(V), at a vertex vi ∈ V, is the vector operator defined by:
(∇w f)(vi) = [(dw f)(vi, vj) : vj ∈ V]ᵀ. (3)
➲ The gradient considers all vertices vj ∈ V and not only vj ∼ vi.
The Lp norm of this vector represents the local variation of the function f at a vertex of the graph (it is a semi-norm for p ≥ 1):
∥(∇w f)(vi)∥_p = [ ∑_{vj∼vi} wij^(p/2) |f(vj) − f(vi)|^p ]^(1/p). (4)

Slide 38

Slide 38 text

Isotropic p-Laplacian
The weighted isotropic p-Laplace operator of a function f ∈ H(V), denoted ∆ⁱ_{w,p} : H(V) → H(V), is defined by:
(∆ⁱ_{w,p} f)(vi) = ½ d*w( ∥(∇w f)(vi)∥₂^(p−2) (dw f)(vi, vj) ). (5)
The isotropic p-Laplace operator of f ∈ H(V), at a vertex vi ∈ V, can be computed by:
(∆ⁱ_{w,p} f)(vi) = ½ ∑_{vj∼vi} (γⁱ_{w,p} f)(vi, vj) (f(vi) − f(vj)), (6)
with (γⁱ_{w,p} f)(vi, vj) = wij ( ∥(∇w f)(vj)∥₂^(p−2) + ∥(∇w f)(vi)∥₂^(p−2) ). (7)
The isotropic p-Laplace operator is nonlinear, except for p = 2 (where it corresponds to the combinatorial Laplacian). For p = 1, it corresponds to the weighted curvature of the function f on the graph.
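Equations (4), (6) and (7) translate directly to NumPy. A minimal sketch on a dense random graph (the small `eps` guarding against zero gradient norms when p < 2, and the function names, are my own additions), checking that for p = 2 the operator reduces to the combinatorial Laplacian L = D − W:

```python
import numpy as np

def local_variation(f, W, p):
    """||(grad_w f)(vi)||_p for every vertex (eq. 4)."""
    diff = np.abs(f[None, :] - f[:, None])
    return ((W ** (p / 2)) * diff ** p).sum(axis=1) ** (1 / p)

def isotropic_p_laplacian(f, W, p, eps=1e-10):
    """(Delta^i_{w,p} f)(vi) from eqs. (6)-(7); eps avoids division by zero for p < 2."""
    nv = local_variation(f, W, 2) + eps          # L2 gradient norm at each vertex
    gamma = W * (nv[None, :] ** (p - 2) + nv[:, None] ** (p - 2))
    return 0.5 * (gamma * (f[:, None] - f[None, :])).sum(axis=1)

rng = np.random.default_rng(2)
W = rng.random((5, 5)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
f = rng.random(5)
L = np.diag(W.sum(axis=1)) - W
# for p = 2, gamma = 2 w_ij and the operator is the combinatorial Laplacian
assert np.allclose(isotropic_p_laplacian(f, W, 2), L @ f)
```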

Slide 39

Slide 39 text

Anisotropic p-Laplacian
The weighted anisotropic p-Laplace operator of a function f ∈ H(V), denoted ∆ᵃ_{w,p} : H(V) → H(V), is defined by:
(∆ᵃ_{w,p} f)(vi) = ½ d*w( |(dw f)(vi, vj)|^(p−2) (dw f)(vi, vj) ). (8)
The anisotropic p-Laplace operator of f ∈ H(V), at a vertex vi ∈ V, can be computed by:
(∆ᵃ_{w,p} f)(vi) = ∑_{vj∼vi} (γᵃ_{w,p} f)(vi, vj) (f(vi) − f(vj)), (9)
with (γᵃ_{w,p} f)(vi, vj) = wij^(p/2) |f(vi) − f(vj)|^(p−2). (10)

Slide 40

Slide 40 text

Constructing graphs
Any discrete domain can be modeled by a weighted graph where each data point is represented by a vertex vi ∈ V.
Unorganized data: an unorganized set of points V ⊂ R^n can be seen as a function f0 : V → R^m. The set of edges is defined by modeling the neighborhood of each vertex based on similarity relationships between feature vectors. Typical graphs: k-nearest-neighbor graphs and ϵ-neighborhood graphs.
Organized data: typical cases of organized data are signals and gray-scale or color images (in 2D or 3D). The set of edges is defined by spatial relationships. Such data can be seen as functions f0 : V ⊂ Z^n → R^m. Typical graphs: image grid graphs, region graphs, 3D meshes.

Slide 41

Slide 41 text

Weighting graphs
For an initial function f0 : V → R^m, similarity relationships between data can be incorporated within edge weights according to a measure of similarity g : E → [0, 1] with w(eij) = g(eij), ∀eij ∈ E.
Each vertex vi is associated with a feature vector F^{f0}_τ : V → R^(m×q), where q is the size of this vector:
F^{f0}_τ(vi) = [ f0(vj) : vj ∈ N_τ(vi) ∪ {vi} ]ᵀ (11)
with N_τ(vi) = { vj ∈ V \ {vi} : µ(vi, vj) ≤ τ }.
For an edge eij and a distance measure ρ : R^(m×q) × R^(m×q) → R associated to F^{f0}_τ, we can have:
g1(eij) = 1 (unweighted case),
g2(eij) = exp( −ρ(F^{f0}_τ(vi), F^{f0}_τ(vj))² / σ² ) with σ > 0,
g3(eij) = 1 / ( 1 + ρ(F^{f0}_τ(vi), F^{f0}_τ(vj)) ). (12)
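The Gaussian weight g2 can be sketched in a few lines; assuming a Euclidean distance ρ between feature vectors (the slides leave ρ generic):

```python
import numpy as np

def gaussian_weight(Fi, Fj, sigma=1.0):
    """g2: exp(-rho(F_i, F_j)^2 / sigma^2), rho taken as the Euclidean distance."""
    rho = np.linalg.norm(Fi - Fj)
    return np.exp(-rho ** 2 / sigma ** 2)

# identical feature vectors give weight 1; far-apart ones give weight near 0
F = np.array([0.2, 0.4, 0.6])
assert np.isclose(gaussian_weight(F, F), 1.0)
assert gaussian_weight(F, F + 10.0) < 1e-6
```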

Slide 46

Slide 46 text

Graph topology
Digital image: 8-neighborhood (3 × 3), 24-neighborhood (5 × 5). A value is associated to each vertex. A patch is the vector of values in a given neighborhood of a vertex.

Slide 47

Slide 47 text

With graphs
▶ Nonlocal behavior is directly expressed by the graph topology.
▶ Patches are used to measure similarity between vertices.
Consequence:
▶ Nonlocal processing of images becomes local processing on similarity graphs.

Slide 48

Slide 48 text

With graphs
Examples of graphs for an image. From left to right: original image, a symmetric 8-grid graph, 10-nearest-neighbor graphs (inside an 11 × 11 window, with color-based or 3 × 3 patch-based distances).
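Such a k-nearest-neighbor graph can be built from any feature vectors (colors or patches); a minimal, illustrative sketch with brute-force distances (the function name and symmetrization strategy are my own choices, and no window restriction is applied):

```python
import numpy as np

def knn_graph(X, k):
    """Symmetric adjacency of the k-nearest-neighbor graph on feature vectors X (n x d)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)   # pairwise distances
    np.fill_diagonal(D, np.inf)                                 # forbid self-loops
    A = np.zeros_like(D)
    nn = np.argsort(D, axis=1)[:, :k]                           # k closest per vertex
    A[np.repeat(np.arange(len(X)), k), nn.ravel()] = 1.0
    return np.maximum(A, A.T)                                   # make the graph undirected

rng = np.random.default_rng(6)
X = rng.random((30, 3))          # e.g. 30 pixels described by color features
A = knn_graph(X, 10)
```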

Slide 49

Slide 49 text

1. Introduction
2. Image Filtering
3. Discrete Calculus on graphs
4. Graph signal p-Laplacian regularization
5. Mathematical Morphology
6. From continuous MM to Active contours on graphs

Slide 50

Slide 50 text

p-Laplacian nonlocal regularization on graphs
Let f0 : V → R be the noisy version of a clean graph signal g : V → R defined on the vertices of a weighted graph G = (V, E, w). To recover g, we seek a function f : V → R regular enough on G and close enough to f0, via the following variational problem:
g ≈ arg min_{f : V→R} E*_{w,p}(f, f0, λ) = R*_{w,p}(f) + (λ/2) ∥f − f0∥₂², (13)
where the regularization functional R*_{w,p} : H(V) → R can correspond to an isotropic Rⁱ_{w,p} or an anisotropic Rᵃ_{w,p} functional.

Slide 51

Slide 51 text

Isotropic and anisotropic regularization terms
The isotropic regularization functional Rⁱ_{w,p} is defined by the L2 norm of the gradient and is the discrete p-Dirichlet form of the function f ∈ H(V):
Rⁱ_{w,p}(f) = (1/p) ∑_{vi∈V} ∥(∇w f)(vi)∥₂^p = (1/p) ⟨f, ∆ⁱ_{w,p} f⟩_{H(V)} = (1/p) ∑_{vi∈V} [ ∑_{vj∼vi} wij (f(vj) − f(vi))² ]^(p/2). (14)
The anisotropic regularization functional Rᵃ_{w,p} is defined by the Lp norm of the gradient:
Rᵃ_{w,p}(f) = (1/p) ∑_{vi∈V} ∥(∇w f)(vi)∥_p^p = (1/p) ⟨f, ∆ᵃ_{w,p} f⟩_{H(V)} = (1/p) ∑_{vi∈V} ∑_{vj∼vi} wij^(p/2) |f(vj) − f(vi)|^p. (15)
When p ≥ 1, the energy E*_{w,p} is a convex functional on H(V).

Slide 52

Slide 52 text

Isotropic/anisotropic diffusion processes
To get the minimizer, we consider the following system of equations:
∂E*_{w,p}(f, f0, λ) / ∂f(vi) = 0, ∀vi ∈ V (16)
which is rewritten as:
∂R*_{w,p}(f) / ∂f(vi) + λ (f(vi) − f0(vi)) = 0, ∀vi ∈ V. (17)
Moreover, we can prove that:
∂Rⁱ_{w,p}(f) / ∂f(vi) = 2 (∆ⁱ_{w,p} f)(vi) and ∂Rᵃ_{w,p}(f) / ∂f(vi) = (∆ᵃ_{w,p} f)(vi). (18)
The system of equations is then rewritten as:
( λ + ∑_{vj∼vi} (γ*_{w,p} f)(vi, vj) ) f(vi) − ∑_{vj∼vi} (γ*_{w,p} f)(vi, vj) f(vj) = λ f0(vi). (19)

Slide 53

Slide 53 text

Isotropic/anisotropic diffusion processes
We can use the linearized Gauss-Jacobi iterative method to solve the previous systems. Let n be an iteration step, and let f^(n) be the solution at step n. The method is given by the following algorithm:
f^(0) = f0
f^(n+1)(vi) = ( λ f0(vi) + ∑_{vj∼vi} (γ*_{w,p} f^(n))(vi, vj) f^(n)(vj) ) / ( λ + ∑_{vj∼vi} (γ*_{w,p} f^(n))(vi, vj) ), ∀vi ∈ V, (20)
with (γⁱ_{w,p} f)(vi, vj) = wij ( ∥(∇w f)(vj)∥₂^(p−2) + ∥(∇w f)(vi)∥₂^(p−2) ), (21)
and (γᵃ_{w,p} f)(vi, vj) = wij^(p/2) |f(vi) − f(vj)|^(p−2). (22)
It describes a family of discrete diffusion processes, parameterized by the structure of the graph (topology and weight function), the parameter p, and the parameter λ.
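For the anisotropic case with p = 2, the coefficients reduce to γ = wij and iteration (20) becomes a simple fixed-point scheme. A minimal sketch on a random dense graph (the example graph and function name are mine); its fixed point satisfies (λI + L) f = λ f0 with L = D − W:

```python
import numpy as np

def gauss_jacobi_glr(f0, W, lam, n_iter=500):
    """Diffusion (20) with gamma_ij = w_ij (anisotropic case, p = 2)."""
    f = f0.copy()
    d = W.sum(axis=1)                       # degree sum_j w_ij at each vertex
    for _ in range(n_iter):
        f = (lam * f0 + W @ f) / (lam + d)  # eq. (20), all vertices at once
    return f

rng = np.random.default_rng(3)
W = rng.random((6, 6)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
f0 = rng.random(6)
lam = 1.0
f = gauss_jacobi_glr(f0, W, lam)
# the fixed point solves (lam*I + L) f = lam*f0 with L = D - W
L = np.diag(W.sum(axis=1)) - W
assert np.allclose((lam * np.eye(6) + L) @ f, lam * f0, atol=1e-6)
```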

Slide 54

Slide 54 text

Different kinds of regularizers
Depending on p, we obtain different kinds of regularizers. Let's start with p = 2:
▶ Rⁱ_{w,2}(f) = ½ ∑_{vi∈V} ∑_{vj∼vi} wij (f(vj) − f(vi))² = fᵀ L f. This is the Graph Laplacian Regularizer (GLR) with L = D − W.
▶ The signal is smooth if the GLR is small: for large edge weights, f(vj) and f(vi) must be similar; for small edge weights, they can differ significantly.
▶ It also possesses an interpretation in the frequency domain: fᵀ L f = ∑_{k=1..N} λ_k f̂_k², where λ_k is the k-th eigenvalue of L and f̂_k is the k-th coefficient of the Graph Fourier Transform (GFT) f̂ = Vᵀ f (with L = V Σ Vᵀ). The GLR is small if the signal energy is concentrated in low-frequency components.
▶ The regularization is linear and has a closed-form solution f* = λ (λI + L)⁻¹ f0 (Zhou, Schölkopf, '06): it performs adaptive low-pass filtering.
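The spectral identity fᵀLf = ∑ λ_k f̂_k² can be checked directly with an eigendecomposition; a small sketch on a random graph (example graph is mine):

```python
import numpy as np

rng = np.random.default_rng(4)
W = rng.random((6, 6)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
L = np.diag(W.sum(axis=1)) - W
f = rng.random(6)

# graph Fourier transform: eigendecomposition L = V Sigma V^T
lam_k, V = np.linalg.eigh(L)
f_hat = V.T @ f
# the GLR equals the signal energy weighted by the graph frequencies
assert np.isclose(f @ (L @ f), (lam_k * f_hat ** 2).sum())
```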

Slide 55

Slide 55 text

Different kinds of regularizers
Now p = 1!
▶ Rⁱ_{w,1}(f) = ∑_{vi∈V} ( ∑_{vj∼vi} wij (f(vj) − f(vi))² )^(1/2), called the isotropic Graph Total Variation (GTV)
▶ Rᵃ_{w,1}(f) = ∑_{vi∈V} ∑_{vj∼vi} wij^(1/2) |f(vj) − f(vi)|, called the anisotropic Graph Total Variation (GTV)
▶ The GTV is a stronger piecewise-smooth prior than the GLR
▶ There is no closed-form solution, as the GTV is non-differentiable
▶ It can be solved more efficiently than with the iterative Gauss-Jacobi process using primal-dual algorithms (Chambolle-Pock, ADMM) or Cut Pursuit (to be presented)
Note: in the GLR and GTV, the graph weights are fixed. There exist other priors where the weights are updated during the minimization (from works of Gene Cheung), called reweighted GLR and GTV.

Slide 56

Slide 56 text

Examples: Image denoising
Original image; noisy image (Gaussian noise with σ = 15), f0 : V → R³, PSNR = 29.38 dB.

Slide 57

Slide 57 text

Examples: Image denoising
        Isotropic G1, F^{f0}_0 = f0   Isotropic G7, F^{f0}_3   Anisotropic G7, F^{f0}_3
p = 2   PSNR = 28.52 dB               PSNR = 31.79 dB          PSNR = 31.79 dB
p = 1   PSNR = 31.25 dB               PSNR = 34.74 dB          PSNR = 31.81 dB

Slide 58

Slide 58 text

Examples: Mesh simplification
Original mesh; isotropic, p = 2; isotropic, p = 1; anisotropic, p = 1. f0 : V → R³.

Slide 59

Slide 59 text

Examples: Colored mesh simplification
Original colored mesh; λ = 1; λ = 0.5. f0 : V → R³.

Slide 60

Slide 60 text

Examples: Point cloud denoising
2D patches on 3D point clouds.

Slide 61

Slide 61 text

Examples: Point cloud denoising
Initial point cloud; noisy; spectral low-pass filter; nodal GTV filtering.

Slide 62

Slide 62 text

Examples: Colored point cloud denoising
Initial point cloud (127039 points); noisy; local graph (4-NNG); non-local graph (200-NNG, F^{f0}_9). f0 : V → R³.

Slide 63

Slide 63 text

Examples: Mesh denoising
The anisotropic GTV prior on surface normals is used over an 80-NN graph with Gaussian weights on vertex coordinates. Original; noisy; denoised.

Slide 64

Slide 64 text

Examples: Image database denoising
Initial data; noisy data; 10-NNG. f0 : V → R^(16×16).

Slide 65

Slide 65 text

Examples: Image database denoising
                     λ = 1             λ = 0.01          λ = 0
Isotropic p = 1      PSNR = 18.80 dB   PSNR = 13.54 dB   PSNR = 10.52 dB
Anisotropic p = 1    PSNR = 18.96 dB   PSNR = 15.19 dB   PSNR = 14.41 dB

Slide 66

Slide 66 text

Hierarchical decomposition of graph signals
▶ The graph signal is decomposed into a base layer and several detail layers, each capturing a given level of detail: f = ∑_{i=0..l−2} fi + d_{l−1}
▶ Each layer is obtained by fi = GTVR(d_{i−1}), with d_{−1} = f and di = d_{i−1} − fi
▶ The sequence of scales is increasing: λ0 < λ1 < · · · < λ_{l−2}
▶ The signal can be reconstructed from the hierarchical decomposition with a linear manipulation of the layers: f̂(vk) = f0(vk) + ∑_{i=1..l−1} αi fi(vk), with f_{l−1} = d_{l−1}
▶ A layer is boosted if αi > 1.
▶ This is an extension to graph signals on arbitrary graphs of the hierarchical framework proposed by (Tadmor, Nezzar, Vese, '04)
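The layer/residual recursion above can be sketched generically. This is only an illustration: the toy moving-average smoother below stands in for the GTV regularizer (GTVR), and the blending formula in `smooth` is my own invention; by construction the layers plus the final residual always reconstruct the signal exactly:

```python
import numpy as np

def hierarchical_decompose(f, regularize, lambdas):
    """f = sum_i f_i + d_{l-1}: layer f_i = regularize(d_{i-1}, lambda_i), d_i = d_{i-1} - f_i."""
    layers, d = [], f
    for lam in lambdas:
        fi = regularize(d, lam)
        layers.append(fi)
        d = d - fi                      # residual passed to the next, finer scale
    return layers, d

def smooth(x, lam):
    """Toy stand-in for GTVR: blend of the signal and its 3-tap moving average."""
    alpha = 1.0 / (1.0 + lam)           # larger lam keeps more of the data
    k = np.ones(3) / 3
    return alpha * np.convolve(x, k, mode="same") + (1 - alpha) * x

x = np.sin(np.linspace(0, 3, 50)) + 0.1 * np.random.default_rng(5).standard_normal(50)
layers, residual = hierarchical_decompose(x, smooth, [0.1, 1.0, 10.0])
# exact reconstruction: base + detail layers + residual give back the signal
assert np.allclose(sum(layers) + residual, x)
```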

Slide 67

Slide 67 text

Example of decomposition
Nonlocal decomposition with a 10-NNG, F^{f0}_2.

Slide 68

Slide 68 text

Hierarchical decomposition by iterative regularization
Original image; removing layers u1 to u3 and u6 to u9 removes acne and freckles, respectively.

Slide 69

Slide 69 text

Mesh enhancement
Original mesh; coarse mesh; intermediate mesh; enhanced mesh.

Slide 70

Slide 70 text

High-quality colored mesh enhancement
553053 vertices, 1105611 faces. Original scan; enhanced scan.

Slide 71

Slide 71 text

Interpolation of missing data on graphs
Let f0 : V0 → R be a function with V0 ⊂ V the subset of vertices of the whole graph with known values. Interpolation consists in recovering the values of f for the vertices of V \ V0, given the values on V0, and is formulated as:
min_{f : V→R} R*_{w,p}(f) + λ(vi) ∥f(vi) − f0(vi)∥₂². (23)
Since f0(vi) is known only for the vertices of V0, the Lagrange parameter is defined as λ : V → R:
λ(vi) = λ if vi ∈ V0, 0 otherwise. (24)
This comes down to considering ∆*_{w,p} f(vi) = 0 on V \ V0. The isotropic and anisotropic diffusion processes can be used directly to perform the interpolation.
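The masked-λ diffusion can be sketched with the p = 2 anisotropic iteration (20), where γ = wij; a minimal illustration on a 5-vertex path graph with known endpoints (example setup is mine). With a large λ the known values are pinned, and the interior converges to the harmonic (linear) interpolant:

```python
import numpy as np

def interpolate(f0, known, W, lam=1.0, n_iter=2000):
    """Diffusion with lambda(vi) = lam on V0 (known mask) and 0 elsewhere (eq. 24)."""
    lam_v = np.where(known, lam, 0.0)
    f = np.where(known, f0, 0.0)        # unknown values start at 0
    d = W.sum(axis=1)
    for _ in range(n_iter):
        f = (lam_v * f0 + W @ f) / (lam_v + d)
    return f

# path graph 0-1-2-3-4 with both endpoints known
n = 5
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
f0 = np.array([0.0, 0.0, 0.0, 0.0, 4.0])
known = np.array([True, False, False, False, True])
f = interpolate(f0, known, W, lam=1e6)  # interior becomes the linear ramp 0,1,2,3,4
```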

Slide 72

Slide 72 text

Examples: Image segmentation
Solve ∆*_{w,p} f(vi) = 0 on V \ V0.
(a) 27 512 pixels; (b) original + labels; (c) t = 50 (11 seconds); (d) 639 zones (98% reduction); (e) original + labels; (f) t = 5 (< 1 second); (g) 639 zones (98% reduction); (h) original + labels; (i) t = 2 (< 1 second)

Slide 73

Slide 73 text

Examples: Database clustering
Example of label propagation on a point cloud for semi-supervised data classification. (a) Initial data with the initial seeds; (b) the associated 20-NN graph. Bottom row: evolution of the classification as the number of iterations n grows (n = 1, 5, 10).
An initial image with seed labels serves as the test image for nonlocal isotropic and anisotropic semi-supervised segmentation. The isotropic and anisotropic semi-supervised classification models are defined with graphs and unify local and nonlocal configurations in the context of image processing. Nonlocal segmentation with patches remains little studied and used in the literature, with only a few recent works defined in the continuous domain. This type of image segmentation compares the isotropic and anisotropic models with different graphs and different values of the parameter p (for p = 2, the isotropic and anisotropic models coincide).
!! ! ! ! ! ! ! ! ! ! ! 50 100 150 200 250 60 80 100 120 140 160 180 200 (a) Données initiales avec les marqueurs initiaux ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !!! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! 50 100 150 200 250 60 80 100 120 140 160 180 200 (b) 20-Nng n = 1 n = 5 n = 10 ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! !!! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! !! 50 100 150 200 250 60 80 100 120 140 160 180 200 ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! 50 100 150 200 250 60 80 100 120 140 160 180 200 ! ! ! ! ! ! 
! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! !! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! 50 100 150 200 250 60 80 100 120 140 160 180 200 Fig. 3.16 – Exemple de propagation de labels sur un nuage de points pour la clas- sification semi supervisée de données. (a) : données initiales avec les marqueurs initiaux et (b) : le graphe associé. À la dernière ligne : évolution de la classification en fonction du nombre d’itérations n. Fig. 3.17 – Image initiale avec marqueurs initiaux servant d’image de test pour la segmentation semi supervisée non locale isotrope et anisotrope présentée dans la figure 3.18. Nos modèles isotrope et anisotrope de la classification semi supervisée sont défi- nis avec des graphes et unifient configurations locales et non locales dans le contexte du traitement des images. Nous pouvons remarquer que la segmentation non locale avec des patchs est peu étudiée et utilisée dans la littérature. Nous pouvons citer quelques travaux récents définis dans le domaine continu [?,?]. La figure 3.18 illustre ce type de segmentation des images et compare les mo- dèles isotrope et anisotrope avec différents graphes et différentes valeurs du para- mètre p (dans le cas où le paramètre p = 2, les modèles isotrope et anisotrope sont Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 59 / 96

Slide 74

Slide 74 text

Examples: Image colorization Gray level image Color scribbles Weights are computed from the gray-level image, and interpolation is performed in a chrominance color space from the seeds: fc(vi) = ( fs1(vi)/fl(vi), fs2(vi)/fl(vi), fs3(vi)/fl(vi) )T Olivier Lézoray Graph signal processing : from images to arbitrary graphs 60 / 96

Slide 75

Slide 75 text

Examples: Image colorization p = 1, G1, Ff0 0 = f0 p = 1, G5, Ff0 2 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 61 / 96

Slide 76

Slide 76 text

Examples: 3D Point Cloud colorization p = 1, G25, Ff0 9 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 62 / 96

Slide 77

Slide 77 text

1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs 4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 63 / 96

Slide 78

Slide 78 text

Introduction - Mathematical Morphology (Algebraic) Fundamental operators in Mathematical Morphology (MM) are dilation and erosion. The dilation δ of a function f0 : Ω ⊂ R2 → R replaces the function value by the maximum value within a structuring element B: δB f0(x, y) = max{ f0(x + x′, y + y′) | (x′, y′) ∈ B }. The erosion ϵ is computed by: ϵB f0(x, y) = min{ f0(x + x′, y + y′) | (x′, y′) ∈ B } Olivier Lézoray Graph signal processing : from images to arbitrary graphs 64 / 96
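As a concrete sketch, the two operators above can be implemented with plain NumPy shifts. The function names `dilate`/`erode` and the offset-list encoding of B are illustrative, not from the talk:

```python
import numpy as np

def dilate(f, B):
    """Flat dilation: out[x, y] = max{ f(x+dx, y+dy) | (dx, dy) in B }.
    B is a list of integer offsets and should contain (0, 0) so that
    every pixel is covered; out-of-range offsets are simply skipped."""
    H, W = f.shape
    out = np.full(f.shape, -np.inf)
    for dx, dy in B:
        src = f[max(0, dx):H - max(0, -dx), max(0, dy):W - max(0, -dy)]
        dst = out[max(0, -dx):H - max(0, dx), max(0, -dy):W - max(0, dy)]
        np.maximum(dst, src, out=dst)   # writes through the view into `out`
    return out

def erode(f, B):
    """Flat erosion by duality: eps_B f = -delta_B(-f)."""
    return -dilate(-f, B)
```

Composing them gives the usual opening δ(ϵ(f)) and closing ϵ(δ(f)) used later in the talk.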

Slide 79

Slide 79 text

Introduction - Complete Lattice ▶ MM needs an ordering relation between vectors: a complete lattice (T , ≤) ▶ MM is problematic for multivariate data since there is no natural ordering for vectors ▶ The framework of h-orderings can be considered for that: construct a mapping h from T to L, where L is a complete lattice equipped with the conditional total ordering: h : T → L, v ↦ h(v), with ∀(vi, vj) ∈ T × T , vi ≤h vj ⇔ h(vi) ≤ h(vj). ▶ ≤h denotes such an h-ordering; in practice h is a dimensionality reduction operation h : Rn → Rp with p < n. ▶ Advantage: the learned lattice depends on the signal content and is more adaptive. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 65 / 96

Slide 80

Slide 80 text

Manifold-based ordering × Problem: the projection operator h cannot be linear, since a distortion of the space is inevitable! Olivier Lézoray Graph signal processing : from images to arbitrary graphs 66 / 96

Slide 81

Slide 81 text

Manifold-based ordering × Problem: the projection operator h cannot be linear, since a distortion of the space is inevitable! ✓ Solution: consider non-linear dimensionality reduction with Laplacian Eigenmaps, which amounts to learning the manifold where the vectors live. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 66 / 96

Slide 82

Slide 82 text

Laplacian Eigenmaps ▶ Laplacian Eigenmaps embeds the data T in a lower dimensional space in such a way that close/similar points in T remain close in the low dimensional space: h : Rn → Rp ▶ The similarity between vectors in the original space is encoded by the graph weights. ▶ Build a k-NN neighborhood graph on T ▶ Assign weights to edges with a Gaussian kernel ▶ Find Y = {y1, · · · , yN} ∈ Rp that minimizes E(Y) = Σi,j wij ∥yi − yj∥²2 = 2 YT L Y ▶ The result is given by the eigenvectors of the Laplacian L = D − W (the same as in spectral clustering) Olivier Lézoray Graph signal processing : from images to arbitrary graphs 67 / 96
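The steps above can be sketched in a few lines of NumPy. This is a toy implementation under stated assumptions: dense pairwise distances, a Gaussian kernel, and `numpy.linalg.eigh` on the unnormalized Laplacian L = D − W:

```python
import numpy as np

def laplacian_eigenmaps(X, k=5, sigma=1.0, p=2):
    """Embed the rows of X into R^p: k-NN graph, Gaussian edge weights,
    then the eigenvectors of L = D - W for the smallest non-zero
    eigenvalues (the first eigenvector is constant and is skipped)."""
    n = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]              # k nearest, skip self
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i]] = np.exp(-D2[i, idx[i]] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)                                # symmetrize the k-NN graph
    L = np.diag(W.sum(1)) - W
    _, vecs = np.linalg.eigh(L)                           # eigenvalues in ascending order
    return vecs[:, 1:p + 1]
```

With p = 1 this yields exactly the "map the graph to a line" picture of the next slide.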

Slide 83

Slide 83 text

Laplacian Eigenmaps This maps a graph to a line for each eigenvector Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 68 / 96

Slide 84

Slide 84 text

Manifold-based ordering × Problem: LE non-linear dimensionality reduction directly on the set T of vectors is not tractable in reasonable time: the eigenvector decomposition of L is O(N³), with N the number of vertices. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 69 / 96

Slide 85

Slide 85 text

Manifold-based ordering × Problem: LE non-linear dimensionality reduction directly on the set T of vectors is not tractable in reasonable time: the eigenvector decomposition of L is O(N³), with N the number of vertices. ✓ Solution: consider a more efficient strategy. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 69 / 96

Slide 86

Slide 86 text

Manifold-based ordering × Problem: LE non-linear dimensionality reduction directly on the set T of vectors is not tractable in reasonable time: the eigenvector decomposition of L is O(N³), with N the number of vertices. ✓ Solution: consider a more efficient strategy. Three-Step Strategy ▶ Dictionary Learning to produce a set D from the set of initial vectors T ▶ Laplacian Eigenmaps Manifold Learning on the dictionary D to obtain a projection operator hD ▶ Out-of-sample extension to extrapolate hD to T and define h Olivier Lézoray Graph signal processing : from images to arbitrary graphs 69 / 96
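A minimal sketch of the three-step strategy. All names (`kmeans`, `eigenmaps`, `learn_h`) are hypothetical; the real pipeline uses a proper dictionary-learning method and a Nyström-type out-of-sample extension, which is replaced here by a simple Gaussian-weighted interpolation of the dictionary embedding:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, m, iters=20):
    """Step 1 (stand-in for dictionary learning): m Lloyd-iteration centroids."""
    C = X[rng.choice(len(X), m, replace=False)]
    for _ in range(iters):
        a = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        for j in range(m):
            if (a == j).any():
                C[j] = X[a == j].mean(0)
    return C

def eigenmaps(D2, p, sigma):
    """Step 2: Laplacian Eigenmaps on the (small) dictionary graph."""
    W = np.exp(-D2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0)
    L = np.diag(W.sum(1)) - W
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:p + 1]

def learn_h(T, m=8, p=1, sigma=1.0):
    D = kmeans(T, m)                                      # dictionary
    D2 = ((D[:, None] - D[None]) ** 2).sum(-1)
    hD = eigenmaps(D2, p, sigma)                          # embedding of atoms
    def h(v):                                             # step 3: extension to any vector
        w = np.exp(-((D - v) ** 2).sum(-1) / (2 * sigma ** 2))
        return (w[:, None] * hD).sum(0) / w.sum()
    return h
```

The point of the construction: the O(N³) eigendecomposition is paid only on the m ≪ N dictionary atoms, and h extends to all of T in O(Nm).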

Slide 87

Slide 87 text

Graph signal representation Given the complete lattice (T , ≤h), a sorted permutation P of T is constructed: P = {v′1, · · · , v′m} with v′i ≤h v′i+1, ∀i ∈ [1, m − 1]. From the ordering, an index signal I : Ω ⊂ Z2 → [1, m] is defined as: I(pi) = {k | v′k = f(pi) = vi}. (Figure: an image of 256 colors, its index image, and the ordered palette (T , ≤h).) The pair (I, P) provides a new graph signal representation (the index and the palette of ordered vectors). The original signal f can be directly recovered since f(pi) = P[I(pi)] = vi. To process the graph signal: g(f)(vi) = P[g(I)(vi)], with g an operation applied on the index signal. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 70 / 96
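The (I, P) representation can be sketched as follows; a scalar `key` function stands in for the learned h-ordering of the previous slides:

```python
import numpy as np

def to_index_palette(f, key):
    """Split a signal f (n samples x d channels) into a palette P of the
    distinct vectors sorted by the total order `key`, and an index signal
    I such that f[i] == P[I[i]] for every sample."""
    vals, inv = np.unique(f, axis=0, return_inverse=True)  # distinct rows
    order = np.argsort([key(v) for v in vals])             # complete-lattice order
    P = vals[order]
    rank = np.empty(len(order), dtype=int)
    rank[order] = np.arange(len(order))                    # position of each row in P
    I = rank[inv].ravel()
    return I, P
```

Any rank-based operation g (min, max, median, ...) then acts on the scalar index image I, and `P[g(I)]` maps the result back to vectors, which is exactly how the morphological operators of the next slide are defined.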

Slide 88

Slide 88 text

Algebraic MM for graph signals The erosion and the dilation of a signal f on a graph at node vi ∈ V with a structuring element Bk ⊂ V are: ϵBk(f)(vi) = P[∧{I(vj), vj ∈ Bk(vi)}] and δBk(f)(vi) = P[∨{I(vj), vj ∈ Bk(vi)}]. A structuring element Bk(vi) contains the k-hop nodes of vi: B1(vi) = {vj ∼ vi} ∪ {vi}, and Bk(vi) = Bk−1(vi) ∪ ( ∪vl∈Bk−1(vi) B1(vl) ) if k ≥ 2. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 71 / 96
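A sketch of these operators on an adjacency-list graph. The helper names are illustrative; the palette P may hold scalars or vectors, since only the index is compared:

```python
import numpy as np

def k_hop(adj, i, k):
    """B_k(v_i): vertices within k hops of v_i (including v_i itself)."""
    B = {i}
    for _ in range(k):
        B |= {j for v in B for j in adj[v]}
    return B

def graph_dilate(I, P, adj, k=1):
    """delta_Bk(f)(v_i) = P[max of I over B_k(v_i)]."""
    return np.array([P[max(I[j] for j in k_hop(adj, i, k))] for i in range(len(I))])

def graph_erode(I, P, adj, k=1):
    """eps_Bk(f)(v_i) = P[min of I over B_k(v_i)]."""
    return np.array([P[min(I[j] for j in k_hop(adj, i, k))] for i in range(len(I))])
```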

Slide 89

Slide 89 text

Examples of obtained representations Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 72 / 96

Slide 90

Slide 90 text

Examples of obtained representations Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 72 / 96

Slide 91

Slide 91 text

Processing examples Original image f ϵBk (f) δBk (f) γBk (f) = δBk (ϵBk (f)) ϕBk (f) = ϵBk (δBk (f)) Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 73 / 96

Slide 92

Slide 92 text

Processing examples Original colored mesh f ϵBk (f) δBk (f) γBk (f) = δBk (ϵBk (f)) ϕBk (f) = ϵBk (δBk (f)) Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 73 / 96

Slide 93

Slide 93 text

Image and Mesh abstraction Performed with an OCCO filter. Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 74 / 96

Slide 94

Slide 94 text

Image and Mesh abstraction Performed with an OCCO filter. Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 74 / 96

Slide 95

Slide 95 text

Morphological Tone Mapping Durand & Dorsey MM Tone Mapping Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 75 / 96

Slide 96

Slide 96 text

Graph signal multi-layer decomposition We propose the following multi-layer morphological decomposition of a graph signal into l layers. The graph signal is decomposed into a base layer and several detail layers, each capturing a given scale of details.
d−1 = f, i = 0
while i < l do
Compute the graph signal representation at level i − 1: di−1 = (Ii−1, Pi−1)
Morphological filtering of di−1: fi = MFBl−i(di−1)
Compute the residual (detail layer): di = di−1 − fi
Proceed to next layer: i = i + 1
end while
Olivier Lézoray Graph signal processing : from images to arbitrary graphs 76 / 96
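A toy version of this decomposition for a 1-D scalar signal, with sliding-window erosions/dilations of decreasing half-width standing in for the graph structuring elements Bl−i (the per-layer recomputation of (Ii, Pi) is omitted since the signal is scalar, and all names are illustrative):

```python
import numpy as np

def win_op(f, r, op):
    """Sliding-window min/max of half-width r (flat 1-D structuring element)."""
    n = len(f)
    return np.array([op(f[max(0, i - r):i + r + 1]) for i in range(n)])

def occo(f, r):
    """Self-dual OCCO filter: mean of opening-of-closing and closing-of-opening."""
    def opening(g): return win_op(win_op(g, r, np.min), r, np.max)   # erosion then dilation
    def closing(g): return win_op(win_op(g, r, np.max), r, np.min)   # dilation then erosion
    return 0.5 * (opening(closing(f)) + closing(opening(f)))

def decompose(f, radii):
    """Multi-layer decomposition: filtered layers at decreasing scales
    plus a final residual, with f == sum(layers) + residual."""
    layers, d = [], f
    for r in radii:            # radii given in decreasing order
        fi = occo(d, r)
        layers.append(fi)
        d = d - fi             # detail layer becomes the next input
    return layers, d
```

The exact-reconstruction property f = Σ fi + d holds by construction (the loop telescopes), whatever filter is used.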

Slide 97

Slide 97 text

Graph signal multi-layer decomposition ▶ The graph signal can then be represented by f = Σi=0..l−2 fi + dl−1 ▶ To extract the successive layers in a coherent manner, the sequence of scales should be decreasing ▶ ➲ Bl−i is a sequence of structuring elements of decreasing sizes with i ∈ [0, l − 1] ▶ Each detail layer di is computed on a different set of vectors than the previous layer di−1 ▶ ➲ The graph signal representation (Ii, Pi) is computed for the successive layers ▶ The considered morphological filter should be suitable for a multi-scale analysis ▶ ➲ Use of the OCCO filter: OCCOBk(f) = (γBk(ϕBk(f)) + ϕBk(γBk(f))) / 2 Olivier Lézoray Graph signal processing : from images to arbitrary graphs 77 / 96

Slide 98

Slide 98 text

Decomposition examples f f0 f1 f2 f3 d3 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 78 / 96

Slide 99

Slide 99 text

Decomposition examples f f0 f1 f2 f3 d3 Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 78 / 96

Slide 100

Slide 100 text

Graph signal enhancement ▶ The graph signal can be enhanced by manipulating the different layers with specific coefficients and adding the modified layers altogether: ˆf(vk) = f0(vk) + M(vk) · Σi=1..l−1 Si(fi(vk)), with fl−1 = dl−1 (25) ▶ Each layer is manipulated by a nonlinear function Si(x) = 1/(1 + exp(−αi x)) for detail enhancement and tone manipulation. ▶ The parameter αi of the sigmoid is automatically determined and decreases as i increases: αi = α/(i + 1) ▶ A structure mask M prevents boosting noise and artifacts while enhancing the main structures. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 79 / 96
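A sketch of the enhancement step. One assumption is made explicit: the sigmoid is centred by subtracting 1/2, so that a zero detail maps to zero; the slide leaves this normalization implicit:

```python
import numpy as np

def enhance(layers, residual, alpha=30.0, mask=None):
    """Recombine base + detail layers, boosting each detail layer i with a
    centred sigmoid S_i(x) = 1/(1+exp(-alpha_i x)) - 1/2, alpha_i = alpha/(i+1),
    optionally gated by a structure mask M."""
    base, details = layers[0], layers[1:] + [residual]   # residual acts as layer l-1
    out = base.astype(float).copy()
    M = np.ones_like(out) if mask is None else mask
    for i, di in enumerate(details, start=1):
        a = alpha / (i + 1)                              # decreasing gain with depth
        out += M * (1.0 / (1.0 + np.exp(-a * di)) - 0.5)
    return out
```

Small detail values are amplified (the sigmoid is steeper than the identity near 0) while large ones saturate, which is what makes the boost artifact-resistant.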

Slide 101

Slide 101 text

Structure masks examples Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 80 / 96

Slide 102

Slide 102 text

Image sharpening Original (13.69), LLF (25.09), MF with linear coefficients (1, 1.25, 2.5) and without mask (20.19), our MF with mask (α = 30) (24.21) Olivier Lézoray Graph signal processing : from images to arbitrary graphs 81 / 96

Slide 103

Slide 103 text

Mesh sharpening Original (24.33), Unsharp Masking (30.69), our MF with mask (α = 20) (32.69) Olivier Lézoray Graph signal processing : from images to arbitrary graphs 82 / 96

Slide 104

Slide 104 text

Mesh sharpening Original (12.49), Unsharp Masking (15.30), our MF with mask (α = 20) (17.52) Olivier Lézoray Graph signal processing : from images to arbitrary graphs 82 / 96

Slide 105

Slide 105 text

1. Introduction 2. Image Filtering 3. Discrete Calculus on graphs 4. Graph signal p-Laplacian regularization 5. Mathematical Morphology 6. From continuous MM to Active contours on graphs Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 83 / 96

Slide 106

Slide 106 text

Mathematical Morphology: Continuous formulation Continuous-scale morphology defines the flat erosion and dilation of a function f0 : Ω ⊂ R2 → R by structuring sets B = {z ∈ R2 : ∥z∥p ≤ 1} through a general Partial Differential Equation that describes an evolution: ∂f/∂t = ∂tf = ±∥∇f∥p. The solution f(x, y, t) at time t > 0 provides the dilation (with the plus sign) or the erosion (with the minus sign) within a structuring element of size n∆t: δ(f) : ∂tf = +∥∇f∥p and ϵ(f) : ∂tf = −∥∇f∥p Dilation of a single point with a size of 100∆t, ∆t = 0.25 and p = 1, p = 2, and p = ∞. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 84 / 96

Slide 107

Slide 107 text

Morphological difference operators on graphs We introduce morphological (or upwind) difference operators (weighted directional operators): (d+w f)(vi, vj) = w(vi, vj)1/2 (max(f(vi), f(vj)) − f(vi)) and (d−w f)(vi, vj) = w(vi, vj)1/2 (f(vi) − min(f(vi), f(vj))), (26) with the following properties (both always positive): (d+w f)(vi, vj) = max(0, (dwf)(vi, vj)) and (d−w f)(vi, vj) = −min(0, (dwf)(vi, vj)), with the associated internal and external gradients: (∇±w f)(vi) = [(d±w f)(vi, vj) : ∀vj ∈ V]T. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 85 / 96
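These definitions translate directly to code. For a vertex i and a dense weight matrix W (names illustrative):

```python
import numpy as np

def upwind_gradients(f, W, i):
    """External/internal morphological differences at vertex i:
    (d_w^+ f)(v_i, v_j) = max(0, d_w f) and (d_w^- f)(v_i, v_j) = -min(0, d_w f),
    where (d_w f)(v_i, v_j) = w(v_i, v_j)^(1/2) (f(v_j) - f(v_i))."""
    dw = np.sqrt(W[i]) * (f - f[i])
    d_plus = np.maximum(0.0, dw)
    d_minus = -np.minimum(0.0, dw)
    return d_plus, d_minus

def grad_norm(g, p):
    """L_p norm of a gradient vector (p may be numpy.inf)."""
    return np.abs(g).max() if np.isinf(p) else (np.abs(g) ** p).sum() ** (1.0 / p)
```

Note that both difference vectors are non-negative componentwise, which is what makes the dilation/erosion PDEs of the next slides monotone.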

Slide 108

Slide 108 text

MM: Transcription on graphs Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀vi ∈ V, we define: δ : ∂tf(vi, t) = +∥(∇+w f)(vi, t)∥p ϵ : ∂tf(vi, t) = −∥(∇−w f)(vi, t)∥p Olivier Lézoray Graph signal processing : from images to arbitrary graphs 86 / 96

Slide 109

Slide 109 text

MM: Transcription on graphs Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀vi ∈ V, we define: δ : ∂tf(vi, t) = +∥(∇+w f)(vi, t)∥p ϵ : ∂tf(vi, t) = −∥(∇−w f)(vi, t)∥p A ⊂ V Olivier Lézoray Graph signal processing : from images to arbitrary graphs 86 / 96

Slide 110

Slide 110 text

MM: Transcription on graphs Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀vi ∈ V, we define: δ : ∂tf(vi, t) = +∥(∇+w f)(vi, t)∥p ϵ : ∂tf(vi, t) = −∥(∇−w f)(vi, t)∥p - - - - - + + + + + + + + + - + A ⊂ V ∂+A = {vi ∉ A : ∃vj ∈ A with eij ∈ E} ∂−A = {vi ∈ A : ∃vj ∉ A with eij ∈ E} Olivier Lézoray Graph signal processing : from images to arbitrary graphs 86 / 96

Slide 111

Slide 111 text

MM: Transcription on graphs Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀vi ∈ V, we define: δ : ∂tf(vi, t) = +∥(∇+w f)(vi, t)∥p ϵ : ∂tf(vi, t) = −∥(∇−w f)(vi, t)∥p - - - - - + + + + + + + + + - + A ⊂ V ∂+A = {vi ∉ A : ∃vj ∈ A with eij ∈ E} ∂−A = {vi ∈ A : ∃vj ∉ A with eij ∈ E} Dilation: adding the vertices of ∂+A to A Olivier Lézoray Graph signal processing : from images to arbitrary graphs 86 / 96

Slide 112

Slide 112 text

MM: Transcription on graphs Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀vi ∈ V, we define: δ : ∂tf(vi, t) = +∥(∇+w f)(vi, t)∥p ϵ : ∂tf(vi, t) = −∥(∇−w f)(vi, t)∥p - - - - - + + + + + + + + + - + A ⊂ V ∂+A = {vi ∉ A : ∃vj ∈ A with eij ∈ E} ∂−A = {vi ∈ A : ∃vj ∉ A with eij ∈ E} Dilation: adding the vertices of ∂+A to A Erosion: removing the vertices of ∂−A from A Olivier Lézoray Graph signal processing : from images to arbitrary graphs 86 / 96

Slide 113

Slide 113 text

MM: Transcription on graphs PDE MM: δ : ∂tf(x, t) = +∥∇f(x, t)∥p ϵ : ∂tf(x, t) = −∥∇f(x, t)∥p Transcription on graphs Given G = (V, E, w), f0 : V → R, f(·, 0) = f0, ∀vi ∈ V, we define: δ : ∂tf(vi, t) = +∥(∇+w f)(vi, t)∥p ϵ : ∂tf(vi, t) = −∥(∇−w f)(vi, t)∥p Since we can prove that for any level fl of f, we have: ∥(∇wfl)(vi)∥p = ∥(∇+w fl)(vi)∥p if vi ∈ ∂+Al, and ∥(∇−w fl)(vi)∥p if vi ∈ ∂−Al. (27) Iterative algorithms with discretization in time: f0 : V → R, f(n)(vi) ≈ f(vi, n∆t), f(n+1)(vi) = f(n)(vi) ± ∆t∥(∇±w f(n))(vi)∥p, f(0)(vi) = f0(vi) Olivier Lézoray Graph signal processing : from images to arbitrary graphs 87 / 96
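The time-discretized scheme can be sketched as follows (dense weight matrix W, illustrative function name):

```python
import numpy as np

def pde_morpho(f0, W, sign=+1, p=2, dt=0.25, n=10):
    """Iterative graph dilation (sign=+1) or erosion (sign=-1):
    f^(n+1)(v_i) = f^(n)(v_i) +/- dt * ||(grad_w^+/- f^(n))(v_i)||_p."""
    f = f0.astype(float).copy()
    for _ in range(n):
        diff = np.sqrt(W) * (f[None, :] - f[:, None])      # (d_w f)(v_i, v_j)
        g = np.maximum(0.0, diff) if sign > 0 else -np.minimum(0.0, diff)
        norm = (g ** p).sum(axis=1) ** (1.0 / p)           # ||grad^+/-||_p per vertex
        f = f + sign * dt * norm
    return f
```

On a small path graph a peak spreads under dilation and shrinks under erosion, and local maxima stay fixed under dilation (their upwind gradient is zero), mirroring the continuous behaviour.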

Slide 114

Slide 114 text

Example: adaptive image MM processing Dilation Closing Algebraic PDE Dilation Closing Weighted Non local patch Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 88 / 96

Slide 115

Slide 115 text

Example: image database MM processing f0 : V → R256 Dilation Erosion Opening k-NNG Initial Olivier Lézoray Graph signal processing : from images to arbitrary graphs 89 / 96

Slide 116

Slide 116 text

Front evolution on a graph ▶ A front Γ evolving on a graph G is defined as a subset Ω0 ⊂ V ▶ It is implicitly represented at time t by a level set function ft : V → {−1, +1} defined by ft = χΩt − χΩt ▶ The evolution of the front Γ is driven by a speed function F : V → R ▶ Depending on the sign of F, it adds or removes nodes from Ωt ▶ This is described by δf(vi, t)/δt = F(vi)∥(∇wf)(vi, t)∥p^p (28) ▶ This evolution equation can then be expressed as a combination of two morphological erosion and dilation processes: δf(vi, t)/δt = max(F(vi, t), 0)∥(∇+w f)(vi)∥p^p + min(F(vi, t), 0)∥(∇−w f)(vi)∥p^p (29) Olivier Lézoray Graph signal processing : from images to arbitrary graphs 90 / 96
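Equation (29) gives the following sketch (dense weights, p = 1 in the tests; names illustrative):

```python
import numpy as np

def evolve_front(f0, W, F, dt=0.5, n=20, p=1):
    """Level-set front evolution on a graph: the region {f > 0} grows
    where F > 0 (dilation term) and shrinks where F < 0 (erosion term)."""
    f = f0.astype(float).copy()
    for _ in range(n):
        diff = np.sqrt(W) * (f[None, :] - f[:, None])
        gp = (np.maximum(0.0, diff) ** p).sum(1)       # ||grad^+ f||_p^p
        gm = ((-np.minimum(0.0, diff)) ** p).sum(1)    # ||grad^- f||_p^p
        f = f + dt * (np.maximum(F, 0) * gp + np.minimum(F, 0) * gm)
    return f
```

With F ≡ +1 this reduces to pure dilation of the initial region, with F ≡ −1 to pure erosion; a data-dependent F (as in the active-contour slide that follows) mixes both locally.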

Slide 117

Slide 117 text

Front evolution example Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 91 / 96

Slide 118

Slide 118 text

Graph Signal Active contours Proposed adaptation on graphs of geometric and "without edges" active contours Our adaptation ▶ Given a level set function ft : V → {−1, +1} encoding the inside/outside of the evolving propagating front, the solution is obtained by ft+1(vi) = ft(vi) + ∆t · δf(vi, t)/δt ▶ We express front propagation on graphs as δf(vi, t)/δt = F(vi, t)∥(∇wf)(vi, t)∥p^p, with F(vi, t) a speed function ▶ We propose a front propagation function that solves the considered active contours with discrete calculus: F(vi, t) = νg(vi) + µg(vi)(κwf)(vi, t) − (λ1/dvi) d²(F^f0_ρ(vi), F^c1_ρ) + (λ2/dvi) d²(F^f0_ρ(vi), F^c2_ρ) ▶ We consider local patches F^f0_ρ(vj) on a ρ-hop subgraph to represent the regions (instead of a vertex-based signal average) ▶ The potential function g(vi) differentiates the most salient structures of the graph using patch comparison, as in MM enhancement Olivier Lézoray Graph signal processing : from images to arbitrary graphs 92 / 96

Slide 119

Slide 119 text

Results Grid Graph Signals (a) (b) (c) (d) (e) (f) Figure: From left to right: (a) Original image, (b) Checkerboard initialization, (c) GSAC; g(vi ) = 1, ρ = 0, (d) g(vi ), (e) GSAC; g(vi ), ρ = 0, (f) GSAC; g(vi ), ρ = 1. ➠ Demo Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 93 / 96

Slide 120

Slide 120 text

Results 3D Colored Meshes (a) (b) (c) (d) (e) (f) (g) (h) Figure: From top to bottom, left to right: (a) Original mesh, (b) g(vi) (inverted), (c) Checkerboard initialization, (d) GSAC; g(vi), ρ = 0, (e) GSAC; g(vi), ρ = 2, (f) manual initialization, (g) extracted region with GSAC; g(vi), ρ = 2, (h) re-colorization of the extracted region. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 94 / 96

Slide 121

Slide 121 text

Results Image Dataset Graph Figure: Classification of a subset (digits 0 and 3) of the MNIST dataset. The colors around each image show the class it is assigned to. The top row shows the initialization and the bottom row the final classification. Digit vs 0: 1: 98.8, 2: 95.8, 3: 97.0, 4: 97.05, 5: 90.7, 6: 95.55, 7: 96.75, 8: 95.95, 9: 96.25 Table: Classification scores for the 0 digit versus each other digit of the MNIST database. Olivier Lézoray Graph signal processing : from images to arbitrary graphs 95 / 96

Slide 122

Slide 122 text

The end [thank you] Any Questions ? [email protected] https://lezoray.users.greyc.fr Olivier L´ ezoray Graph signal processing : from images to arbitrary graphs 96 / 96