Slide 1

Slide 1 text

SketchODE: Learning neural sketch representation in continuous time
Ayan Das (1,2), Yongxin Yang (1,3), Timothy Hospedales (1,3), Tao Xiang (1,2), Yi-Zhe Song (1,2)
(1) SketchX, CVSSP, University of Surrey, UK
(2) iFlyTek-Surrey Joint research centre on AI
(3) University of Edinburgh, UK
Accepted as a poster @ ICLR '22

Slide 2

Slide 2 text

Chirographic Data: Handwriting, Sketches etc.

Slide 3

Slide 3 text

Chirographic Data: Handwriting, Sketches etc.
• Usually represented as a sequence of discrete points

Slide 5

Slide 5 text

Chirographic Data: Handwriting, Sketches etc.
• Usually represented as a sequence of discrete points
• This disregards their true nature, which is continuous

Slide 7

Slide 7 text

Chirographic Data: Handwriting, Sketches etc.
• Usually represented as a sequence of discrete points
• This disregards their true nature, which is continuous
QuickDraw

Slide 8

Slide 8 text

Chirographic Data: Handwriting, Sketches etc.
• Usually represented as a sequence of discrete points
• This disregards their true nature, which is continuous
QuickDraw
VectorMNIST*
* VectorMNIST (newly introduced): a vectorized version of MNIST; see https://ayandas.me/sketchode
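
For readers unfamiliar with the discrete format, a small illustrative example (mine, not from the slides): datasets such as QuickDraw store each sketch as a sequence of (dx, dy, pen) triplets, which throws away the underlying continuous motion. The values below are made up.

```python
import numpy as np

# A made-up two-stroke sketch in the common "stroke-3" format:
# each row is (dx, dy, pen_lift); pen_lift = 1 marks the last point of a stroke.
points = np.array([
    [ 0.0,  0.0, 0],   # pen down, stroke 1 begins
    [ 5.0,  2.0, 0],
    [ 4.0, -1.0, 1],   # pen up: stroke 1 ends here
    [-3.0,  6.0, 0],   # stroke 2 begins (offset jumps to its start)
    [ 2.0,  2.0, 1],
])

# Absolute coordinates are recovered by cumulative summation of the offsets.
xy = np.cumsum(points[:, :2], axis=0)
print(xy)
```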

Slide 9

Slide 9 text

Representing continuous-time strokes
• Previous approaches

Slide 10

Slide 10 text

Representing continuous-time strokes
• Previous approaches
  • Bezier curves [2]
(Image taken from [2])

Slide 11

Slide 11 text

Representing continuous-time strokes
• Previous approaches
  • Bezier curves [2]
  • Differential Geometry [1]
  • etc.
[1] Emre Aksan, Thomas Deselaers, Andrea Tagliasacchi, and Otmar Hilliges. CoSE: Compositional Stroke Embeddings. NeurIPS, 2020.
[2] Ayan Das, Yongxin Yang, Timothy Hospedales, Tao Xiang, and Yi-Zhe Song. BezierSketch: A generative model for scalable vector sketches. ECCV, 2020.
(Images taken from [2] and [1])

Slide 12

Slide 12 text

Representing continuous-time strokes
• Previous approaches
  • Bezier curves [2]
  • Differential Geometry [1]
  • etc.
[1] Emre Aksan, Thomas Deselaers, Andrea Tagliasacchi, and Otmar Hilliges. CoSE: Compositional Stroke Embeddings. NeurIPS, 2020.
[2] Ayan Das, Yongxin Yang, Timothy Hospedales, Tao Xiang, and Yi-Zhe Song. BezierSketch: A generative model for scalable vector sketches. ECCV, 2020.
(Images taken from [2] and [1])
Bezier curve based stroke + Autoregressive Generation

Slide 13

Slide 13 text

Representing continuous-time strokes
• Previous approaches
  • Bezier curves [2]
  • Differential Geometry [1]
  • etc.
[1] Emre Aksan, Thomas Deselaers, Andrea Tagliasacchi, and Otmar Hilliges. CoSE: Compositional Stroke Embeddings. NeurIPS, 2020.
[2] Ayan Das, Yongxin Yang, Timothy Hospedales, Tao Xiang, and Yi-Zhe Song. BezierSketch: A generative model for scalable vector sketches. ECCV, 2020.
(Images taken from [2] and [1])
Diff. Geometry based stroke + Autoregressive Generation
Bezier curve based stroke + Autoregressive Generation
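
As a reminder of how a Bezier parameterisation like [2] represents a stroke continuously, here is a small illustrative Python sketch (my own, with made-up control points) that evaluates a cubic Bezier curve in Bernstein form.

```python
import numpy as np

def cubic_bezier(ctrl, t):
    """Evaluate a cubic Bezier stroke at parameter values t in [0, 1].
    ctrl: (4, 2) array of control points; t: (T,) array of parameter values."""
    t = t[:, None]
    return ((1 - t) ** 3 * ctrl[0]
            + 3 * (1 - t) ** 2 * t * ctrl[1]
            + 3 * (1 - t) * t ** 2 * ctrl[2]
            + t ** 3 * ctrl[3])          # (T, 2) points along the stroke

ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 2.5], [4.0, 0.5]])  # made-up control points
stroke = cubic_bezier(ctrl, np.linspace(0.0, 1.0, 50))
```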

Slide 14

Slide 14 text

Entire chirographic structure as one function

Slide 15

Slide 15 text

Entire chirographic structure as one function
• Treat chirographic structures (including pen-up events) as one continuous-time function 𝑠(𝑡) and model its derivative using a Neural ODE [1]
[1] Tian Qi Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. NeurIPS, 2018.

Slide 19

Slide 19 text

Entire chirographic structure as one function
• Treat chirographic structures (including pen-up events) as one continuous-time function 𝑠(𝑡) and model its derivative using a Neural ODE [1]
[1] Tian Qi Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. NeurIPS, 2018.
Model

Slide 20

Slide 20 text

Entire chirographic structure as one function
• Treat chirographic structures (including pen-up events) as one continuous-time function 𝑠(𝑡) and model its derivative using a Neural ODE [1]
[1] Tian Qi Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. NeurIPS, 2018.
Model
Solution/Inference
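
A minimal PyTorch sketch of this idea (my illustration, not the authors' released code): a small network defines the derivative ds/dt of 𝑠(𝑡), and the trajectory is recovered by integrating it from an initial state, here with a crude fixed-step Euler solver; in practice an adaptive ODE solver such as the one in torchdiffeq would be used. The names Dynamics and solve_ode are my own.

```python
import torch
import torch.nn as nn

class Dynamics(nn.Module):
    """f_theta(s, t): models the derivative ds/dt of the continuous-time sketch s(t)."""
    def __init__(self, dim=3, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                                 nn.Linear(hidden, dim))

    def forward(self, s, t):
        tcol = torch.full_like(s[:, :1], float(t))      # broadcast scalar time over the batch
        return self.net(torch.cat([s, tcol], dim=-1))

def solve_ode(f, s0, ts):
    """Fixed-step Euler integration of ds/dt = f(s, t) from the initial state s0."""
    s, traj = s0, [s0]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        s = s + (t1 - t0) * f(s, t0)
        traj.append(s)
    return torch.stack(traj, dim=1)                     # (batch, T, dim): the rendered s(t)

f = Dynamics()
s0 = torch.zeros(1, 3)                                  # e.g. (x, y, pen-state) at t = 0
s_t = solve_ode(f, s0, torch.linspace(0.0, 1.0, 100))
```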

Slide 21

Slide 21 text

Learn latent representations for continuous-time functions

Slide 22

Slide 22 text

Learn latent representations for continuous-time functions
• Use a "Neural CDE" [2] to encode the data and a "Neural ODE" [1] to decode it: an autoencoder setup.
[1] Tian Qi Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. NeurIPS, 2018.
[2] Patrick Kidger, James Morrill, James Foster, and Terry J. Lyons. Neural controlled differential equations for irregular time series. NeurIPS, 2020.

Slide 24

Slide 24 text

Learn latent representations for continuous-time functions
• Use a "Neural CDE" [2] to encode the data and a "Neural ODE" [1] to decode it: an autoencoder setup.
[1] Tian Qi Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. NeurIPS, 2018.
[2] Patrick Kidger, James Morrill, James Foster, and Terry J. Lyons. Neural controlled differential equations for irregular time series. NeurIPS, 2020.
Augmented ODE

Slide 25

Slide 25 text

Learn latent representations for continuous-time functions
• Use a "Neural CDE" [2] to encode the data and a "Neural ODE" [1] to decode it: an autoencoder setup.
[1] Tian Qi Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. NeurIPS, 2018.
[2] Patrick Kidger, James Morrill, James Foster, and Terry J. Lyons. Neural controlled differential equations for irregular time series. NeurIPS, 2020.
Augmented ODE
Second order
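
A rough PyTorch sketch of this encoder-decoder pairing, under simplifying assumptions of my own (first-order Euler discretisation, plain latent conditioning, and none of the slide's augmented or second-order refinements); the class names CDEEncoder and ODEDecoder are hypothetical. The encoder integrates dz = g_phi(z) dX driven by the observed trajectory X and returns its final state as the latent code; the decoder is a latent-conditioned Neural ODE unrolled in one shot.

```python
import torch
import torch.nn as nn

class CDEEncoder(nn.Module):
    """Encoder: integrate dz = g_phi(z) dX along the input trajectory (first-order steps)."""
    def __init__(self, in_dim=3, latent=16, hidden=64):
        super().__init__()
        self.z0 = nn.Linear(in_dim, latent)                 # initial hidden state from the first point
        self.g = nn.Sequential(nn.Linear(latent, hidden), nn.Tanh(),
                               nn.Linear(hidden, latent * in_dim))
        self.latent, self.in_dim = latent, in_dim

    def forward(self, X):                                   # X: (batch, T, in_dim) sampled sketch
        z = self.z0(X[:, 0])
        for k in range(X.shape[1] - 1):
            G = self.g(z).view(-1, self.latent, self.in_dim)    # matrix-valued field g_phi(z)
            dX = (X[:, k + 1] - X[:, k]).unsqueeze(-1)
            z = z + (G @ dX).squeeze(-1)                        # z_{k+1} = z_k + g_phi(z_k) dX_k
        return z                                                # latent code

class ODEDecoder(nn.Module):
    """Decoder: a latent-conditioned Neural ODE, integrated with Euler steps to reproduce s(t)."""
    def __init__(self, latent=16, out_dim=3, hidden=64):
        super().__init__()
        self.s0 = nn.Linear(latent, out_dim)
        self.f = nn.Sequential(nn.Linear(out_dim + latent, hidden), nn.Tanh(),
                               nn.Linear(hidden, out_dim))

    def forward(self, z, T=100, dt=0.01):
        s, traj = self.s0(z), []
        for _ in range(T):
            s = s + dt * self.f(torch.cat([s, z], dim=-1))
            traj.append(s)
        return torch.stack(traj, dim=1)                         # reconstructed trajectory

enc, dec = CDEEncoder(), ODEDecoder()
X = torch.randn(4, 100, 3)                                      # dummy batch of sampled sketches
recon = dec(enc(X), T=X.shape[1])
loss = ((recon - X) ** 2).mean()                                # simple reconstruction objective
```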

Slide 26

Slide 26 text

Full-sequence & Multi-stroke format • Exact format of 𝑠(𝑡) is important

Slide 27

Slide 27 text

Full-sequence & Multi-stroke format • Exact format of 𝑠(𝑡) is important • Either represent pen-ups as straight lines with a state bit

Slide 28

Slide 28 text

Full-sequence & Multi-stroke format • Exact format of 𝑠(𝑡) is important • Either represent pen-ups as straight lines with a state bit • Or, as a sequence of individual strokes

Slide 30

Slide 30 text

Full-sequence & Multi-stroke format
• Exact format of 𝑠(𝑡) is important
• Either represent pen-ups as straight lines with a state bit
• Or, as a sequence of individual strokes
The final state of the previous stroke is mapped to the initial state of the next stroke (check the paper for more details)

Slide 31

Slide 31 text

Full-sequence & Multi-stroke format
• Exact format of 𝑠(𝑡) is important
• Either represent pen-ups as straight lines with a state bit
• Or, as a sequence of individual strokes
The final state of the previous stroke is mapped to the initial state of the next stroke (check the paper for more details)
"events"
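
To make the first (full-sequence) option concrete, here is a small illustrative helper (my own, not from the paper) that flattens a list of strokes into one sequence, inserting straight-line pen-up segments flagged by a state bit; the multi-stroke option would instead keep the strokes as separate trajectories and chain their boundary states as described above.

```python
import numpy as np

def to_full_sequence(strokes):
    """strokes: list of (T_i, 2) arrays of absolute (x, y) points, one per stroke.
    Returns one (N, 3) array (x, y, pen): pen = 1 while drawing, pen = 0 on the
    straight-line pen-up segment that connects consecutive strokes."""
    rows = []
    for i, s in enumerate(strokes):
        pen = np.ones((len(s), 1))
        rows.append(np.hstack([s, pen]))                 # on-paper points
        if i + 1 < len(strokes):
            # straight-line "jump" to the start of the next stroke, flagged by pen = 0
            rows.append(np.array([[*strokes[i + 1][0], 0.0]]))
    return np.vstack(rows)

# Two made-up strokes; the multi-stroke format would simply keep this list as-is.
strokes = [np.array([[0., 0.], [1., 1.], [2., 1.]]),
           np.array([[3., 0.], [4., 2.]])]
full = to_full_sequence(strokes)
```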

Slide 32

Slide 32 text

Training/Implementation tricks

Slide 33

Slide 33 text

Training/Implementation tricks
• Sin/Cos activations for the dynamics functions
  • To capture high-frequency temporal changes in the trajectory

Slide 34

Slide 34 text

Training/Implementation tricks
• Sin/Cos activations for the dynamics functions
  • To capture high-frequency temporal changes in the trajectory
• Continuous noise augmentation
  • More intuitive in the continuous-time case
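
Both tricks are easy to sketch (my own illustration, not the released implementation): a dynamics network with sinusoidal activations whose frequency enters as an explicit scale factor, and an augmentation that perturbs the whole trajectory with a smooth, low-frequency random displacement instead of independent per-point noise. SinDynamics and smooth_noise_augment are hypothetical names.

```python
import math
import torch
import torch.nn as nn

class SinDynamics(nn.Module):
    """Dynamics MLP with sin activations; `freq` scales how fast the output can oscillate in time."""
    def __init__(self, dim=3, hidden=64, freq=1.0):
        super().__init__()
        self.freq = freq
        self.l1 = nn.Linear(dim + 1, hidden)
        self.l2 = nn.Linear(hidden, dim)

    def forward(self, s, t):
        tcol = torch.full_like(s[:, :1], float(t))
        h = torch.sin(self.freq * self.l1(torch.cat([s, tcol], dim=-1)))
        return self.l2(h)

def smooth_noise_augment(X, sigma=0.05, n_waves=4):
    """Continuous augmentation: perturb the trajectory X (batch, T, dim) with a random
    low-frequency sinusoidal displacement, so the augmented sketch stays smooth."""
    B, T, D = X.shape
    t = torch.linspace(0, 1, T).view(1, T, 1)
    noise = torch.zeros_like(X)
    for k in range(1, n_waves + 1):
        amp = sigma * torch.randn(B, 1, D) / k              # damp higher frequencies
        phase = 2 * math.pi * torch.rand(B, 1, D)
        noise = noise + amp * torch.sin(2 * math.pi * k * t + phase)
    return X + noise

X_aug = smooth_noise_augment(torch.randn(4, 100, 3))
```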

Slide 35

Slide 35 text

Reconstruction & Generation

Slide 36

Slide 36 text

Reconstruction & Generation • Faithful reconstruction, just like deterministic RNN-RNN

Slide 37

Slide 37 text

Reconstruction & Generation • Faithful reconstruction, just like deterministic RNN-RNN • Generation by injecting noise into latent space • RNN-RNNs break continuity due to autoregression.

Slide 38

Slide 38 text

Reconstruction & Generation
• Faithful reconstruction, just like deterministic RNN-RNN
• Generation by injecting noise into latent space
  • RNN-RNNs break continuity due to autoregression.
One-shot Generation
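
One way to read the noise-injection step, sketched with the hypothetical CDEEncoder/ODEDecoder from the earlier snippet: encode a reference sketch, perturb its latent code with Gaussian noise, and decode the perturbed code in a single ODE solve.

```python
import torch

def generate(enc, dec, X, sigma=0.5, T=100):
    """One-shot generation: perturb the latent code of a reference batch X and decode."""
    with torch.no_grad():
        z = enc(X)                                  # (batch, latent)
        z_noisy = z + sigma * torch.randn_like(z)   # inject noise in latent space
        return dec(z_noisy, T=T)                    # whole trajectory produced in one solve

# e.g. samples = generate(enc, dec, real_batch)
```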

Slide 39

Slide 39 text

Inherently smooth latent space

Slide 40

Slide 40 text

Inherently smooth latent space
• Due to the continuous nature of the latent-to-decoder mapping, SketchODE enjoys inherent continuity

Slide 42

Slide 42 text

Inherently smooth latent space
• Due to the continuous nature of the latent-to-decoder mapping, SketchODE enjoys inherent continuity
One-shot Interpolation
(Figure row labels: SketchODE, SketchODE, RNN-RNN, RNN-RNN)
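
The smoothness claim can be probed with a simple interpolation sweep (again a sketch built on the hypothetical encoder/decoder above): decoding convex combinations of two latent codes yields sketches that deform continuously from one input to the other.

```python
import torch

def interpolate(enc, dec, x_a, x_b, steps=8, T=100):
    """Decode latent codes along the line segment between two encoded sketches."""
    with torch.no_grad():
        z_a, z_b = enc(x_a), enc(x_b)
        outs = []
        for alpha in torch.linspace(0.0, 1.0, steps):
            z = (1 - alpha) * z_a + alpha * z_b      # convex combination in latent space
            outs.append(dec(z, T=T))
        return torch.stack(outs, dim=0)              # (steps, batch, T, dim)

# e.g. frames = interpolate(enc, dec, sketch_a, sketch_b)
```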

Slide 43

Slide 43 text

Abstraction effect by squeezing frequency

Slide 44

Slide 44 text

Abstraction effect by squeezing frequency
• We noticed a peculiar property: an "abstraction effect"

Slide 45

Slide 45 text

Abstraction effect by squeezing frequency
• We noticed a peculiar property: an "abstraction effect"
• Thanks to the periodic nature of the activations and their frequencies

Slide 46

Slide 46 text

Abstraction effect by squeezing frequency
• We noticed a peculiar property: an "abstraction effect"
• Thanks to the periodic nature of the activations and their frequencies
Decreasing frequency content

Slide 47

Slide 47 text

Abstraction effect by squeezing frequency
• We noticed a peculiar property: an "abstraction effect"
• Thanks to the periodic nature of the activations and their frequencies
Decreasing frequency content
Refer to the paper for more details
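
Assuming a sinusoidal dynamics module like the SinDynamics sketch above, the abstraction effect amounts to scaling its frequency factor down at decode time, which strips high-frequency detail from the generated trajectory; this is my reading of the slide, not the paper's exact procedure.

```python
import torch

def abstract_trajectory(f, s0, ts, squeeze=0.5):
    """Integrate the SinDynamics field `f` with its frequency squeezed by a factor < 1,
    producing a smoother, more 'abstract' version of the trajectory."""
    old_freq = f.freq
    f.freq = f.freq * squeeze              # lower the frequency content of the dynamics
    with torch.no_grad():
        s, traj = s0, [s0]
        for t0, t1 in zip(ts[:-1], ts[1:]):
            s = s + (t1 - t0) * f(s, t0)   # plain Euler steps, as in the earlier sketches
            traj.append(s)
    f.freq = old_freq                      # restore the original frequency
    return torch.stack(traj, dim=1)

# e.g. coarse = abstract_trajectory(SinDynamics(), torch.zeros(1, 3), torch.linspace(0.0, 1.0, 100))
```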

Slide 48

Slide 48 text

Thank You!
Check out the project page @ https://ayandas.me/sketchode