
Real-Time Data-Driven Interactive Rough Sketch Inking

We present an interactive approach for inking, which is the process of turning a pencil rough sketch into a clean line drawing. The approach, which we call the Smart Inker, consists of several "smart" tools that intuitively react to user input, while guided by the input rough sketch, to efficiently and naturally connect lines, erase shading, and fine-tune the line drawing output. Our approach is data-driven: the tools are based on fully convolutional networks, which we train to exploit both the user edits and inaccurate rough sketch to produce accurate line drawings, allowing high-performance interactive editing in real-time on a variety of challenging rough sketch images. For the training of the tools, we developed two key techniques: one is the creation of training data by simulation of vague and quick user edits; the other is a line normalization based on learning from vector data. These techniques, in combination with our sketch-specific data augmentation, allow us to train the tools on heterogeneous data without actual user interaction. We validate our approach with an in-depth user study, comparing it with professional illustration software, and show that our approach is able to reduce inking time by a factor of 1.8x while improving the results of amateur users.

Edgar Simo-Serra

August 15, 2018

Transcript

  1. Real-Time Data-Driven Interactive Rough Sketch Inking
     Edgar Simo-Serra, Satoshi Iizuka, Hiroshi Ishikawa
     Waseda University
     Wednesday, August 15, 2018
  2. Motivation
     “1. The inker’s main purpose is to translate the penciller’s graphite pencil lines into reproducible, black, ink lines. 2. The inker must honor the penciller’s original intent while adjusting any obvious mistakes. 3. The inker determines the look of the finished art.”
     — Gary Martin, The Art of Comic Book Inking [1997]
  3. Interactive Neural Networks
     • Feed-forward fully convolutional neural network
     • Input rough sketch and user edit are concatenated channel-wise
     [Figure: input rough sketch + user edit → model → output]
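As a concrete illustration of the channel-wise concatenation described on this slide, here is a minimal PyTorch sketch; the tiny stand-in network and the tensor sizes are placeholders, not the paper's model.

```python
import torch
import torch.nn as nn

# Stand-in for the fully convolutional inking network (takes 2 input channels).
model = nn.Sequential(
    nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
)

rough = torch.rand(1, 1, 512, 512)   # rough sketch, 1 grayscale channel in [0, 1]
edit = torch.ones(1, 1, 512, 512)    # user-edit channel (1 = no edit)
x = torch.cat([rough, edit], dim=1)  # channel-wise concatenation -> (1, 2, 512, 512)
output = model(x)                    # clean line drawing at the same resolution
```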
  4. Interactive Neural Networks - Related Work
     • User input is treated as an additional image channel
     • Training user input is sampled from ground truth
     • Grayscale image colorization [Sangkloy+ 2017, Zhang+ 2017]
     [Figure: Real-Time User-Guided Image Colorization with Learned Deep Priors, Richard Zhang et al., SIGGRAPH 2017]
  5. Interactive Neural Networks - Related Work
     • User input is treated as an additional image channel
     • Training user input is sampled from ground truth
     • Grayscale image colorization [Sangkloy+ 2017, Zhang+ 2017]
     • Not directly applicable to the rough sketch inking problem
     [Figure: Input, User Edit, Baseline, Ours]
  6. Interactive Neural Networks - Related Work
     • User input is treated as an additional image channel
     • Training user input is sampled from ground truth
     • Grayscale image colorization [Sangkloy+ 2017, Zhang+ 2017]
     • Not directly applicable to the rough sketch inking problem
     • How to train an interactive network for inking?
     [Figure: Input, User Edit, Baseline, Ours]
  7. Proposed Framework
     • Main contributions
       • Line width normalization
       • Simulation of user edits
       • Three different smart tools
     • Evaluation with a perceptual user study
     [Figure: input rough sketch → Smart Inker canvas (Inker Pen, Inker Brush, Smart Eraser) → output; ©Krenz Cushart]
  8. Training Framework
     1. Line width normalization
     2. Simulation of user edits
     [Diagram: dataset → line normalization → training data → user edit simulation → train Smart Inker]
  9. Training Framework - Line width normalization
     1. Line width normalization
     2. Simulation of user edits
     [Diagram: dataset → line normalization → training data → user edit simulation → train Smart Inker]
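The paper learns line width normalization from vector data; as a rough, non-learned approximation of what a normalized target looks like, strokes can be thinned to their centerlines and re-dilated to a constant width. The function below is a hypothetical sketch using scikit-image, not the paper's method.

```python
import numpy as np
from skimage.morphology import skeletonize, dilation, disk

def normalize_line_width(line_img, width=2, threshold=0.5):
    """line_img: float array in [0, 1] with dark lines on a white background."""
    ink = line_img < threshold                    # binary mask of ink pixels
    centerline = skeletonize(ink)                 # 1-px-wide stroke centerlines
    uniform = dilation(centerline, disk(width))   # re-thicken to a constant width
    return 1.0 - uniform.astype(np.float32)       # back to dark-on-white float image
```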
  10. Training Framework - Simulation of user edits
      1. Line width normalization
      2. Simulation of user edits
      [Diagram: dataset → line normalization → training data → user edit simulation → train Smart Inker]
  11. Training Framework - Simulation of user edits
      [Figure: input data pair (line drawing, rough sketch) and sampled regions]
  12. Training Framework - Simulation of user edits
      [Figure: input data pair (line drawing, rough sketch), sampled regions, add edits and noise]
  13. Training Framework - Simulation of user edits
      [Figure: input data pair (line drawing, rough sketch), sampled regions, add edits and noise]
  14. Training Framework - Simulation of user edits
      [Figure: input data pair (line drawing, rough sketch), sampled regions, add edits and noise]
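A hedged sketch of the simulation step illustrated in slides 11–14: regions are sampled on the ground-truth line drawing, the strokes inside them are copied into a user-edit channel, and jitter and noise are added so the edits resemble vague, quickly drawn input rather than pixel-perfect strokes. Function names and parameter values are illustrative, not the paper's.

```python
import numpy as np

def simulate_user_edits(line_drawing, n_regions=4, patch=64, jitter=3, rng=None):
    """line_drawing: float array in [0, 1], dark lines on white background.
    Returns a user-edit channel of the same shape (1 = no edit)."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = line_drawing.shape
    edit = np.ones_like(line_drawing)
    for _ in range(n_regions):
        y = int(rng.integers(0, max(1, h - patch)))
        x = int(rng.integers(0, max(1, w - patch)))
        region = line_drawing[y:y + patch, x:x + patch]
        # Displace the copied strokes slightly to mimic imprecise drawing.
        dy, dx = rng.integers(-jitter, jitter + 1, size=2)
        ty = int(np.clip(y + dy, 0, h - patch))
        tx = int(np.clip(x + dx, 0, w - patch))
        edit[ty:ty + patch, tx:tx + patch] = np.minimum(
            edit[ty:ty + patch, tx:tx + patch], region)
    # Light noise so the simulated edits are not exact copies of the target.
    edit = np.clip(edit + rng.normal(0.0, 0.05, edit.shape), 0.0, 1.0)
    return edit
```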
  15. Smart Tools
      • Inker Brush
        • Sloppy and fast line manipulation
      [Figure: Input, Automatic, Edit, Ours]
  16. Smart Tools
      • Smart Eraser
        • Takes the rough sketch into account when erasing
      [Figure: Input, Automatic, Edit, Ours]
  17. Training
      • Using L1 loss: L(y, y∗) = ‖(y − y∗) ⊙ (1 + γ(1 − y∗))‖₁
      • Change weight of lines with γ
      [Figure: Input, [Simo-Serra+ 2016], Baseline, Ours; ©David Revoy www.davidrevoy.com]
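The weighted L1 loss above translates directly into code; here is a minimal PyTorch version (the value of γ is a placeholder, not the one used in the paper).

```python
import torch

def weighted_l1_loss(y, y_star, gamma=2.0):
    """y, y_star: tensors in [0, 1] with 0 = ink and 1 = background."""
    weight = 1.0 + gamma * (1.0 - y_star)          # emphasize line pixels (y* near 0)
    return torch.abs((y - y_star) * weight).sum()  # || (y - y*) ⊙ w ||_1
```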
  18. Model
      • Similar to the model of [Simo-Serra+ 2016]
      • 24-layer fully convolutional neural network
      • Number of filters optimized for real-time performance
      • Roughly three times the performance

      Approach             Parameters    1024²px   1512²px   2048²px   2560²px
      [Simo-Serra+ 2016]   44,551,425    238.8ms   562.4ms   984.7ms   1.59s
      Ours                 12,795,169     89.9ms   225.5ms   382.7ms   592.9ms
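The slide does not reproduce the exact 24-layer configuration, so the following is only a schematic of the model family it describes: an encoder-decoder fully convolutional network over the 2-channel (rough sketch + user edit) input, with strided downsampling and nearest-neighbor upsampling. Layer and filter counts here are placeholders, not the paper's.

```python
import torch.nn as nn

def conv(cin, cout, stride=1):
    return nn.Sequential(nn.Conv2d(cin, cout, 3, stride, 1), nn.ReLU(inplace=True))

class InkerFCN(nn.Module):
    """Schematic encoder-decoder FCN; not the paper's exact architecture."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            conv(2, 32), conv(32, 64, stride=2),   # downsample to 1/2 resolution
            conv(64, 128, stride=2),               # downsample to 1/4 resolution
            conv(128, 128), conv(128, 128),        # flat convolutions
            nn.Upsample(scale_factor=2, mode="nearest"), conv(128, 64),
            nn.Upsample(scale_factor=2, mode="nearest"), conv(64, 32),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),  # 1-channel line drawing
        )

    def forward(self, x):
        return self.net(x)
```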
  19. Dataset
      • 288 rough sketch and line drawing pairs
      • More challenging than previous works
  20. User Study
      • Timing comparison: proposed approach vs. Clip Studio Pro
      • Total of 10 users and 10 unique images
      • Each user processes 5 random images with each tool
      • Total average time of 2.8 hours per user
      • Overall 1.8× speed-up with ours
      [Figure: some of the images used in the user study]
  21. User Study
      [Plots: time (s), PT vs. Ours, for amateur and experienced users]
      [Figure: Input, Edit, Ours; ©David Revoy www.davidrevoy.com]
  22. User Study
      [Plots: time (s), PT vs. Ours, for amateur and experienced users]
      [Figure: Input, Edit, Ours; ©David Revoy www.davidrevoy.com, ©Krenz Cushart]
  23. To conclude
      • Interactive rough sketch inking framework
      • Line width normalization
      • User edit simulation
      http://hi.cs.waseda.ac.jp/~esimo/research/inking/
      ©Edgar Simo-Serra