Slide 1

Robocalypse: Controlling Nodebots with Kinect gestures. A Full Stack 2015 London presentation by Joel Lord.

Slide 2

Overview •  About Me •  Why •  The Goals •  What Exists •  What I Needed •  Gestures 101 •  Demo

Slide 3

About Me

Slide 4

About Me Joel Lord, Development Manager. JavaScript junkie, technology enthusiast, IoT hobbyist. Macadamian: http://www.macadamian.com Twitter: @joel__lord GitHub: github.com/joellord

Slide 5

Stereotypes I have never fought a polar bear with my bare hands

Slide 6

Stereotypes My yard last April

Slide 7

Why?

Slide 8

Why •  Worked with Nodebots for a while now •  Wanted to explore other possibilities of Node.js •  Because controlling a robot with a Kinect is pretty cool

Slide 9

The Goals

Slide 10

The Goals •  Needed a way to use events to tell the robot to perform an action •  Gestures seemed the best way to achieve this

Slide 11

What currently exists

Slide 12

What currently exists Basic Nodebot Stack •  Node.js (https://nodejs.org/) •  Socket.io (http://socket.io/) •  Johnny-Five (http://johnny-five.io/)
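
As a rough illustration of how these three pieces usually fit together, here is a minimal sketch (not the demo code): Johnny-Five drives the hardware once the board is ready, and Socket.io listens for commands to act on. The port, pin numbers and command names ("forward", "stop") are assumptions, not taken from the talk.

var five = require("johnny-five");
var io = require("socket.io")(3000);          // port is an assumption

var board = new five.Board();

board.on("ready", function () {
  // Two continuous-rotation servos used as wheels (pins are assumptions)
  var left = new five.Servo({ pin: 9, type: "continuous" });
  var right = new five.Servo({ pin: 10, type: "continuous" });

  io.on("connection", function (socket) {
    socket.on("forward", function () {        // hypothetical command names
      left.cw();
      right.ccw();
    });
    socket.on("stop", function () {
      left.stop();
      right.stop();
    });
  });
});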

Slide 13

What currently exists node-openni •  Connects the Kinect to Node.js using OpenNI •  Very basic events for almost everything •  No gestures!
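
For reference, using node-openni looks roughly like the sketch below: one low-level event per tracked joint, carrying the user id and raw coordinates, and nothing higher level than that. The exact event names and callback signature here are assumptions from memory; the node-openni README is the authority.

var openni = require("openni");
var context = openni();

// One event per joint, with the user id and the raw coordinates
context.on("head", function (user, x, y, z) {
  console.log("User %d head at (%d, %d, %d)", user, x, y, z);
});

context.on("right_hand", function (user, x, y, z) {
  // A full skeleton, and any notion of a gesture, has to be
  // assembled on top of these low-level joint events.
});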

Slide 14

What I needed

Slide 15

What I Needed •  Support for a full “skeleton” •  Support for gestures

Slide 16

What I Needed

Slide 17

Gestures 101

Slide 18

Gestures 101 The Basics •  Is the initial condition met? •  Is the condition still met? •  Is the final condition met?
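
Concretely, those three questions become three small predicates over the current skeleton. The example below is a hypothetical "raise right hand" gesture; the skeleton shape (joints with x, y, z coordinates) and the threshold are assumptions for illustration only.

var DEPTH_TOLERANCE = 200; // mm, arbitrary

function initialConditionMet(skeleton) {
  // The hand starts below the head
  return skeleton.right_hand.y < skeleton.head.y;
}

function conditionStillMet(skeleton) {
  // The hand stays roughly in front of the body while moving
  return Math.abs(skeleton.right_hand.z - skeleton.torso.z) < DEPTH_TOLERANCE;
}

function finalConditionMet(skeleton) {
  // The gesture completes once the hand ends up above the head
  return skeleton.right_hand.y > skeleton.head.y;
}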

Slide 19

Gestures 101 What do I need for gestures? •  A skeleton with an event when it changes •  Know the position of the COM (centre of mass) •  Have a “base unit”
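
One way to get the last two ingredients from the skeleton, sketched under the assumption that the torso joint stands in for the COM and that the head-to-torso distance makes a reasonable base unit, so thresholds scale with the user's size and distance from the sensor:

function centreOfMass(skeleton) {
  // The torso joint is a reasonable stand-in for the centre of mass
  return skeleton.torso;
}

function baseUnit(skeleton) {
  // Head-to-torso distance: thresholds expressed in this unit keep
  // working whether the user is tall, short, close or far away
  var dx = skeleton.head.x - skeleton.torso.x;
  var dy = skeleton.head.y - skeleton.torso.y;
  var dz = skeleton.head.z - skeleton.torso.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}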

Slide 20

Gestures 101 Introducing kinect-gestures •  Gesture detection using a base class •  Tracks the skeleton for an initial condition •  Checks every 100ms to see if the condition is still met •  Triggers an event if the final condition is met •  Other classes are used to define the actual gestures
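
The sketch below shows the structure the slide describes, not the actual kinect-gestures API: a base class tracks the skeleton, polls every 100ms while a gesture is in progress, and emits an event when the final condition is met; concrete gestures only supply the three conditions. All names are illustrative.

var EventEmitter = require("events").EventEmitter;
var util = require("util");

function GestureBase(name) {
  EventEmitter.call(this);
  this.name = name;
  this.tracking = false;
}
util.inherits(GestureBase, EventEmitter);

// Feed every skeleton update into the gesture
GestureBase.prototype.update = function (skeleton) {
  this.skeleton = skeleton;
  if (!this.tracking && this.initialCondition(skeleton)) {
    this.tracking = true;
    this.startPolling();
  }
};

GestureBase.prototype.startPolling = function () {
  var self = this;
  this.timer = setInterval(function () {
    if (self.finalCondition(self.skeleton)) {
      self.emit(self.name);      // gesture completed
      self.stopPolling();
    } else if (!self.stillCondition(self.skeleton)) {
      self.stopPolling();        // gesture aborted
    }
  }, 100);                       // re-check every 100ms
};

GestureBase.prototype.stopPolling = function () {
  clearInterval(this.timer);
  this.tracking = false;
};

// A concrete gesture only defines the three conditions
function RaiseRightHand() {
  GestureBase.call(this, "raiseRightHand");
}
util.inherits(RaiseRightHand, GestureBase);
RaiseRightHand.prototype.initialCondition = function (s) { return s.right_hand.y < s.head.y; };
RaiseRightHand.prototype.stillCondition = function (s) { return true; };
RaiseRightHand.prototype.finalCondition = function (s) { return s.right_hand.y > s.head.y; };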

Slide 21

Robocalypse Demo Let’s look at some code!
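
The glue between the two halves of the demo is small enough to sketch here: whenever a gesture fires, a Socket.io client sends the matching command to the robot server from the stack slide. The server URL, gesture names and command mapping are all assumptions, not the demo code itself.

var io = require("socket.io-client");
var socket = io("http://localhost:3000");   // robot server address is an assumption

// Translate gesture events (e.g. from the GestureBase sketch above)
// into commands the Johnny-Five side understands
var commands = {
  raiseRightHand: "forward",                // hypothetical mapping
  raiseLeftHand: "stop"
};

function onGesture(name) {
  if (commands[name]) {
    socket.emit(commands[name]);
  }
}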

Slide 22

Next few steps •  Return a friendlier coordinate system for the skeleton •  Add more information to the returned events •  Add more gestures (jump, wave, kick)

Slide 23

Questions?

Slide 24

Thank you. Twitter: @joel__lord GitHub: joellord

Slide 25

macadamian.com