Robocalypse: Controlling Nodebots with Kinect gestures

Joel Lord
October 28, 2015

A Full Stack 2015 London presentation by Joel Lord. A link to the full video is to come.


Transcript

  1. Confidential, not for distribution. A Full Stack 2015 London presentation by Joel Lord. Robocalypse: Controlling Nodebots with Kinect gestures
  2. Overview: About Me, Why, The Goals, What Exists, What I Wanted, Gestures 101, Demo
  3. About Me

  4. About Me: Joel Lord, Development Manager. JavaScript junkie, technology enthusiast, IoT hobbyist. Macadamian (http://www.macadamian.com). Twitter: @joel__lord. GitHub: github.com/joellord
  5. Stereotypes: I have never fought a polar bear with my bare hands
  6. Stereotypes: My yard last April

  7. Why?

  8. Why
     • Worked with Nodebots for a while now
     • Wanted to explore other possibilities of NodeJs
     • Because controlling a robot with a Kinect is pretty cool
  9. The Goals

  10. The Goals
     • Needed a way to use events to tell the robot to perform an action
     • Gestures seemed the best way to achieve this
  11. What currently exists

  12. What currently exists: Basic Nodebot Stack
     • NodeJs (https://nodejs.org/)
     • Socket.io (http://socket.io/)
     • Johnny-Five (http://johnny-five.io/)
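
Not from the deck itself, but a minimal sketch of how those three pieces usually fit together. The servo on pin 9 and the "move" event name are assumptions made purely for illustration:

```javascript
// Minimal sketch of the basic Nodebot stack (illustrative, not from the talk).
// Node.js runs the process, Socket.io carries events, Johnny-Five drives the board.
const five = require("johnny-five");
const io = require("socket.io")(3000); // Socket.io server listening on port 3000

const board = new five.Board();

board.on("ready", () => {
  // A servo on pin 9 is an assumption made for the example.
  const servo = new five.Servo(9);

  io.on("connection", (socket) => {
    // "move" is a hypothetical event name a client could emit with an angle.
    socket.on("move", (angle) => servo.to(angle));
  });
});
```
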
  13. What currently exists: node-openni
     • Connects the Kinect to NodeJs using OpenNI
     • Very basic events for almost everything
     • No gestures!
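
For flavour, a sketch of the per-joint event style node-openni exposes. The event names used here ("newuser", "left_hand") are assumptions and may not match the library's actual identifiers, so check its README before relying on them:

```javascript
// Rough sketch of the per-joint event style described above.
// Event names are assumptions; verify them against node-openni's documentation.
const openni = require("openni");
const context = openni();

context.on("newuser", (userId) => {
  console.log("Kinect is now tracking user", userId);
});

// One plain event per joint, carrying raw coordinates and nothing more:
context.on("left_hand", (userId, x, y, z) => {
  console.log(`User ${userId} left hand at (${x}, ${y}, ${z})`);
});
```
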
  14. What I needed

  15. What I Needed
     • Support for a full “skeleton”
     • Support for gestures
  16. What I Needed

  17. Gestures 101

  18. Gestures 101: The Basics
     • Is the initial condition met?
     • Is the condition still met?
     • Is the final condition met?
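
Those three questions can be read as three predicates evaluated over successive skeleton frames. The sketch below is purely illustrative; the joint names and the "raise right hand" example are invented, not taken from the talk:

```javascript
// The three questions above, expressed as predicates over a series of frames.
function detectGesture(frames, { initial, still, final }) {
  if (frames.length < 2 || !initial(frames[0])) return false; // initial condition met?
  if (!frames.slice(1, -1).every(still)) return false;        // condition still met?
  return final(frames[frames.length - 1]);                    // final condition met?
}

// Hypothetical "raise right hand": starts below the shoulder, stays on the
// right-hand side of the body, ends above the head.
const raiseRightHand = {
  initial: (f) => f.rightHand.y < f.rightShoulder.y,
  still:   (f) => f.rightHand.x > f.torso.x,
  final:   (f) => f.rightHand.y > f.head.y,
};

// Usage: detectGesture(lastSecondOfFrames, raiseRightHand) -> true or false
```
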
  19. Gestures 101: What do I need for gestures?
     • A skeleton with an event when it changes
     • Know the position of the COM (center of mass)
     • Have a “base unit”
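
A sketch of how those last two measurements could be computed from a skeleton object keyed by joint name. The joint names and the choice of head-to-torso distance as the base unit are assumptions for illustration:

```javascript
// Center of mass: average the positions of all tracked joints (hypothetical
// skeleton shape: { head: {x,y,z}, torso: {x,y,z}, ... }).
function centerOfMass(skeleton) {
  const joints = Object.values(skeleton);
  const sum = joints.reduce(
    (acc, j) => ({ x: acc.x + j.x, y: acc.y + j.y, z: acc.z + j.z }),
    { x: 0, y: 0, z: 0 }
  );
  return {
    x: sum.x / joints.length,
    y: sum.y / joints.length,
    z: sum.z / joints.length,
  };
}

// "Base unit": here, the head-to-torso distance (an assumed choice).
function baseUnit(skeleton) {
  const { head, torso } = skeleton;
  return Math.hypot(head.x - torso.x, head.y - torso.y, head.z - torso.z);
}
```

Normalising thresholds by a per-user base unit like this keeps the same gesture working for users of different sizes and at different distances from the sensor.
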
  20. Gestures 101: Introducing kinect-gestures
     • Gesture detection using a base class
     • Tracks the skeleton for an initial condition
     • Checks every 100ms to see if the condition is still met
     • If the final condition is met, trigger an event
     • Other classes are used to define the actual gestures
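
The following is not the kinect-gestures source, just a sketch of the approach the slide describes: a base class that watches the skeleton, polls every 100ms while a gesture is in progress, and emits an event when the final condition is met, with subclasses supplying the actual conditions:

```javascript
// Sketch of the described approach, not the actual kinect-gestures code.
const EventEmitter = require("events");

class GestureBase extends EventEmitter {
  constructor(name) {
    super();
    this.name = name;
    this.timer = null;
    this.skeleton = null;
  }

  // Called by the skeleton tracker on every update.
  onSkeleton(skeleton) {
    this.skeleton = skeleton;
    if (!this.timer && this.initialCondition(skeleton)) {
      // Initial condition met: start checking every 100ms.
      this.timer = setInterval(() => this.check(), 100);
    }
  }

  check() {
    if (this.finalCondition(this.skeleton)) {
      this.emit("gesture", this.name); // final condition met: trigger an event
      this.reset();
    } else if (!this.stillCondition(this.skeleton)) {
      this.reset(); // condition broken: abandon this attempt
    }
  }

  reset() {
    clearInterval(this.timer);
    this.timer = null;
  }
}

// Other classes define the actual gestures by supplying the three conditions
// (joint names and thresholds here are invented for illustration).
class RaiseRightHand extends GestureBase {
  initialCondition(s) { return s.right_hand.y < s.right_shoulder.y; }
  stillCondition(s)   { return s.right_hand.x > s.torso.x; }
  finalCondition(s)   { return s.right_hand.y > s.head.y; }
}
```
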
  21. Robocalypse Demo: Let’s look at some code!
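
The actual demo code is not reproduced here; the sketch below only shows the kind of wiring it implies, with gesture events driving a Johnny-Five robot. The motor pins and the gesture-to-action mapping are invented for illustration:

```javascript
// Hypothetical wiring: gesture events drive a Johnny-Five robot.
const EventEmitter = require("events");
const five = require("johnny-five");

const gestures = new EventEmitter(); // stand-in for the gesture tracker
const board = new five.Board();

board.on("ready", () => {
  // Two DC motors on assumed pins; adjust to the robot's actual wiring.
  const motors = new five.Motors([
    { pwm: 3, dir: 12 },
    { pwm: 11, dir: 13 },
  ]);

  // Hypothetical gesture names mapped to robot actions.
  gestures.on("raiseRightHand", () => motors.forward(150));
  gestures.on("lowerRightHand", () => motors.stop());
});
```
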

  22. Next few steps
     • Return a more friendly coordinate system for the skeleton
     • Add more information to the returned events
     • Add more gestures (jump, wave, kick)
  23. Questions?

  24. Thank you. @joel__lord joellord

  25. macadamian.com