This talk is a technical deep dive into the world of gesture events. I'll introduce the work that has been done and is currently being done in YUI Gestures, and show how you can use gesture events to create multi-device user experiences.
Gesture events abstract pointer input: contact with the screen made by a mouse cursor, pen, touch (including multi-touch), or other pointing input device. This model makes it easy to write sites and applications that work well no matter what hardware the user has.
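Without such an abstraction, application code has to normalize mouse and touch input itself. A minimal sketch of that normalization, assuming a hypothetical `getPoint` helper (this is not a YUI API):

```javascript
// Normalize a mouse or touch event into a single {x, y} point.
// Touch events carry coordinates in e.touches[0]; mouse events
// carry pageX/pageY directly on the event object.
function getPoint(e) {
  var p = e.touches ? e.touches[0] : e;
  return { x: p.pageX, y: p.pageY };
}

// Works for both input models:
getPoint({ pageX: 10, pageY: 20 });               // mouse-style event
getPoint({ touches: [{ pageX: 5, pageY: 6 }] });  // touch-style event
```

With YUI's gesture events (the gesturemove* family), this normalization happens inside the library, so a single listener covers every device.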
Available Today:
- flick: sub-events flickup, flickdown, flickleft, flickright; event data: direction, velocity
- gesturemove* (PULL #1323): Dual Listener Support; event data: direction, deltaX, deltaY

NEW APIs:
- move (Pull #1309, See Demo): sub-events moveup, movedown, moveleft, moveright; ability to lock axis; event data: direction, deltaX, deltaY
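As an illustration of the kind of data these events carry, here is a sketch of how a flick's direction and velocity can be derived from a start point, an end point, and the elapsed time. `flickInfo` is a hypothetical helper for illustration, not part of the YUI API:

```javascript
// Derive flick direction and velocity from start/end coordinates
// and elapsed time in milliseconds. The dominant axis decides
// whether the flick is horizontal or vertical.
function flickInfo(x1, y1, x2, y2, elapsedMs) {
  var dx = x2 - x1, dy = y2 - y1;
  var horizontal = Math.abs(dx) >= Math.abs(dy);
  var distance = horizontal ? dx : dy;
  return {
    direction: horizontal ? (dx < 0 ? "flickleft" : "flickright")
                          : (dy < 0 ? "flickup" : "flickdown"),
    velocity: distance / elapsedMs  // px per millisecond, signed
  };
}

flickInfo(100, 0, 0, 0, 100);  // { direction: "flickleft", velocity: -1 }
```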
1. Attach gesturemovestart, gesturemove, gesturemoveend, and flick listeners
2. Store the [x1,y1] of gesturemovestart
3. Compare [x1,y1] to [x2,y2] from gesturemove to determine direction + delta
4. Compare [x1,y1] to [x2,y2] from gesturemoveend to determine direction + delta
5. In callbacks, branch logic based on direction
6. Make sure these callbacks do not fire twice on devices that support both mouse and touch, yet still work with either input
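The bookkeeping in steps 2 through 4 can be sketched as a small tracker. `createGestureTracker` is a hypothetical helper shown with plain coordinates rather than real event objects:

```javascript
// Tracks a gesture from start to end, reporting direction + delta
// on every move, mirroring the manual steps above.
function createGestureTracker() {
  var start = null;

  function compare(x2, y2) {
    var deltaX = x2 - start[0], deltaY = y2 - start[1];
    var direction = Math.abs(deltaX) >= Math.abs(deltaY)
      ? (deltaX < 0 ? "left" : "right")
      : (deltaY < 0 ? "up" : "down");
    return { direction: direction, deltaX: deltaX, deltaY: deltaY };
  }

  return {
    // Step 2: store [x1, y1] on gesturemovestart
    moveStart: function (x, y) { start = [x, y]; },
    // Step 3: compare against [x2, y2] from gesturemove
    move: compare,
    // Step 4: same comparison for gesturemoveend
    moveEnd: compare
  };
}

var t = createGestureTracker();
t.moveStart(50, 50);
t.move(20, 55);  // { direction: "left", deltaX: -30, deltaY: 5 }
```

Step 6, deduplicating mouse and touch on hybrid devices, is the part that is genuinely hard to get right by hand, which is what motivates baking it into the library.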
1. Listen for move, flick, and the gesturemoveend event
2. Inside the move callback, get info with e.drag.direction and e.drag.deltaX
3. Transition to previous and next panels with flickleft and flickright
4. Inside the gesturemoveend callback, get info with e.gesture.deltaX and e.gesture.dirX to decide whether or not to transition panels
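The decision in step 4, whether a drag ended far enough to warrant a transition, can be isolated as a pure function. `nextPanelIndex` and the 100px threshold are assumptions for illustration, not part of the proposed API:

```javascript
// Decide which panel to show after gesturemoveend, given the
// horizontal delta reported on the event facade (e.g. e.gesture.deltaX
// in the proposed API). A drag shorter than the threshold snaps back.
function nextPanelIndex(current, deltaX, threshold) {
  if (deltaX <= -threshold) { return current + 1; } // dragged left: next panel
  if (deltaX >=  threshold) { return current - 1; } // dragged right: previous panel
  return current;                                   // not far enough: stay put
}

nextPanelIndex(2, -150, 100);  // 3: advance to the next panel
nextPanelIndex(2,   40, 100);  // 2: snap back to the current panel
```

Keeping this logic out of the event callbacks makes it easy to unit-test the panel behavior without simulating gestures at all.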