
Real-Life Activity Recognition - Focus on Recognizing Reading Activities

Kai
August 23, 2013


CBDAR 2013 Keynote http://imlab.jp/cbdar2013/index.shtml#keynote

Most applications in intelligent environments so far strongly rely on specific sensor combinations at predefined positions, orientations etc. While this might be acceptable for some application domains (e.g. industry), it hinders the wide adoption of pervasive computing. How can we extract high level information about human actions and complex real world situations from heterogeneous ensembles of simple, often unreliable sensors embedded in commodity devices?

This talk mostly focuses on how to use body-worn devices for activity recognition in general, and how to combine them with infrastructure sensing and computer vision approaches for a specific high level human activity, namely better understanding knowledge acquisition (e.g. recognizing reading activities).

We discuss how placement variations of electronic appliances carried by the user influence the possibility of using sensors integrated in those appliances for human activity recognition. I categorize possible variations into four classes: environmental placements, placement on different body parts (e.g. jacket pocket on the chest vs. a hip holster vs. the trousers pocket), small displacement within a given coarse location (e.g. a device shifting in a pocket), and different orientations. For each of these variations, I give an overview of our efforts to deal with them.

In the second part of the talk, we combine several pervasive sensing approaches (computer vision, motion-based activity recognition etc.) to tackle the problem of recognizing and classifying knowledge acquisition tasks with a special focus on reading. We discuss which sensing modalities can be used for digital and offline reading recognition, as well as how to combine them dynamically.

Transcript

  1. Real-Life Activity Recognition - Focus on Recognizing Reading Activities. Kai Kunze, Osaka Prefecture University. Friday, 23 August 13
  2. Overview. Computers are Everywhere: How can they help us? Application Scenarios. Problems in Activity Recognition. Some Predictions. Towards “User-Driven” Document Analysis. Detecting Reading Activities.
  3. The Computer for the 21st Century. One of the first to coin the idea of ubiquitous/pervasive computing was Mark Weiser in a visionary article. “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” “...Hundreds of computers in a room could seem intimidating at first, [...] these hundreds of computers will come to be invisible to common awareness. People will simply use them unconsciously to accomplish everyday tasks.” -Mark Weiser, PARC, September 1991
  4. With the computers surrounding us in everyday life, the performance bottleneck is human attention. For computing to become invisible, the interface needs to vanish. Computing needs to become proactive: Context-Aware Systems.
  5. Activity Recognition. Research by: Georg Ogris, Thomas Stiefmaier.
  6. Applications: production and maintenance, sport/leisure, assisted living scenarios, emergency response, healthcare.
  7. Applications.
  8. Applications.
  9. A Zeiss HMD Study. Kunze, K., Wagner, F., Kartal, E., Morales Kluge, E., and Lukowicz, P. Does Context Matter? A Quantitative Evaluation in a Real World Maintenance Scenario. Proceedings of the 7th International Conference on Pervasive Computing, Nara, Japan, May 11-14, 2009. Applications.
  10. State of the Art: well-defined, application-specific sensor setups.
  11. Problems. Can you imagine your grandmother wearing this?
  12. Problems. Can you imagine your grandmother wearing this? Maybe, ... if you are a PhD student in Wearable Computing.
  13. Real-Life Activity Recognition. Cables and bulky sensors will vanish. What about the battery?
  14. Real-Life Activity Recognition. Cables and bulky sensors will vanish; the same holds for the issue of battery life. However, can you imagine your grandma putting on 10 well-defined sensors in well-defined places every day? ... and changing them for every situation?
  15. Have a system self-configure to existing sensors and self-adapt to changes, rather than design a sensing system for a specific application.
  16. • sensor self-description, especially with respect to dynamic changes • system self-correction, especially with respect to deterioration • system self-configuration
  17. Real World Devices. • sensor self-description, especially with respect to dynamic changes • system self-correction, especially with respect to deterioration • system self-configuration
  18. Recognizing On-body Placement. K. Kunze and P. Lukowicz. Using acceleration signatures from everyday activities for on-body device location. 11th IEEE International Symposium on Wearable Computers, Sep 2007. K. Kunze, P. Lukowicz, H. Junker, and G. Troester. Where am I: Recognizing on-body positions of wearable sensors. LoCA 2005: International Workshop on Location and Context Awareness, 2005.
  19. Walking: lower arm vs. trouser pocket.
  20. On-body Recognition. Acceleration features: std, mean, FFT center of mass, duration of rest period. Gyro features: PCA angle, frequency range power. Both: sum of the differences in variance per axis, normalized by the variance of the norm: (1/2) Σ_{i=1}^{n} Σ_{j<i} |var(a_i) - var(a_j)| / var(norm). Pipeline: acceleration and angular velocity → calibration (rest period? no/yes) → feature extraction → particle filter smoothing → HMM classification. Features are computed on segments with enough activity; if the variance tends towards zero and the magnitude towards 9.81, the segment is treated as a rest period and used for one-axis vertical calibration as described by Mizell, making the features robust against shifts and displacements within a body part.
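The feature set on this slide can be sketched in a few lines of Python. Here `window` is a hypothetical (n_samples, 3) array of raw accelerometer axes and `fs` the sampling rate; both names, and the exact windowing, are illustrative rather than taken from the talk:

```python
import numpy as np

def acc_features(window, fs=100.0):
    """Features from one window of 3-axis accelerometer data:
    per-axis std and mean, FFT center of mass of the magnitude signal,
    and the normalized sum of pairwise variance differences."""
    window = np.asarray(window, dtype=float)
    std = window.std(axis=0)
    mean = window.mean(axis=0)

    # FFT center of mass: frequency-weighted centroid of the magnitude spectrum
    norm = np.linalg.norm(window, axis=1)
    spec = np.abs(np.fft.rfft(norm - norm.mean()))
    freqs = np.fft.rfftfreq(len(norm), d=1.0 / fs)
    fft_com = (freqs * spec).sum() / spec.sum()

    # (1/2) sum_i sum_{j<i} |var(a_i) - var(a_j)| / var(norm)
    var = window.var(axis=0)
    pairs = [(0, 1), (0, 2), (1, 2)]
    var_diff = 0.5 * sum(abs(var[i] - var[j]) for i, j in pairs) / norm.var()
    return std, mean, fft_com, var_diff
```

In a full pipeline these per-window features would feed the HMM classifier described on the slide.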
  21. Experimental Evaluation. 5 data sets, from house work to bicycle repair. 3 to 7 participants per data set. 1 real-life data set. Age range 17 to over 60. 4-5 on-body placements. Around 30 hours of sensor data.
  22. Results. Randomly switching location about every 10 min (>100 hours of data from 5 experiments and >10 subjects).
  23. New Sensors. New sensors may appear frequently. (How) can they be integrated into a recognition system without any user input?
  24. Abstract Features. Example: rate of rotation can be derived from gyros as well as from magnetic field sensors. Bahle, Kunze. ISWC 2010. Gyroscope / Compass.
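A minimal sketch of this abstraction, assuming a gyroscope z-axis in rad/s and magnetic headings in radians sampled at `fs` Hz (variable names are illustrative): both sensors expose the same abstract "rate of rotation" feature.

```python
import numpy as np

def rotation_rate_from_gyro(gyro_z):
    # A gyroscope measures angular velocity directly.
    return np.asarray(gyro_z, dtype=float)

def rotation_rate_from_compass(heading, fs):
    """Derive angular velocity from successive magnetic headings.
    np.unwrap removes the 2*pi jumps where the heading wraps around."""
    h = np.unwrap(np.asarray(heading, dtype=float))
    return np.diff(h) * fs  # rad/s between consecutive samples
```

A recognition system can then bind whichever sensor is currently available to the same abstract feature.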
  25. Abstract Features. A 3D camera can be used to derive acceleration at particular body locations. Gernot Bahle, Paul Lukowicz, Kai Kunze, Koichi Kise. I see you: How to improve wearable activity recognition by leveraging information from environmental cameras. PerCom, San Diego, 2013. Best Work in Progress Paper.
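One plausible way to obtain such camera-derived acceleration (a sketch under simple assumptions, not the method from the paper) is a second finite difference of a tracked 3D joint position over the camera frame rate:

```python
import numpy as np

def acceleration_from_positions(positions, fps):
    """Second finite difference of tracked 3D positions -> acceleration.
    positions: (n_frames, 3) array in metres; fps: camera frame rate.
    Returns an (n_frames - 2, 3) array in m/s^2."""
    p = np.asarray(positions, dtype=float)
    dt = 1.0 / fps
    return np.diff(p, n=2, axis=0) / dt**2
```

In practice the position track would need smoothing before differentiation, since double differencing amplifies tracking noise.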
  26. The Need for Open Data Sets + Tools. ETH material: 1. 5x Xsens (according to experiment description). 2. Xbus: S/N#: 00130157 (incl. serial cable + USB converter). 3. 5x Xsens strap. 4. 5x Xsens cable. 5. Motion jacket.
  27. The Opportunity Dataset. http://contextdb.org
  28. CRN Toolbox. Bannach, Lukowicz et al. IEEE Pervasive Magazine 2008, ARCS 2006. • sensor abstraction • algorithm abstraction • auto synchronization • service discovery. http://crnt.sourceforge.net/
  29. Data Recording. https://redmine.wearcom.org/projects/mass
  30. Data Management, Annotation, Sharing.
  31. Classifier Training, On-line Context Recognition.
  32. Dynamic Discovery / Usage: dynamic binding.
  33. [Smartphone sensor diagram: microprocessor, compass, Bluetooth, wireless, accelerometer, gyroscope, camera.] Predictions for 2015-2020: 1. complex activities 2. accuracy well over 95% 3. real people in real environments, not students in the lab!
  34. So What's Next? People are communicating more and more digitally; the default will be public. How do you learn today? Use of YouTube, iFixit etc. Increase in experience sharing; the next step is live video streams. Fundamental Changes to Knowledge and Learning.
  35. Towards “User-centered” Document Analysis ... Traditionally, Document Analysis focuses on the object: documents as a structured source of information.
  36. Not completely, ... there are some exceptions, especially in relation to historic documents and handwriting recognition, on topics related to identity (who wrote it) and authenticity (is it the right person at the right time). What if we broaden the topics and change our focus?
  37. “User-centered” Document Analysis. Let's focus on the readers. 1. Analyze the document through the users: What did they read? How fast? How often? How much do they understand? 2. Analyze the users through the documents.
  38. Documents. Similar to Kindle Highlights. Imagine annotating documents with: “x readers stopped reading after this sentence.” “Most readers felt sad after this paragraph.” “The reader never saw this warning label.” ...
  39. More insights about the users. What do these documents tell you about my activity?
  40. “User-centered” Document Analysis. Let's focus on the readers. 1. Analyze the document through the users: What did they read? How fast? How much? How much do they understand? 2. Analyze the users through the documents.
  41. What are users reading? ... using mobile eye tracking: using eye gaze to distinguish document types. Kai Kunze, Andreas Bulling, Yuzuko Utsumi, Koichi Kise. I know what you are reading - Recognition of document types using mobile eye tracking. ISWC 2013, Zurich. 72% user-independent recognition (10 users, 5 document types, 5 environments).
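One plausible gaze feature for this kind of document-type classification is a histogram of saccade directions (reading a novel, a manga, or a newspaper produces different direction distributions). This sketch is an illustration, not the exact feature set used in the paper:

```python
import numpy as np

def saccade_direction_histogram(gaze_xy, n_bins=8):
    """Normalized histogram of movement directions in a gaze trace.
    gaze_xy: (n, 2) fixation/gaze coordinates in screen pixels."""
    d = np.diff(np.asarray(gaze_xy, dtype=float), axis=0)
    angles = np.arctan2(d[:, 1], d[:, 0])  # direction of each saccade, -pi..pi
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    return hist / max(hist.sum(), 1)
```

Such per-window histograms could then be fed to any standard classifier for user-independent document-type recognition.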
  42. ... using EEG. Detecting different document types is difficult ... reading detection/segmentation works for a small set of test users (3); user-dependent accuracy close to 100%. Participant 1 / participant 2: Textbook, Newspaper, Manga. K. Kunze, Y. Shiga, S. Ishimaru, Y. Utsumi, K. Kise. Reading activity recognition using an off-the-shelf EEG - detecting reading activities and distinguishing genres of documents. Accepted at ICDAR, Washington D.C., 2013.
  43. “User-centered” Document Analysis. Let's focus on the readers. 1. Analyze the document through the users: What did they read? How fast? How much? How much do they understand? 2. Analyze the users through the documents.
  44. The Wordometer. Experimental setup: gaze overlaid on the document using document image retrieval. Estimation error around 7%. K. Kunze, H. Kawaichi, K. Yoshimura, K. Kise. The Wordometer - Estimating the Number of Words Read Using Document Image Retrieval and Mobile Eye Tracking. ICDAR 2013.
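The core idea can be caricatured in a few lines: detect line breaks as large leftward jumps of the gaze x-coordinate and scale by an average words-per-line estimate. Both the jump threshold and the words-per-line value below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def estimate_words_read(gaze_x, words_per_line=10.0, jump_threshold=-200.0):
    """Count line breaks as large leftward saccades in the gaze
    x-coordinate (pixels), then scale by average words per line."""
    dx = np.diff(np.asarray(gaze_x, dtype=float))
    line_breaks = int((dx < jump_threshold).sum())
    return (line_breaks + 1) * words_per_line  # +1 for the line being read
```

The actual Wordometer grounds this in document image retrieval, so the words-per-line value comes from the retrieved page rather than a fixed constant.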
  45. “User-centered” Document Analysis. Let's focus on the readers. 1. Analyze the document through the users: What did they read? How fast? How much? How much do they understand? 2. Analyze the users through the documents.
  46. How much do users understand? K. Kunze, H. Kawaichi, K. Yoshimura, K. Kise. Towards inferring language expertise using eye tracking. Accepted as Work in Progress at the ACM SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 2013.
  47. “Difficult Word” Detection. Eye gaze translated to the document coordinate system using LLAH. Horizontal projection to a line histogram.
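The horizontal-projection step can be sketched as binning gaze samples (already mapped to document coordinates) along the y-axis, so that peaks in the resulting histogram correspond to text lines. This is a simplified stand-in; the LLAH-based gaze-to-document mapping itself is not shown, and the bin size is an assumption:

```python
import numpy as np

def line_histogram(gaze_doc_xy, page_height, bin_px=20):
    """Project document-coordinate gaze points onto the y-axis:
    one bin per bin_px of page height -> samples spent per text line."""
    y = np.asarray(gaze_doc_xy, dtype=float)[:, 1]
    n_bins = int(np.ceil(page_height / bin_px))
    hist, _ = np.histogram(y, bins=n_bins, range=(0, page_height))
    return hist
```

Lines (or words) where the reader dwells unusually long then show up as outlier bins, a plausible cue for difficulty.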
  48. Questions, Remarks, Violent Dissent? http://kaikunze.de twitter: @k_garten facebook: kai.kunze app.net: @kkai kai.kunze@gmail.com https://github.com/kkai/snsrlog