data at intervals of 125 ms (8 Hz).
• The raw data output includes 14 EEG values (7 channels on each brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values for the acceleration of the head when leaning (gyrox and gyroy).
• The Affectiv suite reports 5 emotions: engagement, boredom, excitement, frustration, and meditation.
• The Expressiv suite reports facial gestures: blink, wink (left and right), look (left and right), raise brow, furrow brow, smile, clench, smirk (left and right), and laugh.
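For illustration, one raw sample could be structured as in the following minimal sketch. The channel ordering follows the list above, but the function and field names are our own and not part of the Emotiv SDK:

```python
# Channel names in the order listed above (CMS/DRL nomenclature).
CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

def parse_raw_sample(values):
    """Map one raw sample (14 channel values followed by gyrox and gyroy,
    arriving every 125 ms) to a dictionary keyed by field name."""
    assert len(values) == len(CHANNELS) + 2
    sample = dict(zip(CHANNELS, values[:14]))
    sample["gyrox"], sample["gyroy"] = values[14], values[15]
    return sample
```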
CMS/DRL configuration [6][7]
[6] F. Sharbrough, G.-E. Chatrian, R. P. Lesser, H. Lüders, M. Nuwer, and T. W. Picton, "American Electroencephalographic Society Guidelines for Standard Electrode Position Nomenclature," J. Clin. Neurophysiol., vol. 8, 1991, pp. 200-202.
[7] "Electroencephalography," Electric and Magnetic Measurement of the Electric Activity of Neural Tissue, www.bem.fi/book/13/13.htm (retrieved November 14, 2010).
(date and time) of the computer running the system. It can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y - year, m - month, d - day, h - hour, m - minutes, s - seconds, S - milliseconds).
UserID: Identifies the user. An integer value.
Wireless Signal Status: Shows the strength of the signal. The value ranges from 0 to 4, 4 being the best.
Blink, Wink Left and Right, Look Left and Right, Raise Brow, Furrow, Smile, Clench, Smirk Left and Right, Laugh: Part of the Expressiv suite. Values between 0 and 1, 1 representing the highest power/probability for that gesture.
Short Term and Long Term Excitement, Engagement/Boredom, Meditation, Frustration: Part of the Affectiv suite. Values between 0 and 1, 1 representing the highest power/probability for that emotion.
AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4: Raw data coming from each of the 14 channels. These field names were defined according to the CMS/DRL configuration [6][7]. Values of 4000 and higher.
GyroX and GyroY: Information about how the head moves/accelerates along the X and Y axes, respectively.
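Since every stream carries the same "yymmddhhmmssSSS" timestamp, it can be parsed once and reused for synchronization. A minimal sketch in Python (the function name is ours):

```python
from datetime import datetime

def parse_timestamp(ts: str) -> datetime:
    """Parse "yymmddhhmmssSSS"; %f absorbs the trailing millisecond digits."""
    return datetime.strptime(ts, "%y%m%d%H%M%S%f")

# Example: November 14, 2010, 15:30:25.123
print(parse_timestamp("101114153025123"))  # 2010-11-14 15:30:25.123000
```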
infer affective states by capturing images of the users' facial expressions and head movements.
• We are going to show the capabilities of face-based emotion recognition systems using a simple 30 fps USB webcam and software from the MIT Media Lab [8].
[8] R. el Kaliouby and P. Robinson, "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures," Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '04), vol. 10, IEEE Computer Society, June 2004, p. 154.
analysis, tagging, and inference of cognitive-affective mental states from facial video in real time.
§ This framework combines vision-based processing of the face with predictions from mental state models to interpret the meaning underlying head and facial signals over time.
§ It provides results at intervals of approximately 100 ms (10 Hz).
§ With this system it is possible to infer mental states such as: agreeing, concentrating, disagreeing, interested, thinking, and unsure.
§ It builds on the Facial Action Coding System (Ekman and Friesen, 1978): 46 action units, plus head movements.
(date and time) of the computer running the system. It can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y - year, m - month, d - day, h - hour, m - minutes, s - seconds, S - milliseconds).
Agreement, Concentrating, Disagreement, Interested, Thinking, Unsure: The probability that the given mental state is present in the user at a particular time (frame). The value is between 0 and 1. A value of -1 means it was not possible to infer the mental state; this happens when the user's face is out of the camera's focus.
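When consuming these per-frame values, the -1 sentinel has to be kept apart from genuine low probabilities. A small sketch of one way to do this (field names are assumptions, not the tool's actual output labels):

```python
def state_or_none(value: float):
    """Return a probability in [0, 1], or None when the system reports -1
    (nothing could be inferred, e.g. the face left the camera focus)."""
    return None if value == -1 else value

frame = {"Agreement": 0.12, "Thinking": 0.81, "Unsure": -1}
clean = {name: state_or_none(v) for name, v in frame.items()}
print(clean)  # {'Agreement': 0.12, 'Thinking': 0.81, 'Unsure': None}
```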
reports data at intervals of 100 ms (10 Hz).
• The output provides data concerning attention direction (gaze-x, gaze-y), duration of fixation, and pupil dilation.
timestamp (date and time) of the computer running the system. It can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y - year, m - month, d - day, h - hour, m - minutes, s - seconds, S - milliseconds).
GazePoint X: The horizontal screen position for either eye, or the average for both eyes. This value is also used for the fixation definition. 0 is the left edge; the maximum value of the horizontal screen resolution is the right edge.
GazePoint Y: The vertical screen position for either eye, or the average for both eyes. This value is also used for the fixation definition. 0 is the bottom edge; the maximum value of the vertical screen resolution is the top edge.
Pupil Left: Pupil size (left eye) in mm. Varies.
Validity Left: Validity of the gaze data, from 0 to 4. 0 if the eye was found and the tracking quality is good; 4 if the eye could not be found by the eye tracker.
Pupil Right: Pupil size (right eye) in mm. Varies.
Validity Right: Validity of the gaze data, from 0 to 4. 0 if the eye was found and the tracking quality is good; 4 if the eye could not be found by the eye tracker.
FixationDuration: The time in milliseconds that a fixation lasts. Varies.
Event: Events, automatic and logged, show up under Event. Varies.
AOI: Areas Of Interest, if fixations on multiple AOIs are to be written on the same row. Varies.
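The validity codes are what make the pupil and gaze fields safe to use: a sample is typically kept only when both eyes report code 0. A sketch under that assumption (the dictionary keys are ours, not the tracker's column names):

```python
def usable(sample: dict) -> bool:
    """Keep a sample only when both eyes were found with good tracking
    (validity 0); 4 means the eye tracker could not find the eye."""
    return sample["validity_left"] == 0 and sample["validity_right"] == 0

def mean_pupil_mm(sample: dict) -> float:
    """Average pupil diameter in mm across both eyes."""
    return (sample["pupil_left"] + sample["pupil_right"]) / 2.0

samples = [
    {"validity_left": 0, "validity_right": 0, "pupil_left": 3.0, "pupil_right": 3.2},
    {"validity_left": 4, "validity_right": 0, "pupil_left": 0.0, "pupil_right": 3.1},
]
print([mean_pupil_mm(s) for s in samples if usable(s)])  # [3.1] (second sample dropped)
```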
engagement fixation points of an expert player playing in expert mode. The size of the circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
frustration fixation points of an expert player playing in expert mode. The size of the circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
boredom fixation points of an expert player playing in expert mode. The size of the circle represents the duration of the fixation at that point, while the level of shading represents the intensity of the emotion.
engagement gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons). For this user, the text on the bottom part of the first column was engaging.
frustration gaze points (above a threshold of 0.6) of a user reading material with seductive details (i.e., cartoons). Looking at the cartoon is associated with a high frustration level.
boredom gaze points (above a threshold of 0.5) of a user reading material with seductive details (i.e., cartoons). Notice that the text in the middle part of the second column of that page was boring.
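The figures above all use the same encoding: circle size for fixation duration, shading for emotion intensity, and a per-emotion threshold selecting which points to draw. A sketch of how such a plot could be produced (the data layout is an assumption, not the tools' actual export format):

```python
import matplotlib.pyplot as plt

def plot_affective_gaze(points, threshold=0.6):
    """Scatter gaze/fixation points in screen coordinates. Marker area
    encodes fixation duration (ms); grey level encodes emotion intensity.
    Only points whose emotion value exceeds the threshold are drawn."""
    kept = [p for p in points if p["emotion"] > threshold]
    plt.scatter(
        [p["x"] for p in kept],
        [p["y"] for p in kept],
        s=[p["duration_ms"] for p in kept],   # bigger circle = longer fixation
        c=[p["emotion"] for p in kept],       # darker shade = stronger emotion
        cmap="Greys", vmin=0.0, vmax=1.0, alpha=0.8,
    )
    plt.colorbar(label="emotion intensity")
    plt.show()

plot_affective_gaze([
    {"x": 120, "y": 640, "duration_ms": 300, "emotion": 0.9},
    {"x": 400, "y": 200, "duration_ms": 150, "emotion": 0.4},  # below threshold
])
```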
physiological functions of an individual.
• Arousal detection: measures the electrical conductance of the skin, which varies with its moisture level; moisture depends on the sweat glands, which are controlled by the sympathetic nervous system [10].
• Hardware designed by the MIT Media Lab.
• It is a wireless Bluetooth device that reports data at intervals of approximately 500 ms (2 Hz).
[10] M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R. W. Picard, "The HandWave Bluetooth Skin Conductance Sensor," Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII 05), Springer-Verlag, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90.
timestamp (date and time) of the computer running the system. It can be used to synchronize the data with other sensors. Format "yymmddhhmmssSSS" (y - year, m - month, d - day, h - hour, m - minutes, s - seconds, S - milliseconds).
Battery Voltage: Level of the battery voltage. 0 - 3 Volts.
Conductance: Level of arousal. 0 - 3 Volts.
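Because every sensor logs the same wall-clock timestamp format, a slow stream such as this 2 Hz conductance signal can be aligned with the faster ones (8 Hz EEG, 10 Hz face and gaze data) by a nearest-timestamp lookup. A minimal sketch, assuming each stream is a time-sorted list of (datetime, value) pairs:

```python
from bisect import bisect_left
from datetime import datetime, timedelta

def nearest(stream, t):
    """Return the value from a time-sorted [(datetime, value), ...] stream
    whose timestamp is closest to t."""
    times = [ts for ts, _ in stream]
    i = bisect_left(times, t)
    candidates = stream[max(i - 1, 0):i + 1]   # neighbors straddling t
    return min(candidates, key=lambda p: abs(p[0] - t))[1]

# Four conductance samples, 500 ms apart; look up the reading nearest +620 ms.
t0 = datetime(2010, 11, 14, 15, 30, 25)
scl = [(t0 + timedelta(milliseconds=500 * k), 1.0 + 0.1 * k) for k in range(4)]
print(nearest(scl, t0 + timedelta(milliseconds=620)))  # 1.1 (sample at +500 ms)
```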