Andreas Bulling
Enkelejda Kasneci
Christian Lander
Eye tracking and gaze-based human-computer interfaces have become an established modality in desktop settings now that remote eye tracking is affordable. However, with the growing interest in wearable devices (e.g., smartphones, smartwatches, and eyewear) and the maturation of AI methods (e.g., machine learning), mobile eye tracking and eye-based interaction are becoming increasingly important. PETMEI 2018 focuses on the pervasive eye tracking paradigm as a trailblazer for taking eye tracking into the wild, to realistic and pervasive settings. With respect to the key research challenges, new applications, and their implications for pervasive eye tracking in ubiquitous computing, we aim to stimulate and explore the creativity of these communities. The long-term goal is to create a strong interdisciplinary research community linking these fields and to establish the workshop as the premier forum for research on pervasive eye tracking.
|Friday, June 15, 2018 (16:00-18:00) and Saturday, June 16, 2018 (10:30 - 12:30), Room S305|
|Session 1 (Room S305)||Session Chair: Thiago Santini|
|June 15, 16:00-18:00|
Introducing I2Head Database
Ion Martinikorena, Rafael Cabeza, Arantxa Villanueva, Sonia Porta
Making stand-alone PS-OG technology tolerant to the equipment shifts
Raimondas Zemblys, Oleg Komogortsev
Crowdsourcing pupil annotation datasets: boundary vs. center, what performs better?
David Gil de Gómez Pérez, Matti Suokas, Roman Bednarik
|Session 2 (Room S305)||Session Chair: Enkelejda Kasneci|
|June 16, 10:30-12:30|
Gaze-based Interest Detection on Newspaper Articles
Soumy Jacob, Shoya Ishimaru, Syed Saqib Bukhari, Andreas Dengel
The Art of Pervasive Eye Tracking: Unconstrained Eye Tracking in the Austrian Gallery Belvedere
Thiago Santini, Hanna Brinkmann, Luise Reitstätter, Helmut Leder, Raphael Rosenberg, Wolfgang Rosenstiel, Enkelejda Kasneci
Eye tracking in naturalistic badminton play – comparing visual gaze pattern strategy of professional and amateur players
Nithiya Shree Uppara, Aditi Ashutosh Mavalankar, Kavita Vemuri
PETMEI '18: Proceedings of the 7th Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (full citation in the ACM Digital Library)
The I2Head database has been created with the aim of becoming a reference for low-cost gaze estimation. It exhibits the following characteristics: it takes into account key aspects of low-resolution eye tracking technology; it combines images of users gazing at different grids of points from alternative positions with records of the user's head position; and it provides calibration information for the camera and a simple 3D head model for each user. The hardware used to build the database includes a 6D magnetic sensor and a webcam. A careful calibration method between the sensor and the camera has been developed to guarantee the accuracy of the data. Several sessions have been recorded for each user, covering not only static head scenarios but also controlled displacements and even free head movements. The database is a valuable framework for testing both gaze estimation algorithms and head pose estimation methods.
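The abstract does not detail the sensor-camera calibration, but a common way to align a magnetic sensor's coordinate frame with a camera frame is a least-squares rigid alignment of corresponding 3-D points. A minimal sketch (the function name and procedure are illustrative assumptions, not the database's actual method):

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with R @ P_i + t ~ Q_i.

    Kabsch-style alignment of corresponding 3-D points, one plausible way
    to calibrate between a magnetic-sensor frame and a camera frame.
    """
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                   # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

Given a handful of points measured in both frames, the recovered (R, t) maps sensor-frame coordinates into the camera frame.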
Tracking users' gaze in virtual reality headsets allows natural and intuitive interaction with virtual avatars and virtual objects. Moreover, a technique known as foveated rendering can help save computational resources and enable high-resolution but lightweight virtual reality technologies. Predominantly, eye-tracking hardware in modern VR headsets consists of infrared camera(s) and LEDs. Such hardware, together with image processing software, consumes a substantial amount of energy and, when high-speed gaze detection is needed, can be very expensive. A promising technique to overcome these issues is photo-sensor oculography (PS-OG), which allows eye tracking with a high sampling rate and low power consumption. However, the main limitation of previous PS-OG systems is their inability to compensate for equipment shifts. In this study, we employ a simple multi-layer perceptron neural network to map raw sensor data to gaze locations and report its performance for shift compensation. Modeling and evaluation are done via a simulation.
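The core idea, a multi-layer perceptron regressing gaze coordinates from raw photo-sensor readings, can be illustrated with a toy simulation. Everything below (sensor count, network size, synthetic data, training details) is a placeholder, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for PS-OG data: 8 photo-sensor readings per sample,
# generated from a hidden nonlinear mix of the true 2-D gaze location.
n_samples, n_sensors = 500, 8
gaze = rng.uniform(-1, 1, size=(n_samples, 2))            # (x, y) gaze targets
mix = rng.normal(size=(2, n_sensors))
sensors = np.tanh(gaze @ mix) + 0.01 * rng.normal(size=(n_samples, n_sensors))

# One-hidden-layer perceptron: sensors -> hidden -> (x, y) gaze estimate.
W1 = rng.normal(scale=0.1, size=(n_sensors, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 2));         b2 = np.zeros(2)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr, losses = 0.05, []
for _ in range(300):                          # plain gradient descent on MSE
    h, pred = forward(sensors)
    err = pred - gaze
    losses.append(float((err ** 2).mean()))
    gW2 = h.T @ err / n_samples
    gb2 = err.mean(axis=0)
    gh = (err @ W2.T) * (1 - h ** 2)          # backprop through tanh
    gW1 = sensors.T @ gh / n_samples
    gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

Shift tolerance would then be probed by perturbing the simulated sensor geometry at test time and measuring how gaze error degrades.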
Pupil-related feature detection is one of the most common approaches in the eye-tracking literature and practice. Validation and benchmarking of detection algorithms rely on accurate ground-truth datasets, but creating these is costly. Many approaches have been used to obtain human-based annotations. A recent proposal for collecting these work-intensive data is crowdsourced registration of the pupil center, in which a large number of users each provide a single click to indicate the pupil center [Gil de Gómez Pérez and Bednarik 2018a]. In this paper, we compare this existing approach to a method based on multiple clicks on the boundary of the pupil region, in order to determine which approach performs better. To compare the two methods, a new data collection was performed on the same image database, and several metrics were applied to evaluate the accuracy of each.
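The two aggregation strategies can be sketched as follows. Averaging is the natural way to pool center clicks; for boundary clicks, an algebraic least-squares circle fit is one plausible choice (the paper's exact fitting procedure is not specified here, and pupils are often fit with ellipses instead):

```python
import numpy as np

def center_from_center_clicks(clicks):
    """Pupil center from many single-click annotations: the mean click."""
    return np.asarray(clicks, float).mean(axis=0)

def center_from_boundary_clicks(points):
    """Algebraic (Kasa) least-squares circle fit to boundary clicks.

    Solves x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F); the center is
    (-D/2, -E/2) and the radius sqrt(center.center - F).
    """
    p = np.asarray(points, float)
    A = np.column_stack([p[:, 0], p[:, 1], np.ones(len(p))])
    b = -(p[:, 0] ** 2 + p[:, 1] ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    center = np.array([-D / 2, -E / 2])
    radius = np.sqrt(center @ center - F)
    return center, radius
```

The boundary variant additionally yields a pupil radius, which the single-click approach cannot provide.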
Eye-tracking measures have been used to recognize cognitive states such as mental workload, comprehension, and self-confidence during reading. In this paper, we show how these measures can be used to detect a reader's interest. From the reading behavior of 13 university students on 18 newspaper articles, we extracted features related to fixations, saccades, blinks, and pupil diameters to detect which documents each participant found interesting or uninteresting. We classified their levels of interest into four classes with an accuracy of 44% using eye movements alone, which increased to 62% when a survey about subjective comprehension was included. This research can be incorporated into real-time prediction of a user's interest while reading, improving future designs of human-document interaction.
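A toy sketch of the kind of per-article feature extraction described above. The tuple layout and feature names are illustrative assumptions; the paper's actual feature set (which also covers blinks and more) is richer:

```python
import numpy as np

def gaze_features(fixations):
    """Summary features for one article from a list of fixations.

    Each fixation is (x_px, y_px, duration_ms, pupil_mm); such vectors
    would then feed a classifier over interest levels.
    """
    f = np.asarray(fixations, float)
    xy, dur, pupil = f[:, :2], f[:, 2], f[:, 3]
    sacc = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # inter-fixation amplitude
    return {
        "n_fixations": len(f),
        "mean_fix_dur_ms": float(dur.mean()),
        "total_read_time_ms": float(dur.sum()),
        "mean_pupil_mm": float(pupil.mean()),
        "mean_saccade_px": float(sacc.mean()) if len(sacc) else 0.0,
    }
```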
Pervasive mobile eye tracking provides a rich data source for investigating natural human behavior, offering a high degree of ecological validity in natural environments. However, challenges and limitations intrinsic to unconstrained mobile eye tracking make its development and use to some extent an art. Nonetheless, researchers are pushing the boundaries of this technology to help assess museum visitors' attention not only between the exhibited works, but also within particular pieces, providing significantly more detailed insights than traditional timing-and-tracking or external-observer approaches. In this paper, we present in detail the eye-tracking system developed for a large-scale, fully unconstrained study in the Austrian Gallery Belvedere, providing useful information for eye-tracking system designers. Furthermore, we describe the study and report on usability and real-time performance metrics. Our results suggest that, although the system is comfortable enough, further eye-tracker improvements are necessary to make it less conspicuous. Additionally, real-time accuracy already suffices for simple applications, such as audio guides, for the majority of users, even in the absence of eye-tracker slippage compensation.
Eye tracking in naturalistic badminton play: comparing visual gaze pattern strategy in world-ranked and amateur players
A professional player's expertise rests on the ability to predict actions by optimally extracting the opponent's postural cues. Eye-tracking (head-mounted system) data from naturalistic singles badminton play was collected from one professional world-ranked player facing five amateur players (10 serves or 50 trials) and from two amateurs each playing against four other amateur players (10 serves or 80 trials). The visual gaze on the opponent's body, segregated into three areas of interest covering the opponent's feet, face/torso, and hand/racket, plus the shuttle, was analysed for a) the period just before the serve, b) the period while receiving the serve, and c) the entire rally. The comparative analysis shows that the professional player's first area of interest was the opponent's feet while executing a serve and the hand/racket when receiving one. The amateur players, on the other hand, showed no particular fixation-location strategy either for the serve task or while facing a serve. The average fixation duration (just before a serve) was 0.96 s for the professional and 1.48 s for the amateurs. The findings highlight differences between professional and amateur players in which postural cues are considered important and in preparatory time. We believe analytical models of dynamic gaze behavior in naturalistic game conditions, as applied in this study, can be used to enhance perceptual-cognitive skills during training.
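The per-AOI comparison described above reduces, at its core, to aggregating fixation durations by area of interest and noting which AOI is fixated first. A minimal sketch, assuming fixations arrive as (AOI label, duration) pairs (the labels and data layout are illustrative, not the study's actual pipeline):

```python
from collections import defaultdict

# Hypothetical AOI labels matching the study's segmentation:
# "feet", "face/torso", "hand/racket" of the opponent, plus "shuttle".
def aoi_dwell(fixations):
    """Total dwell time per AOI and the first-fixated AOI.

    fixations: list of (aoi_label, duration_s) in temporal order.
    """
    totals = defaultdict(float)
    for aoi, dur in fixations:
        totals[aoi] += dur
    first_aoi = fixations[0][0] if fixations else None
    return dict(totals), first_aoi
```

Running this separately over the pre-serve, serve-reception, and full-rally windows yields the dwell profiles the study compares between professional and amateur players.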