ACM ETRA 2021 Schedule


(All times are in CEST, Central European Summer Time, i.e. daylight saving time.)


24 May 2021
TIME
15:00 - 22:15
25 May 2021
TIME
TRACK I
TRACK II
10:45 - 11:00
Welcome

Posters & Demos & Videos I (11:00 - 12:00)

Session sponsored by Eyesquare

POSTERS:

Privacy-Preserving Eye Videos using Rubber Sheet Model (2020)
(Aayush Kumar Chaudhary and Jeff B. Pelz)

Repetition effects in task-driven eye movement analyses after longer time-spans (2021)
(Thomas Berger and Michael Raschke)

Saccades, attentional orienting and disengagement: the effects of anodal tDCS over right posterior parietal cortex (PPC) and frontal eye field (FEF) (2021)
(Lorenzo Diana, Patrick Pilastro, Edoardo N. Aiello, Aleksandra K. Eberhard-Moscicka, René M. Müri, and Nadia Bolognini)

Fixation: A universal framework for experimental eye movement research (2021)
(Mostafa Elshamy and Peter Khooshabeh)

Estimation of Visual Attention using Microsaccades in response to Vibrations in the Peripheral Field of Vision (2021)
(Takahiro Ueno and Minoru Nakayama)

Synchronization of Spontaneous Eyeblink during Formula Car Driving (2021)
(Ryota Nishizono, Naoki Saijo, and Makio Kashino)

VIDEO:

• GazeHelp Video Demo: Exploring practical gaze-assisted interactions for graphic design tools
(Ryan Lewien)

• Eye Tracking in Ocular Proton Therapy
(Riccardo Via, Giovanni Fattori, Alessia Pica, Antony Lomax, Guido Baroni, Damien Charles Weber, and Jan Hrbacek)

• Visual Analytics and Annotation of Pervasive Eye Tracking Video
(Kuno Kurzhals, Nils Rodrigues, Maurice Koch, Michael Stoll, Andrés Bruhn, Andreas Bulling, and Daniel Weiskopf)

• Gaze-Adaptive Lenses for Feature-Rich Information Spaces
(Fabian Göbel, Kuno Kurzhals, Victor R. Schinazi, Peter Kiefer, and Martin Raubal)

Sponsor Webinar: iMotions (11:00 - 12:15)



Measuring visual attention, emotional arousal, valence and cognitive processing with the iMotions Platform


Webinar Description: In this sponsor webinar, iMotions will present their software solution for multimodal data collection and signal processing for academic and commercial research on human behavior. Explore how to take your research multimodal within a single platform by synchronizing eye tracking data with physiological and emotional data from biosensor signals such as facial expression analysis, electrocardiogram, electrodermal activity, and electroencephalography. The webinar will feature an in-depth live demo of iMotions covering streamlined data collection, stimulus presentation, and signal processing and analysis in transparent R notebooks, highlighting research applications such as cognitive science, screen-based attentional and reading tasks, and mobile glasses studies. Join the iMotions Discord channel or visit the iMotions sponsor booth to meet us and ask questions!

Full Papers I (12:00 - 13:30)

Methods I

Session sponsored by iMotions

Toward Eye-Tracked Sideline Concussion Assessment in eXtended Reality
(Anderson Schrader, Isabella Gebhart, Drew Garrison, Andrew Duchowski, Martian Lapadatescu, Weiyu Feng, Mahmoud Thabit, Fang Wang, Krzysztof Krejtz, Daniel D Petty)

Visual Search Target Inference in Natural Interaction Settings with Machine Learning
(Michael Barz, Sven Stauden, Daniel Sonntag)

Cognitive Load during Eye-typing
(Tanya Bafna, John Paulin Hansen, Per Bækgaard)

Important Considerations of Data Collection and Curation for Reliable Benchmarking of End-User Eye-Tracking Systems
(Iakov Chernyak, Grigory Chernyak, Jeffrey K. S. Bland, Pierre D. P. Rahier)

Label Likelihood Maximisation: Adapting iris segmentation models using domain adaptation
(Anton Mølbjerg Eskildsen, Dan Witzner Hansen)

13:30 - 14:30
Keynote I (Peter König)
Session sponsored by iMotions

Full Papers II (14:45 - 16:15)

Gaze Analysis and Interaction

Session sponsored by Tobii Pro

Deep semantic gaze embedding and scanpath comparison for expertise classification during OPT viewing
(Nora Castner, Thomas C Kübler, Katharina Scheiter, Juliane Richter, Thérèse Eder, Fabian Huettig, Constanze Keutel, Enkelejda Kasneci)

Analyzing Gaze Behavior Using Object Detection and Unsupervised Clustering
(Pranav Venuprasad, Li Xu, Enoch Huang, Andrew Gilman, Leanne Chukoskie, Pamela Cosman)

A MinHash approach for fast scanpath classification
(David Geisler, Nora Castner, Gjergji Kasneci, Enkelejda Kasneci)

Combining Gaze Estimation and Optical Flow for Pursuits Interaction
(Mihai Bâce, Vincent Becker, Chenyang Wang, Andreas Bulling)

A Survey of Digital Eye Strain in Gaze-Based Interactive Systems
(Teresa Hirzle, Maurice Cordts, Enrico Rukzio, Andreas Bulling)


Full Papers III (16:30 - 18:00)

Applications

Session sponsored by EYEVIDO

Eyes on URLs: Relating Visual Behavior to Safety Decisions (2020)
(Niveta Ramkumar, Vijay Kothari, Caitlin Mills, Ross Koppel, Jim Blythe, Sean Smith, Andrew L Kun)

Selection of Eye-Tracking Stimuli for Prediction by Sparsely Grouped Input Variables for Neural Networks: towards Biomarker Refinement for Autism (2020)
(Beibin Li, Erin Barney, Caitlin Hudac, Nicholas Nuechterlein, Pamela Ventola, Linda Shapiro, Frederick Shic)

Where Do Deep Fakes Look? Synthetic Face Detection via Gaze Tracking
(Ilke Demir, Umur Aybars Ciftci)

HGaze Typing: Head-Gesture Assisted Gaze Typing
(Wenxin Feng, Jiangnan Zou, Andrew Kurauchi, Carlos H Morimoto, Margrit Betke)

Crossed Eyes: Domain Adaptation for Gaze-Based Mind Wandering Models
(Robert Bixler, Sidney D'Mello)

Posters & Demos & Videos II (18:00 - 19:00)

POSTERS:

Faces strongly attract early fixations in naturally sampled real-world stimulus materials (2020)
(Anna Lisa Gert, Benedikt V. Ehinger, Tim C. Kietzmann, and Peter König)

Modelling of Blink-Related Eyelid-Induced Shunting on the Electrooculogram (2021)
(Nathaniel Barbara, Tracey A. Camilleri, and Kenneth P. Camilleri)

Semi-Supervised Learning for Eye Image Segmentation (2021)
(Aayush Kumar Chaudhary, Prashnna K Gyawali, Linwei Wang, and Jeff B Pelz)

EyeLogin - Calibration-free Authentication Method For Public Displays Using Eye Gaze (2021)
(Omair Shahzad Bhatti, Michael Barz, and Daniel Sonntag)


Benefits of temporal information for appearance-based gaze estimation (2020)
(Cristina Palmero Cantariño, Oleg Komogortsev, and Sachin S Talathi)


Towards gaze-based prediction of the intent-to-interact in virtual reality (2021)
(Brendan David-John, Candace Peacock, Ting Zhang, T. Scott Murdison, Hrvoje Benko, and Tanya R. Jonker)

Getting more out of Area of Interest (AOI) analysis with SPLOT (2020)
(Artem Belopolsky)

55 Rides: attention annotated head and gaze data during naturalistic driving (2021)
(Thomas C Kübler, Wolfgang Fuhl, Elena Wagner, and Enkelejda Kasneci)

Pinch, Click, or Dwell: Comparing Different Selection Techniques for Eye-Gaze-Based Pointing in Virtual Reality (2021)
(Aunnoy K Mutasim, Anil Ufuk Batmaz, and Wolfgang Stuerzlinger)

VIDEO:

• GazeHelp Video Demo: Exploring practical gaze-assisted interactions for graphic design tools
(Ryan Lewien)

• Eye Tracking in Ocular Proton Therapy
(Riccardo Via, Giovanni Fattori, Alessia Pica, Antony Lomax, Guido Baroni, Damien Charles Weber, and Jan Hrbacek)

• Visual Analytics and Annotation of Pervasive Eye Tracking Video
(Kuno Kurzhals, Nils Rodrigues, Maurice Koch, Michael Stoll, Andrés Bruhn, Andreas Bulling, and Daniel Weiskopf)

• Gaze-Adaptive Lenses for Feature-Rich Information Spaces
(Fabian Göbel, Kuno Kurzhals, Victor R. Schinazi, Peter Kiefer, and Martin Raubal)

19:00 - 24:00
Social
26 May 2021
TIME
TRACK I
TRACK II

Posters & Demos & Videos III (11:00 - 12:00)

Session sponsored by Fove

POSTERS:

Pupillary response reflects vocabulary comprehension (2021)
(Takashi Hirata and Yutaka Hirata)

Voluntary Pupil Control in Noisy Environments (2020)
(Jan Ehlers and Annika Meinecke)

Polarized near-infrared light emission for eye gaze estimation (2020)
(Koki Koshikawa, Masato Sasaki, Takamasa Utsu, and Kentaro Takemura)

Impact of evoked reward expectation on ocular information during a controlled card game (2020)
(Minoru Nakayama and Kohei Shoda)

Eye Gaze Estimation using Imperceptible Marker Presented on High-Speed Display (2021)
(Kento Seida and Kentaro Takemura)

GazeLockPatterns: Comparing Authentication using Gaze and Touch for entering Lock Patterns (2020)
(Yasmeen Abdrabou, Ken Pfeuffer, Mohamed Khamis, and Florian Alt)

DEMO:

• Synopticon: Sensor Fusion for Automated Gaze Analysis
(Michael Hildebrandt, Jens-Patrick Langstrand, and Hoa Thi Nguyen)

• Demo: Implementing Eye-Tracking for Persona Analytics
(Soon-Gyo Jung, Joni Salminen, and Bernard Jansen)

VIDEO:

• Video: Automatic Recognition and Augmentation of Attended Objects in Real-time using Eye Tracking and a Head-mounted Display
(Michael Barz, Sebastian Kapp, Jochen Kuhn, and Daniel Sonntag)

• RIT-Eyes: realistically rendered eye images for eye-tracking applications
(Nitinraj Nair, Aayush Kumar Chaudhary, Rakshit Sunil Kothari, Gabriel Jacob Diaz, Jeff B Pelz, and Reynold Bailey)


Full Papers IV (12:00 - 13:30)

Gaze Input

Session sponsored by GazePoint

Gaze+Hold: Eyes-only Direct Manipulation with Continuous Gaze Modulated by Closure of One Eye
(Argenis Ramirez Gomez, Christopher Clarke, Ludwig Sidenmark, Hans Gellersen)

GazeMeter: Exploring the Usage of Gaze Behaviour to enhance Password Assessments
(Yasmeen Abdrabou, Ahmed Shams, Mohamed Omar Mantawy, Anam Ahmad Khan, Mohamed Khamis, Florian Alt, Yomna Abdelrahman)

BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement
(Ludwig Sidenmark, Diako Mardanbegi, Argenis Ramirez Gomez, Christopher Clarke, Hans Gellersen)

Bubble Gaze Cursor + Bubble Gaze Lens: Applying Area Cursor Technique to Eye-gaze Interface
(Myungguen Choi, Daisuke Sakamoto, Tetsuo Ono)

Eye Gaze Controlled Robotic Arm for Persons with Severe Speech and Motor Impairment
(Vinay Krishna Sharma, Kamalpreet Saluja, Vimal Mollyn, Pradipta Biswas)

13:30 - 14:30

Full Papers V (14:45 - 16:15)

Visualization and Annotation

Session sponsored by blickshift

Gaze-Adaptive Lenses for Feature-Rich Information Spaces
(Fabian Göbel, Kuno Kurzhals, Victor R. Schinazi, Peter Kiefer, Martin Raubal)

Visual Analytics and Annotation of Pervasive Eye Tracking Video
(Kuno Kurzhals, Nils Rodrigues, Maurice Koch, Michael Stoll, Andrés Bruhn, Andreas Bulling, Daniel Weiskopf)

The Power of Linked Eye Movement Data Visualizations
(Michael Burch, Günter Wallner, Nick Broeks, Lulof Piree, Nynke Boonstra, Paul Vlaswinkel, Silke Franken, Vince van Wijk)

Image-Based Projection Labeling for Mobile Eye Tracking
(Kuno Kurzhals)

Neural Networks for Semantic Gaze Analysis in XR Settings
(Lena Stubbemann, Dominik Dürrschnabel, Robert Refflinghaus)


Full Papers VI (16:30 - 18:00)

Methods II

Session sponsored by Facebook Reality Labs

Effect of a Constant Camera Rotation on the Visibility of Transsaccadic Camera Shifts
(Maryam Keyvanara, Robert Allison)

Dataset for Eye Tracking on a Virtual Reality Platform
(Stephan Joachim Garbin, Oleg Komogortsev, Robert Cavin, Gregory Hughes, Yiru Shen, Immo Schuetz, Sachin S Talathi)

Validation of a prototype hybrid eye-tracker against the DPI and the Tobii Spectrum
(Kenneth Holmqvist, Saga Lee Örbom, Michael Miller, Albert Kashchenevsky, Mark M. Shovman, Mark W. Greenlee)

Analysis of iris obfuscation: Generalising eye information processes for privacy studies in eye tracking
(Anton Mølbjerg Eskildsen, Dan Witzner Hansen)

Sponsor Webinar: Tobii Pro (17:15 - 18:15)



Tobii: Celebrating 20 years of revolutionizing eye tracking research and technology


Webinar Description: This workshop will present Tobii Pro's innovative solutions for conducting research within computer-based tasks and through wearable applications. We will discuss the newest options in our portfolio, spanning hardware, software features and functionality, services, and biometrics partnerships for data integration.

We will provide a special review of our two metrics tests: a laboratory-based metrics test and a first-of-its-kind extensive accuracy and precision field metrics test. The field test spanned more than 400 participants across 2 countries and covered numerous participant variables such as eye color, eye shape, and vision correction. The field test complements our laboratory metrics tests.

We will be joined by an expert panel of engineers and cognitive scientists from our team to answer questions and discuss where eye tracking has been and where it is headed.

We welcome questions ahead of the presentation! Please email marisa.biondi@tobii.com

Posters & Demos & Videos IV (18:00 - 19:00)

POSTERS:

OpenNEEDS: A Dataset of Gaze, Head, Hand, and Scene Signals During Exploration in Open-Ended VR Environments (2021)
(Kara J Emery, Marina Zannoli, James Warren, Lei Xiao, and Sachin S Talathi)

Gaze+Lip: Rapid, Expressive Interactions Combining Gaze Input and Silent Speech Commands for Hands-free Smart TV Control (2021)
(Zixiong Su, Naoki Kimura, Xinlei Zhang, and Jun Rekimoto)

Understanding Urban Devotion through the Eyes of an Observer (2021)
(Margarita Vinnikov, Kian Motahari, Louis I. Hamilton, and Burcak Ozludil Altin)

Estimating Point-of-Gaze using Smooth Pursuit Eye Movements without Implicit and Explicit User-Calibration (2020)
(Yuto Tamura and Kentaro Takemura)

Enhancing the precision of remote eye-tracking using iris velocity estimation (2021)
(Aayush Kumar Chaudhary and Jeff B Pelz)

A Multimodal Eye Movement Dataset and a Multimodal Eye Movement Segmentation Analysis (2021)
(Wolfgang Fuhl and Enkelejda Kasneci)

Positional head-eye tracking outside the lab: an open-source solution (2020)
(Peter Hausamann, Christian Sinnott, and Paul MacNeilage)

Visualizing Prediction Correctness of Eye Tracking Classifiers (2021)
(Martin H.U. Prinzler, Christoph Schröder, Sahar Mahdie Klim Al Zaidawi, Gabriel Zachmann, and Sebastian Maneth)

Understanding Game Roles and Strategy Using a Mixed Methods Approach (2021)
(Kaitlyn M Roose and Elizabeth S. Veinott)

DEMO:

• Synopticon: Sensor Fusion for Automated Gaze Analysis
(Michael Hildebrandt, Jens-Patrick Langstrand, and Hoa Thi Nguyen)

• Demo: Implementing Eye-Tracking for Persona Analytics
(Soon-Gyo Jung, Joni Salminen, and Bernard Jansen)

VIDEO:

• Video: Automatic Recognition and Augmentation of Attended Objects in Real-time using Eye Tracking and a Head-mounted Display
(Michael Barz, Sebastian Kapp, Jochen Kuhn, and Daniel Sonntag)

• RIT-Eyes: realistically rendered eye images for eye-tracking applications
(Nitinraj Nair, Aayush Kumar Chaudhary, Rakshit Sunil Kothari, Gabriel Jacob Diaz, Jeff B Pelz, and Reynold Bailey)

19:00 - 24:00
Social
27 May 2021
TIME
09:00 - 10:00
10:00 - 11:00
11:00 - 12:00
12:00 - 12:30
12:30 - 13:00
TRACK I
13:30 - 14:30
14:30 - 15:00
Closing & TownHall Meeting
15:00 - 15:45
15:45 - 16:30
16:30 - 17:00
17:00 - 17:30
17:30 - 18:30
18:30 - 19:30
19:30 - 20:00
20:00 - 23:20