Welcome to the 4th Workshop on Eye Tracking and Visualization 2019

There is growing interest in eye tracking as a research method in many communities, including information visualization, scientific visualization, and visual analytics, but also human-computer interaction, applied perception, psychology, cognitive science, security, and mixed reality. Progress in hardware technology and the reduction of costs for eye tracking devices have made this analysis technique accessible to a large population of researchers. Recording the observer's gaze can reveal how dynamic graphical displays are visually accessed and which information is processed in real time. Nonetheless, standardized practices for technical implementation and data interpretation remain unresolved. With this fourth Workshop on Eye Tracking and Visualization (ETVIS), we intend to follow up on the highly successful first, second, and third ETVIS workshops (at IEEE VIS 2015 and 2016, and collocated with ETRA 2018) and continue to build a community of eye tracking researchers within the visualization community, covering information visualization, scientific visualization, and visual analytics. We also aim to establish connections to related fields, in particular human-computer interaction, cognitive science, and psychology. This will promote a robust exchange of established practices and innovative use scenarios.

Scope and Focus

Technological advances in computer vision algorithms and sensor hardware have greatly reduced the implementational and financial costs of eye tracking. Thus, it is unsurprising to witness a significant increase in its use as a research tool in fields beyond the traditional domains of biological vision, psychology, and neuroscience, in particular in visualization and human-computer interaction research. One of the key challenges lies in the analysis, interaction, and visualization of complex spatio-temporal datasets of gaze behavior, which is further complicated by complementary datasets such as semantic labels, user interactions, and/or accompanying physiological sensor recordings. Ultimately, the research objective is to allow eye tracking data to be effectively interpreted in terms of the observer's decision-making and cognitive processes. To achieve this, it is necessary to draw upon our current understanding of gaze behavior across related fields, from vision and cognition to visualization.

With low-cost eye tracking devices widely available, the technical and financial barriers to recording eye movement data have largely disappeared, and we have seen a large increase in research and papers related to eye tracking. However, the analysis, interaction, and visualization of such gaze data, along with associated data from the stimulus or additional physiological sensor recordings, remain challenging in this emerging discipline. From the human-computer interaction and cognitive science perspectives, many open questions concern how to integrate human behavior with decision-making and thinking processes. Altogether, these factors make eye tracking an important field to understand, whether in terms of data analysis and visualization, interaction, or user-based evaluation of visualization.


This workshop will cover topics that are related to visualization research (including information visualization, scientific visualization, and visual analytics) and eye tracking. Aspects discussed in this workshop include the following topics with an emphasis on the relationship between eye tracking and visualization:

  • Visualization and visual analytics techniques for eye movement data
  • Visual gaze and eye movement data analysis, including visual data mining, aggregation, clustering techniques, and metrics for eye movement data
  • Eye movement data provenance, big eye movement data
  • Uncertainty visualization of gaze data
  • Standardized metrics for evaluating interactions with visualization
  • Novel methods for eye-tracking in challenging visualization scenarios
  • Interactive annotation of gaze and stimulus data
  • Systems for visual exploration of eye movement data
  • Reports of eye tracking studies evaluating visualization or visual analytics
  • Eye tracking in non-WIMP visualization environments, including mobile eye tracking, mobile devices, virtual environments, mixed reality, and large displays
  • Eye tracking-based interaction techniques for visualization
  • Interpreting eye movement scanpaths from the perspective of human cognitive architecture and perceptuo-motor expertise
  • Perception in eye tracking studies
  • Inferences that can be drawn from gaze behavior
  • Cognitive models for inferring user states from gaze behavior with visualizations
  • Applications that rely on eye-tracking as an adaptive input parameter


Authors are invited to submit original work complying with the ETRA NOTES format (up to 4 pages + 2 pages of references). Papers should be submitted electronically in PDF format to ETVIS via the ETRA submission system:

Also ensure that the ACM Author Guidelines for SIG-sponsored events ([sigconf]) are met prior to submission.

All accepted papers will be published by ACM as part of the ETRA proceedings.


ETVIS Important Dates
January 2, 2019 Paper abstracts due
January 9, 2019 Full papers due
February 15, 2019 1st round notifications
February 22, 2019 Revised papers due
March 6, 2019 Final notifications to authors
March 20, 2019 Camera ready papers due


Organizers

Michael Burch, Eindhoven University of Technology, Netherlands
Pawel Kasprowski, Silesian University of Technology, Poland
Leslie Blaha, Air Force Research Laboratory, USA

Social Media Chair

Ayush Kumar, Stony Brook University, USA

Accepted Papers

Long Papers

Visually Comparing Eye Movements over Space and Time
Ayush Kumar (Stony Brook University); Michael Burch (Eindhoven University of Technology); Klaus Mueller (Stony Brook University)

Clustered Eye Movement Similarity Matrices
Ayush Kumar (Stony Brook University); Neil Timmermans (Eindhoven University of Technology); Michael Burch (Eindhoven University of Technology); Klaus Mueller (Stony Brook University)

Short Papers

Finding the Outliers in Scanpath Data
Michael Burch (Eindhoven University of Technology); Ayush Kumar (Stony Brook University); Klaus Mueller (Stony Brook University); Titus Kervezee (Eindhoven University of Technology); Wouter Nuijten (Eindhoven University of Technology); Rens Oostenbach (Eindhoven University of Technology); Lucas Peeters (Eindhoven University of Technology); Gijs Smit (Eindhoven University of Technology)

Using Warped Time Distance Chart to Compare Scan-paths of Multiple Observers
Pawel Kasprowski (Silesian University of Technology); Katarzyna Harezlak (Silesian University of Technology)

An Intuitive Visualization for Rapid Data Analysis - Using the DNA Metaphor for Eye Movement Patterns
Fabian Deitelhoff (University of Applied Sciences and Arts Dortmund); Andreas Harrer (University of Applied Sciences and Arts Dortmund); Andrea Kienle (University of Applied Sciences and Arts Dortmund)

Iris: A Tool for Designing Contextually Relevant Gaze Visualizations
Sarah D'Angelo (Northwestern University); Jeff Brewer (Northwestern University); Darren Gergle (Northwestern University)

Art Facing Science: Artistic Heuristics for Face Detection
Andrew Duchowski (Clemson University); Nina Gehrer (University of Tübingen); Michael Schoenenberg (University of Tübingen); Krzysztof Krejtz (SWPS University of Social Sciences & Humanities, Poland)


If you have any questions, would like to give feedback, or are interested in helping and supporting our workshop, please email Michael Burch, Eindhoven University of Technology,