Accepted Tutorials

This tutorial covers all aspects of an eye-tracking study, from experimental design to data analysis. As an example, we present an eye-tracking study that compares reading code with reading text. First, we give an introduction to experimental design and the preparation of an eye-tracking study using PsychoPy Builder. During a hands-on session, participants will collect eye-movement data in teams using Gazepoint eye trackers that will be provided. Finally, the tutorial presents details of a Python-based gaze analytics pipeline used to extract raw eye-movement data, detect fixations via velocity-based filtering, collate data for statistical evaluation, and analyze and visualize results using R. Attendees will have the opportunity to run the analysis scripts on gaze data collected from an example study. The tutorial covers basic eye-movement analytics, e.g., fixation count and dwell time within AOIs, as well as advanced analysis using gaze transition entropy.
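The velocity-based fixation detection mentioned above can be sketched as follows. This is a minimal I-VT-style filter in Python, not the tutorial's actual pipeline code; the sampling rate, velocity threshold, and minimum duration are illustrative assumptions:

```python
def detect_fixations(gaze, rate_hz=60.0, threshold=30.0, min_dur=0.1):
    """Minimal I-VT-style velocity filter (illustrative sketch).

    gaze: list of (x, y) gaze positions in degrees of visual angle.
    Samples whose point-to-point angular velocity falls below
    `threshold` (deg/s) are grouped into fixations; runs shorter than
    `min_dur` seconds are discarded.
    Returns a list of (start_idx, end_idx, cx, cy, duration_s).
    """
    dt = 1.0 / rate_hz
    # point-to-point velocity in deg/s; first sample copies the second
    vel = [0.0]
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        vel.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt)
    vel[0] = vel[1] if len(vel) > 1 else 0.0

    fixations, start = [], None
    for i, v in enumerate(vel + [float("inf")]):  # sentinel flushes last run
        if v < threshold:
            if start is None:
                start = i
        elif start is not None:
            dur = (i - start) * dt
            if dur >= min_dur:
                xs = [gaze[j][0] for j in range(start, i)]
                ys = [gaze[j][1] for j in range(start, i)]
                fixations.append(
                    (start, i - 1, sum(xs) / len(xs), sum(ys) / len(ys), dur))
            start = None
    return fixations
```

Production pipelines typically also convert screen coordinates to visual angle and smooth the velocity signal before thresholding; this sketch omits both steps.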
Presenters:
Andrew Duchowski, Clemson University, duchowski@clemson.edu
Andrew Duchowski is a professor of Computer Science at Clemson University. He received his baccalaureate (1990) from Simon Fraser University, Burnaby, Canada, and his doctorate (1997) from Texas A&M University, College Station, TX, both in Computer Science. His research and teaching interests include visual attention and perception, eye tracking, computer vision, and computer graphics. He joined the School of Computing faculty at Clemson in January 1998. He is a noted research leader in the field of eye tracking, having produced a corpus of papers and a textbook on eye-tracking research, and has delivered courses and seminars on the subject at international conferences. He developed and maintains the eye-tracking laboratory at Clemson University and teaches a regular course on eye-tracking methodology that attracts students from a variety of disciplines across campus.
Nina Gehrer, University of Tubingen, nina.gehrer@uni-tuebingen.de
Nina Gehrer is a postdoctoral researcher at the Department of Clinical Psychology and Psychotherapy at the University of Tübingen, Germany. She received her MSc in 2015 and completed her PhD in 2020. Her primary research interest is biases in (social) information processing associated with psychological disorders (e.g., antisocial personality disorder, attention deficit hyperactivity disorder, and eating disorders). In this context, she has designed and conducted several eye-tracking studies in recent years. She teaches courses in clinical psychology at the University of Tübingen and co-presented tutorials on experimental design and gaze analytics at the ACM Symposium on Eye Tracking Research & Applications (in 2018, 2019, 2021, and 2022).
Krzysztof Krejtz, SWPS University of Social Sciences and Humanities, Poland, kkrejtz@swps.edu.pl
Krzysztof Krejtz is a psychologist at SWPS University of Social Sciences and Humanities in Warsaw, Poland, where he leads the Eye Tracking Research Center. In 2017 he was a guest professor at Ulm University in Ulm, Germany. He has given invited talks at, among others, the Max Planck Institute (Germany), the University of Bergen (Norway), and the University of Nebraska–Lincoln (USA). He has extensive experience in social and cognitive psychology research methods and statistics. His research focuses on eye-tracking methods and on developing second-order, eye-data-based metrics that capture the dynamics of attention and information processing (transition matrix entropy, the ambient-focal coefficient K), as well as on the dynamics of attention in the context of Human-Computer Interaction, multimedia learning, media user experience, and accessibility. He is a member of the ACM Symposium on Eye Tracking Research & Applications (ACM ETRA) Steering Committee.
Tentative Schedule:
  1. Introduction
  2. Experimental design
    • starting point: the question you want to answer
    • hypotheses
    • operationalization
    • experimental plan
    • confounding variables
    • sample
    • data collection
    • data analysis
    • conclusions
    • est. 60 min
  3. Examples: Experimental design of a simple mock eye tracking study
    • selecting stimuli
    • setting up PsychoPy
    • capturing data
    • looking at the data using HDFView
    • extracting the data
    • est. 60 min
  4. Preparations for using the Gaze analytics pipeline
    • file system structure
    • eye movement data and source code separation
    • eye movement data format, sampling rate
    • stimulus (e.g., images)
    • AOI definition using Scribus
    • est. 40 min
  5. Gaze analytics pipeline overview
    • Linux/Windows distinctions, e.g., Makefile vs. batch files
    • the gaze analytics pipeline: dirs, raw, process, collate, stats
    • raw data extraction, velocity-based filtering, collation
    • est. 80 min
  6. Traditional gaze analytics
    • statistics, plots
    • fixation count within AOIs
    • dwell time within AOIs
    • frequency of the first fixation on AOIs
    • number of transitions between AOIs
    • and other indicators depending on audience interest and time permitting
    • est. 60 min
  7. Advanced gaze analytics (time permitting)
    • introducing transition matrices and transition entropy
    • ambient/focal K coefficient
    • est. 60 min
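As a preview of the transition-entropy topic in item 7, the sketch below computes gaze transition entropy over a sequence of AOI labels (one label per fixation). This is an illustrative Python sketch, not the tutorial's pipeline code; it approximates the stationary distribution by the empirical AOI proportions, a common simplification:

```python
import math
from collections import Counter

def transition_entropy(aoi_seq):
    """Gaze transition entropy over an AOI fixation sequence:
    H_t = -sum_i pi_i * sum_j p_ij * log2(p_ij),
    where p_ij is the probability of transitioning from AOI i to AOI j
    and pi_i is here approximated by the empirical proportion of
    fixations on AOI i (illustrative simplification)."""
    transitions = Counter(zip(aoi_seq, aoi_seq[1:]))
    out_totals = Counter(src for src, _ in transitions.elements())
    counts = Counter(aoi_seq)
    n = len(aoi_seq)
    h = 0.0
    for (src, dst), c in transitions.items():
        p_ij = c / out_totals[src]   # conditional transition probability
        pi_i = counts[src] / n       # empirical AOI proportion
        h -= pi_i * p_ij * math.log2(p_ij)
    return h
```

A perfectly regular scan path (e.g., strict A-B alternation) yields zero entropy, while more random switching between AOIs yields higher entropy.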


We will give participants the opportunity to learn about new approaches to eye-tracking analysis together with additional synchronised data channels such as GSR, EKG, EEG, NIRS, facial expression analysis, and voice analysis, as well as AI tools for automatic object detection. Participants will then be able to run test experiments themselves and explore the different possibilities hands-on. All data will be recorded with triggers for synchronisation to ensure high timing accuracy, and we will openly discuss the details of the methodology (no black-box approach) for an in-depth understanding of how these different types of systems work and how the data can be analysed and visualised together after synchronisation. Our modular approach makes it easy to include specialised open-source tools from various fields of research; participants are welcome to discuss any special solutions they are interested in, which could be included as further modules in the overall analysis framework.
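The trigger-based synchronisation mentioned above can be illustrated with a minimal sketch (our own illustrative Python, not the presenters' software): each sensor stream records a shared trigger event, and streams are aligned by shifting timestamps so the triggers coincide.

```python
def align_to_reference(stream, trigger_time, ref_trigger_time):
    """Shift the (timestamp, value) samples of one sensor stream so that
    its trigger event coincides with the reference stream's trigger.
    Illustrative only: real multi-sensor setups also correct for clock
    drift, e.g., by fitting an offset-plus-rate model over many triggers."""
    offset = ref_trigger_time - trigger_time
    return [(t + offset, v) for t, v in stream]
```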
Presenters:
Achim Hornecker, BiSigma, ahorn@bisigma.de
Achim has a PhD in Mathematics from the University of Freiburg and developed the BrainVision Analyzer while working at Brain Products. Together with a team of software developers and data analysts, Achim now develops and implements cross-sector and cross-tool methods for the collection, evaluation, and visualisation of data. Based on this experience, he also works on big data analytics. His focus is not on databases or software manufacturers, but on the questions that need to be answered with the help of collected data: only when the methods for extracting information from data are known does it make sense to ask which software and tools can be used for the purpose.
Juergen Bluhm, Baden-Württemberg Cooperative State University, juergen.bluhm@heilbronn.dhbw.de
Juergen is an academic researcher at DHBW Heilbronn and a lecturer at various other academic institutions, teaching marketing, market research, implicit research, business statistics, business organisation, and project planning.
Specialized in brand-image research, with responsibilities for advertising tracking, brand tracking, advertising pre-testing, new-product research, financial research, and advertising post-tests, he has more than 30 years of eye-tracking experience. This includes virtual shopping research (he developed a fully integrated virtual-shopping methodology with eye tracking) as well as synchronisation of eye tracking with other biometric measurements such as EEG, GSR, and NIRS, including integration in VR/AR.
Tentative Schedule:
  1. 9:00-10:00       presentation of methodology
  2. 10:00-10:15      coffee break
  3. 10:15-13:00      hands-on data collection
  4. 13:00-14:00      lunch break
  5. 14:00-15:30      group work to analyse the data - analysis of each sensor separately
  6. 15:30-15:45      coffee break
  7. 15:45-17:00      group work to analyse the data – analysis of all sensors combined, including AI based object detection for dynamic stimuli
  8. 17:00-17:30      discussion of the results and special solutions modules

The tutorial offers two distinct but related foci:

  1. Insights into eye movements and color vision deficiencies (CVDs)
  2. Image processing and computer vision techniques for CVD visual perception

Researchers and representatives from industry will provide insights into experimental relations between eye movements and CVDs. In light of recent progress in eye-tracker calibration, infrared and webcam-based eye-tracker setups will be discussed, as well as their use in diagnosing CVDs.
Furthermore, established and experimental techniques to improve color perception for people with CVD will be presented. They will span low-level image processing, color mapping, saliency-based approaches, and application-oriented studies such as cartography. Current trends and techniques, along with demo sessions, will be presented throughout the tutorial.
Finally, a round-table session will gather ideas, discuss results, and pave the way for potential collaborations.
Presenters:
Alessandro Bruno, International University of Language and Media, Milan (Italy)
Alessandro works with the Faculty of Communication at IULM University (Milan, Italy). He leads the ‘data mining and text analytics’ and ‘artificial intelligence laboratory’ units within the Programme of Artificial Intelligence for Business and Society. He holds a doctorate in computer engineering and held several postdoctoral positions between 2012 and 2020 in Italy (University of Palermo; INAF, the Italian National Institute for Astrophysics) and the UK (Research Associate at the NCCA, National Centre for Computer Animation, and Bournemouth University’s Department of Computing). Alessandro was a research visitor at UCL (University College London) in 2019. In 2021, he became a Lecturer in Computing at Bournemouth University, where he served until May 2022. He was also a Co-Investigator of a European research project, s4allcities. In 2021, Alessandro was awarded the best paper prize at the ICIAP (International Conference on Image Analysis and Processing) workshop HBAxSCES (Human Behaviour Analysis for Smart City Safety Environment). Since 2011, he has worked on visual attention-related topics such as visual saliency, infrared and webcam-based eye-tracking, and image enhancement for color perception for people with CVD (Color Vision Deficiency).
Arzu Çöltekin, University of Applied Sciences & Arts Northwestern Switzerland, Brugg-Windisch, Switzerland
Arzu leads the Institute of Interactive Technologies at the University of Applied Sciences and Arts Northwestern Switzerland, where she is a Professor of Human-Computer Interaction, Visualization and Extended Reality. She is also a research affiliate on scientific data analysis and visualization topics with the Seamless Astronomy group at the Harvard–Smithsonian Center for Astrophysics in Cambridge, USA. Arzu chairs the international Extended Reality and Visual Analytics working group within the ISPRS, co-chairs the Commission on Visual Analytics of the ICA, and serves as a council member of the International Society for Digital Earth. Arzu's interests in eye tracking include using it as an input modality in interactive systems (adaptive, foveated displays) as well as eye movement analysis for cognitive information processing. She has done various projects on color and cartography and will share some of her experience at the intersection of the two.
Tentative Schedule:
Session 1: Eye movements and color vision - state of the art
  1. 20 minutes      Opening and introduction to scenarios of interest.
    Organisers
  2. 20 minutes      Diagnosing color vision deficiencies with eye movements: A tutorial on a simple and effective objective test.
    A. Traore - PhD student - Vision Science at Auckland University | Founder of Eye inc.
  3. 20 minutes      Saliency-based image enhancement for people with CVD.
    A. Bruno - IULM AI Lab - IULM University, Milan
  4. 15 minutes      Break
  5. 20 minutes      Addressing color deficiency: Solutions from cartography
    Prof Arzu Coltekin, University of Applied Sciences & Arts Northwestern Switzerland
  6. 20 minutes      Extending the description of color blindness
    Prof Alessandro Rizzi, University of Milan
  7. 20 minutes      Insights into Webcam-based eye-tracking for a color blindness study
    Prof Aladine Chetouani, University of Orleans
  8. 15 minutes      Break

Session 2: Unconference
  1. 20 minutes      Webcam eye-tracking? Calibration-based and calibration-free approaches, challenges for color-related studies
    Input by Adam Cellary (CEO of RealEye)
  2. 30 minutes      Participatory round table, board exercise with all participants: Webcam eye-tracking experiences, expertise, research gaps, open discussion
    All participants
  3. 10 minutes      Conclusions
    Organisers