Tutorials at a Glance
Thursday, June 14, 2018

| Time | Tutorials Track 1 (Room S202) | Tutorials Track 2 (Room S203) | Tutorials Track 3 (Room S226) | Tutorials Track 4 (Room S227) |
|---|---|---|---|---|
| 9:00-10:00 | Tutorial 1 - Mehul Bhatt, Jakob Suchan | Tutorial 2 - Edward Ryklin | Tutorial 4 - Tanja Blascheck, Michael Raschke, Michael Burch | Tutorial 5 - Diako Mardanbegi, Thies Pfeiffer |
| 10:00-10:30 | Coffee Break (Room S303) | | | |
| 10:30-12:30 | Tutorial 1 - Mehul Bhatt, Jakob Suchan | Tutorial 2 - Edward Ryklin | Tutorial 4 - Tanja Blascheck, Michael Raschke, Michael Burch | Tutorial 5 - Diako Mardanbegi, Thies Pfeiffer |
| 12:30-13:30 | Lunch Break (Room S303) | | | |
| 13:30-15:30 | Tutorial 3 - Andrew Duchowski, Nina Gehrer | | Tutorial 4 - Tanja Blascheck, Michael Raschke, Michael Burch | Tutorial 5 - Diako Mardanbegi, Thies Pfeiffer |
| 15:30-16:00 | Coffee Break (Room S303) | | | |
| 16:00-17:00 | Tutorial 3 - Andrew Duchowski, Nina Gehrer | | Tutorial 4 - Tanja Blascheck, Michael Raschke, Michael Burch | Tutorial 5 - Diako Mardanbegi, Thies Pfeiffer |
Tutorials
Tutorial 1: Spatial Cognition in the Wild — Methods for Large-Scale Behavioural Research in Visuo-Locomotive Perception
(Half Day)


Abstract. The tutorial on Spatial Cognition in the Wild presents an interdisciplinary perspective on conducting evidence-based human behaviour research from the viewpoints of spatial cognition and computation, environmental psychology, and visual perception. The tutorial emphasises the semantic interpretation of multimodal behavioural data and the (empirically driven) synthesis of embodied interactive experiences in real-world settings. Of special focus are visual (e.g., perception, attention based on eye tracking), visuo-locomotive (e.g., movement, indoor wayfinding), and visuo-auditory (e.g., moving images) cognitive experiences in areas such as architecture and built-environment design, narrative media design, product design, and cognitive media studies (e.g., film, animation, immersive reality). The technical focus of the tutorial is on demonstrating general computational methods, tools, and cognitive assistive technologies that can be used for multimodal human behaviour studies in visual, visuo-locomotive, and visuo-auditory perception. The presented methods are rooted in foundational research in artificial intelligence, spatial informatics, and human-computer interaction. The tutorial uses case studies from large-scale experiments in domains such as evidence-based architecture design, communication and media studies, and cognitive film studies to demonstrate the application of these foundational methods and tools.
Scope and Audience. Scope: The tutorial presents an interdisciplinary scientific agenda targeting an audience with an interest or curiosity in visual and spatial cognition, visual perception, and artificial intelligence (with emphasis on knowledge representation and reasoning, and high-level event perception). Particular focus will be on using case studies to demonstrate the state of the art in artificial intelligence, cognitive vision, and applied perception with respect to their impact on eye tracking in particular, and on multimodal human behavioural research in general.
Audience: (1) Developers of basic eye-tracking methodologies interested in synergies with general artificial-intelligence-based methods and tools for multimodal human behaviour studies; (2) young researchers (e.g., masters and early-stage doctoral candidates) keen to explore open research questions and avenues for applications of eye tracking in domains such as (building) architecture design, (visuo-auditory) narrative media design, and human-robot interaction; (3) design practitioners from areas such as architecture, animation, visual art, digital media, and interaction design seeking insights from existing case studies involving eye tracking in their respective domains of application.
Bio. Mehul Bhatt is a Professor in the School of Science and Technology at Örebro University (Sweden) and a Professor in the Department of Computer Science at the University of Bremen (Germany). His research interests lie at the intersection of artificial intelligence, cognitive science, and HCI, with a focus on visual and spatial cognition, knowledge representation and reasoning, design cognition, and multimodality.
Jakob Suchan is a research assistant in the Human-Centered Cognitive Assistance Lab at the Department of Computer Science, University of Bremen. His research focuses on the integration of vision and knowledge representation from the viewpoint of computational cognitive systems in which integrated (embodied) perception and interaction are involved (www.cognitive-vision.org).
Tutorial 4: Eye-Tracking and Visual Analytics
(Full Day)



Slides. Introduction, Visual Analytics, Tools, Blickshift.
Abstract. Eye tracking has become a widely used method to analyze human behavior in marketing, neuroscience, human-computer interaction, perception and cognition research, as well as visualization. Apart from measuring completion times and recording the accuracy of answers given during visual tasks in classical controlled experiments, eye-tracking-based evaluations provide additional information on how visual attention is distributed and how it changes for a presented stimulus. Due to the wide field of applications of eye tracking and the various kinds of research questions, different approaches have been developed to analyze eye movement data, such as statistical algorithms (either descriptive or inferential), string-editing algorithms, visualization-related techniques, and visual analytics techniques. Regardless of whether statistical or visual methods are used for eye movement data analysis, the large amount of data generated during eye tracking experiments has to be handled.
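As a concrete illustration of the string-editing approach just mentioned, scanpaths can be encoded as strings of AOI labels and compared by edit distance. This is a generic sketch, not tied to any particular tool; the AOI sequences are invented for illustration.

```python
# String-edit scanpath comparison: each character is an AOI label;
# dissimilarity is the Levenshtein distance between the two sequences.
def levenshtein(a: str, b: str) -> int:
    """Minimum number of insertions, deletions, and substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

# Two viewers' AOI sequences over the same stimulus (A-D are AOIs).
print(levenshtein("ABACD", "ABCCD"))  # -> 1
```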
Whereas statistical analysis mainly provides quantitative results, visualization techniques allow researchers to analyze different levels and aspects of the recorded eye movement data in an explorative and qualitative way. When visualization techniques alone cannot handle the large amount of eye movement data, the emerging discipline of visual analytics can be an option for exploratory data analysis: machine-based analysis techniques, such as methods from data mining and knowledge discovery in databases, are combined with interactive visualizations and the perceptual abilities of a human viewer. Due to the increasing complexity of tasks and stimuli in eye tracking experiments, we believe that visual analytics approaches will play an increasingly important role in future eye tracking research. However, researchers still lack sophisticated tools for analyzing eye movement data with visual analytics approaches.
In this tutorial, we will present an overview of visual analytics approaches for eye movement data. We will also demonstrate the analysis of eye movement data using Blickshift Analytics, a visual analytics software package that includes different eye tracking visualizations, pattern search techniques, and statistical methods.
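Blickshift Analytics itself is demonstrated live during the tutorial. As a rough, generic sketch of the kind of coupled views such tools combine (not Blickshift's API), the following fragment draws a scanpath next to an AOI transition matrix; the synthetic fixation data, screen resolution, and 2x2 AOI grid are illustrative assumptions.

```python
# Generic sketch: scanpath plot plus AOI transition matrix.
import numpy as np
import matplotlib.pyplot as plt

# Synthetic fixations: (x, y) in pixels, duration in ms.
rng = np.random.default_rng(0)
fix = rng.uniform([0, 0, 100], [1920, 1080, 600], size=(30, 3))

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Scanpath: fixations as circles scaled by duration, connected in order.
ax1.plot(fix[:, 0], fix[:, 1], "-", color="gray", linewidth=0.8)
ax1.scatter(fix[:, 0], fix[:, 1], s=fix[:, 2] * 0.5, alpha=0.5)
ax1.set(title="Scanpath", xlim=(0, 1920), ylim=(1080, 0))  # screen coords

# Transition matrix over a coarse 2x2 grid of AOIs.
aoi = (fix[:, 0] > 960).astype(int) * 2 + (fix[:, 1] > 540).astype(int)
T = np.zeros((4, 4))
for a, b in zip(aoi[:-1], aoi[1:]):
    T[a, b] += 1
ax2.imshow(T, cmap="Blues")
ax2.set(title="AOI transitions", xlabel="to AOI", ylabel="from AOI")
plt.tight_layout()
plt.show()
```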
Scope and Audience. The tutorial is designed for the general ETRA audience and is especially suitable for PhD students. The combination of eye tracking and visual analytics is an emerging research area; PhD students and researchers can benefit from this tutorial by finding a new and exciting topic and broadening their research horizons. The tutorial is an introduction to the field of visual analytics and how it is applied to eye movement data, and includes a hands-on session with the Blickshift Analytics software. It has minimal prerequisites: fundamental concepts of eye tracking and visual analytics will be explained during the tutorial.
Bio. Tanja Blascheck is a Postdoctoral Fellow in the Analysis and Visualization (Aviz) group at Inria, Université Paris-Saclay, France. She is especially interested in combining eye movement data with interaction logs and think-aloud data to evaluate interactive visual analytics systems, and in creating novel visual analysis methods for these data sources. She received a PhD in Computer Science from the University of Stuttgart (tanja.blascheck@inria.fr).
Michael Raschke is co-founder and CEO of Blickshift. From 2009 to 2015 he was a researcher at the Institute for Visualization and Interactive Systems at the University of Stuttgart. His research has focused on new methods and technologies for evaluating visualizations. Michael is convinced that the key to new applications of eye tracking is powerful methods for analyzing eye movements (michael.raschke@blickshift.de).
Michael Burch is an Assistant Professor for Visual Analytics at Eindhoven University of Technology. His main research interests are in information visualization, software visualization, and visual analytics, typically in the application fields of software engineering, social networking, biology, and eye tracking. Moreover, user evaluations, in particular with and without eye tracking, play a crucial role in his research (m.burch@tue.nl).
Tutorial 2: Eye Movement Metrics: Event Detection
(Half Day)

Abstract. Event detection is an end point critical to eye movement applications. Be it gaze interaction or passive attention analysis, software needs to be aware of the user's instantaneous state. Typically, an eye movement metric such as fixation duration (dwell time) modulates a behavioral event, e.g., actuating a button press or similar response that indicates a decision has been made.
I will describe numerous metrics, how they might be derived, and how they can then be used to score various behavioral events. The tutorial will include a discussion of the significance of spatial calibration and techniques to accommodate its absence, as well as the significance of temporal frequency (i.e., sample rate) for deriving certain eye metrics, and how interaction with temporal artifact filters can delay event detection and impede system performance.
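As an illustration of velocity-based detection, below is a minimal sketch of I-VT-style classification and dwell-time scoring. It assumes gaze coordinates in degrees of visual angle sampled at a fixed rate; the 30 deg/s threshold and the helper names are illustrative choices, not prescriptions from the tutorial.

```python
# Minimal I-VT-style sketch: label samples by angular velocity, then
# score dwell times from runs of fixation samples. Assumes x, y are in
# degrees of visual angle at a fixed sample rate.
import numpy as np

def detect_fixations(x, y, sample_rate_hz, velocity_threshold=30.0):
    """Return a boolean mask: True = fixation sample, False = saccade."""
    dt = 1.0 / sample_rate_hz
    vel = np.hypot(np.diff(x), np.diff(y)) / dt   # deg/s, point-to-point
    vel = np.append(vel, vel[-1])                 # pad to original length
    return vel < velocity_threshold

def dwell_times(is_fixation, sample_rate_hz):
    """Durations (ms) of consecutive runs of fixation samples."""
    durations, run = [], 0
    for f in is_fixation:
        if f:
            run += 1
        elif run:
            durations.append(run * 1000.0 / sample_rate_hz)
            run = 0
    if run:
        durations.append(run * 1000.0 / sample_rate_hz)
    return durations
```

Note that the velocity estimate degrades at low sample rates, and any temporal smoothing of the velocity signal (e.g., a moving-average filter) adds latency on the order of the filter window, one concrete way artifact filters can delay event detection.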
Scope and Audience. Scope: saccades, fixations, and pursuits; time-series analysis; velocity and its temporal characteristics.
Bio. Edward Ryklin is an American research scientist with 15 years of experience working with eye tracking technology for neuroscientific applications related to vision, memory, and decision making.
Tutorial 3: Gaze Analytics Pipeline
(Half Day)


Abstract. This tutorial presents details of a Python-based gaze analytics pipeline developed and used by Prof. Duchowski and Ms. Gehrer. The pipeline consists of Python scripts for extraction of raw eye movement data, analysis and event detection via velocity-based filtering, and collation of events for statistical evaluation, with analysis and visualization of results in R. The tutorial is grounded in the analysis of gaze data collected while viewers discriminated different emotional expressions in faces. It covers basic eye movement analytics, e.g., fixation count and dwell time within AOIs, as well as advanced analysis using gaze transition entropy. Newer analytical tools and techniques, such as microsaccade detection and the Index of Pupillary Activity, will be covered time permitting.
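As a sketch of one of the advanced measures mentioned above, gaze transition entropy (in the spirit of Krejtz et al., 2015) can be computed from a sequence of AOI labels as follows. The input format is an assumption, and empirical state proportions stand in for the stationary distribution; the pipeline's actual scripts may differ.

```python
# Hypothetical sketch of gaze transition entropy over AOI labels:
# H_t = -sum_i pi_i sum_j p_ij log2(p_ij).
import numpy as np

def transition_entropy(aoi_sequence, n_aois):
    """Entropy of AOI-to-AOI transitions, weighted by state proportions."""
    counts = np.zeros((n_aois, n_aois))
    for a, b in zip(aoi_sequence[:-1], aoi_sequence[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, row_sums,
                  out=np.zeros_like(counts), where=row_sums > 0)
    pi = counts.sum(axis=1) / counts.sum()   # proportion of time in each AOI
    logP = np.zeros_like(P)
    np.log2(P, out=logP, where=P > 0)        # take logs only where p_ij > 0
    return -np.sum(pi[:, None] * P * logP)

print(transition_entropy([0, 1, 0, 2, 1, 1, 0], n_aois=3))  # ~0.79 bits
```

Higher values indicate less predictable switching between AOIs, which is what makes the measure useful for comparing viewing behaviour across stimuli or conditions.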
Scope and Audience. The tutorial welcomes attendees at all levels of experience and expertise, from those just beginning to study eye movements to seasoned practitioners who may wish to adopt the Python and R scripts, and possibly to contribute to, expand, and improve the pipeline.
Bio. Dr. Duchowski is a professor of Computer Science at Clemson University. He received his baccalaureate (1990) from Simon Fraser University, Burnaby, Canada, and doctorate (1997) from Texas A&M University, College Station, TX, both in Computer Science. His research and teaching interests include visual attention and perception, eye tracking, computer vision, and computer graphics. He is a noted research leader in the field of eye tracking, having produced a corpus of papers and a monograph related to eye tracking research, and has delivered courses and seminars on the subject at international conferences. He maintains Clemson's eye tracking laboratory, and teaches a regular course on eye tracking methodology attracting students from a variety of disciplines across campus.
Nina Gehrer is a clinical psychologist who has been working on her PhD thesis at the University of Tübingen, Germany, since receiving her master's degree in 2015. Her main research interest lies in studying face and emotion processing using eye tracking and a wide range of analytic methods. As a clinical psychologist, she is particularly interested in alterations related to psychological disorders that could underlie associated deficits in social information processing. She began working with Prof. Duchowski in 2016; since then, they have enhanced his gaze analytics pipeline and applied it in several eye tracking studies involving face and emotion processing.
Tutorial 5: Eye Tracking in Virtual and Augmented Reality
(Full Day)


Abstract. Virtual and augmented reality (VR/AR) are highly attractive application areas for eye tracking. Virtual environments have the potential to revolutionize life-sized interactive experimentation under highly controlled conditions. VR simulations are thus attractive for many fields of research, such as marketing research, human-computer interaction, robotics, and psychology. As virtual and augmented reality today primarily address the visual modality, they are well suited for gaze-based interaction, either for direct control or to realize attention-aware interactions.
The tutorial provides hands-on experience with several combinations of VR/AR hardware (e.g., HTC Vive, Microsoft HoloLens) and eye tracking systems (e.g., Pupil, Tobii). The first half of the tutorial (morning session) gives an introduction to eye tracking in VR/AR, its potential applications, and current solutions; participants will also become familiar with the basic setup of an eye tracking project in Unity3D. The afternoon session focuses on how gaze can be mapped to 3D objects, how 3D attention can be analyzed, and how gaze can be used to interact with the virtual world. Finally, hands-on activities let participants experiment with eye tracking VR/AR setups and work through example projects in Unity3D on both gaze interaction and attention analysis.
Unity3D will be used as the software framework to implement the virtual environments. Basic knowledge of Unity3D is required to get the most out of the hands-on part. The example project and assets will be provided to all participants, allowing for easy implementation of VR/AR projects with eye tracking.
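The hands-on part itself uses Unity3D, where mapping gaze to scene objects is typically done by casting a ray with Physics.Raycast against the scene's colliders. Purely as a language-agnostic illustration of the underlying idea, the sketch below casts a ray from the eye along the gaze direction and reports the nearest intersected object; approximating objects by bounding spheres, and all names and coordinates, are assumptions for illustration.

```python
# Gaze-to-object mapping as a ray cast: intersect the gaze ray with
# bounding spheres and report the nearest hit object.
import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None."""
    d = direction / np.linalg.norm(direction)
    oc = origin - center
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - radius * radius)
    if disc < 0:
        return None                # ray misses the sphere
    t = -b - np.sqrt(disc)         # nearer of the two intersection points
    return t if t >= 0 else None   # ignore hits behind the eye

# Scene: named objects approximated by bounding spheres (center, radius).
scene = {"painting": (np.array([0.0, 1.6, 2.0]), 0.5),
         "statue":   (np.array([1.5, 1.0, 3.0]), 0.4)}

eye = np.array([0.0, 1.6, 0.0])    # eye position in world coordinates
gaze = np.array([0.05, 0.0, 1.0])  # gaze direction from the eye tracker

hits = {name: ray_sphere_hit(eye, gaze, c, r)
        for name, (c, r) in scene.items()}
hits = {name: t for name, t in hits.items() if t is not None}
print(min(hits, key=hits.get) if hits else "no object fixated")  # painting
```

Accumulating such hits over time yields per-object dwell statistics, i.e., the kind of 3D attention analysis discussed in the afternoon session.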
Scope and Audience. The tutorial is aimed at people with a strong interest in using VR/AR in their research; basic knowledge of Unity3D or another 3D programming environment is expected.
Bio. Dr. Thies Pfeiffer is a computer scientist and technical director of the virtual reality laboratory at the Center of Excellence Cognitive Interaction Technology at Bielefeld University, Germany. He has been working in the field of eye tracking since 2003, starting in psycholinguistics and since then moving more and more towards human-computer interaction with eye gaze. One particular research interest is the analysis of, and interaction with, eye gaze in 3D environments.
Dr. Diako Mardanbegi is a research associate in the Computing and Communications Department at Lancaster University. He works on the MODEM project (Monitoring of Dementia using Eye Movements), which aims to advance knowledge of the relationship between eye movements and dementia. He received his PhD in Information Technology from the IT University of Copenhagen in 2013 under the supervision of Dan Witzner Hansen. In 2012 he worked at the University of São Paulo as a visiting researcher under the supervision of Carlos Morimoto.