Eye Tracking Sensors: Past, Present, Future, and Their Applications
Abstract. The availability of eye tracking sensors is set to explode, with billions of units available in future Virtual Reality (VR) and Augmented Reality (AR) platforms. In my talk I will discuss the past and present status of eye tracking sensors, along with my vision for future development. I will also discuss applications that necessitate the presence of such sensors in VR/AR devices, along with applications that have the power to benefit society on a large scale when VR/AR solutions are widely adopted.
Bio. Dr. Komogortsev is currently a tenured Associate Professor at Texas State University. Dr. Komogortsev received his B.S. in Applied Mathematics from Volgograd State University, Russia, and his M.S. and Ph.D. degrees in Computer Science from Kent State University, Ohio. He has previously worked at institutions including Johns Hopkins University, the University of Notre Dame, and Michigan State University. Dr. Komogortsev conducts research in eye tracking with a focus on cyber security (biometrics), human–computer interaction, usability, bioengineering, and health assessment. This work has thus far yielded more than 100 peer-reviewed publications and several patents. Dr. Komogortsev’s research has been covered by national media outlets including NBC News, Discovery, Yahoo, LiveScience, and others. Dr. Komogortsev is a recipient of two Google Virtual Reality Research Awards and a Google Faculty Research Award. He has also won a National Science Foundation CAREER Award and a Presidential Early Career Award for Scientists and Engineers (PECASE) on the topic of cybersecurity, with an emphasis on eye movement-driven biometrics and health assessment. In addition, his research is supported by the National Institute of Standards and Technology, Sigma Xi, The Scientific Research Society, and various industrial sources. Dr. Komogortsev’s current grand vision is to push forward eye tracking solutions in future virtual and augmented reality platforms as enablers of more immersive experiences, security, and assessment of human state.
From Gazing to Perceiving
Abstract. Eye tracking technology is based on the assumption that our perception follows the fovea – a tiny region of the retina responsible for sharp central vision. In fact, what we usually refer to as the line of sight is nothing but the imaginary line connecting the fovea to the gazed location. However, our visual perception is far more complex than that: gazing is not perceiving. As a tangible example, consider our peripheral retinal vision: although we cannot distinguish details in this region, movements are nonetheless perceptible. In this talk, I will go beyond the line-of-sight simplification by a) exploring the requirements needed to shift our paradigm from foveal to retina-aware eye tracking, and b) discussing novel ways to employ this new paradigm to further our understanding of human perception.
Bio. Enkelejda Kasneci is an Associate Professor of Computer Science at the University of Tübingen, Germany, where she leads the Perception Engineering Group. As a BOSCH scholar, she received her M.Sc. degree in Computer Science from the University of Stuttgart in 2007. In 2013, she received her PhD in Computer Science from the University of Tübingen, Germany. For her PhD research, she was awarded the research prize of the Federation Südwestmetall in 2014. From 2013 to 2015, she was a Margarete-von-Wrangell Fellow. Dr. Kasneci’s overarching and long-term vision aims at computing systems that sense and infer the user’s cognitive state, actions, and intentions based on eye movements. These systems set out to provide information for assistive technologies applicable to many activities of everyday life. Towards this vision, her research combines eye tracking technology with machine learning in various multidisciplinary projects that are supported by national scientific societies as well as various industrial sources. In addition, she serves as an academic editor for PLOS ONE and as a reviewer and PC member for several journals and major conferences.