An evaluation framework for gaze-based HCI systems
Eye-gaze estimation under unconstrained conditions has emerged as a prominent field of research over the last two decades. A wide variety of algorithms and setup configurations have been developed, with the latest relying on passive, video-based eye tracking techniques. In these, an eye tracker is placed at some distance from the user, and gaze is estimated by capturing and processing images of the full face or eye region, either in natural light or under active illumination from near-infrared (NIR) LEDs. Two commonly used families of video-based gaze estimation algorithms are Pupil Center Corneal Reflection (PCCR) methods and appearance-based methods. The former compute the vector between the reflections of the NIR LEDs on the cornea and the pupil center, which is then tracked and mapped geometrically to the user's gaze coordinates on a screen (Fig. 1a). Appearance-based methods use the shape and texture information of the eye region to find the point of gaze (Fig. 1b).
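The geometric mapping used in PCCR methods is often realized as a regression from the pupil-glint vector to screen coordinates, fitted during a calibration phase. The sketch below illustrates one common variant, a second-order polynomial mapping fitted by least squares; the feature set and function names are illustrative assumptions, not a specific published implementation.

```python
import numpy as np

def poly_features(v):
    # Second-order polynomial features of the pupil-glint vector (vx, vy).
    vx, vy = v[..., 0], v[..., 1]
    return np.stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2], axis=-1)

def calibrate(vectors, targets):
    # vectors: (N, 2) pupil-center-minus-glint vectors recorded while the user
    #          fixates known calibration points
    # targets: (N, 2) on-screen coordinates (pixels) of those calibration points
    A = poly_features(vectors)
    # Least-squares fit of one polynomial per screen axis.
    coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coeffs  # shape (6, 2)

def estimate_gaze(vector, coeffs):
    # Map a new pupil-glint vector to an estimated screen position.
    return poly_features(np.asarray(vector)) @ coeffs
```

In practice the polynomial order and feature set are design choices; higher orders can model more screen curvature and camera distortion but need more calibration points to fit reliably.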
Eye-gaze information has found a wide range of applicability as an input modality for several human-computer interaction (HCI) platforms. Gaze estimation methods in these widely different applications operate over a broad range of environmental conditions, and as a result their performance is affected by several error sources, some common across platforms and others unique to each.
A central problem in the domain of eye-gaze research is that there are currently no unified, standard schemes for evaluating the performance of eye-gaze estimation (EGE) algorithms or setups across platforms. Only a few studies report the impact of system meta-parameters, and there is no agreement on the metrics used to define gaze accuracy. Consequently, there is no concrete way to state with certainty whether a recent design would perform better than conventional ones, or under what conditions particular research claims would be valid.
We are addressing the issue of standardization in gaze-based HCI systems by investigating the diverse nature of present-day EGE research and its outcomes. Contemporary gaze tracking methods and their applications in modern consumer devices are studied in order to assess gaze estimation performance parameters on different platforms. Ultimately, we are working towards an evaluation framework that can be used to estimate performance, define design criteria, and benchmark quality measures of gaze-based systems for the EGE research and user community.
Eye tracking experiments are conducted using two high-resolution eye trackers with angular accuracies of 0.5 degrees. Eye tracking data are collected from users who are asked to gaze at fixed or moving target points on the screen, from which the gaze error is estimated in degrees. We are currently designing more detailed experiments to gain insight into accuracy measures for different platforms, users, and tasks.
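Converting an on-screen gaze error from pixels to degrees requires the screen's pixel pitch and the viewing distance. A minimal sketch of that conversion follows; it assumes the gaze ray is roughly perpendicular to the screen at the target, a simplification that is common for desktop setups but ignores off-axis geometry.

```python
import numpy as np

def angular_error_deg(gaze_px, target_px, px_per_mm, viewing_distance_mm):
    # On-screen offset between the estimated gaze point and the true target,
    # converted from pixels to millimetres.
    offset_mm = np.linalg.norm(np.asarray(gaze_px, dtype=float)
                               - np.asarray(target_px, dtype=float)) / px_per_mm
    # Angle subtended at the eye by that offset, in degrees.
    return np.degrees(np.arctan2(offset_mm, viewing_distance_mm))
```

For example, at a 600 mm viewing distance an on-screen offset of about 10.5 mm corresponds to roughly 1 degree of visual angle, which puts a tracker's quoted 0.5-degree accuracy into perspective as an on-screen tolerance.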
[1] P. M. Corcoran, F. Nanu, S. Petrescu, and P. Bigioi, "Real-time eye gaze tracking for gaming design and consumer electronics systems," IEEE Trans. Consum. Electron., vol. 58, no. 2, pp. 347-355, 2012.
[2] D. W. Hansen and Q. Ji, "In the eye of the beholder: A survey of models for eyes and gaze," IEEE Trans. Pattern Anal. Mach. Intell., vol. 32, no. 3, pp. 478-500, 2010.
[3] R. I. Hammoud (Ed.), Passive Eye Monitoring: Algorithms, Applications and Experiments, Springer, ISBN 9783540754114, pp. 373-386, 2008.
[4] A. Kar and P. Corcoran, "Towards the development of a standardized performance evaluation framework for eye gaze estimation systems in consumer platforms," accepted for 2016 IEEE International Conference on Systems, Man, and Cybernetics, Oct. 9-12, Budapest.
[5] A. Kar, S. Bazrafkan, C. Costache, and P. Corcoran, "Eye-gaze systems: An analysis of error sources and potential accuracy in consumer electronics use cases," 2016 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, 2016, pp. 319-320.
[6] S. Bazrafkan, A. Kar, and C. Costache, "Eye gaze for consumer electronics: Controlling and commanding intelligent systems," IEEE Consumer Electronics Magazine, vol. 4, no. 4, pp. 65-71, Oct. 2015.