TY - GEN
T1 - Robust gaze tracking method for stereoscopic virtual reality systems
AU - Lee, Eui Chul
AU - Park, Kang Ryoung
AU - Whang, Min Cheol
AU - Park, Junseok
PY - 2007
Y1 - 2007
AB - In this paper, we propose a new face and eye gaze tracking method in which gaze tracking devices are attached to stereoscopic shutter glasses. The proposed method has six advantages over previous work. First, when used with stereoscopic VR systems, it lets users feel more immersed and comfortable. Second, by capturing eye images reflected from a hot mirror, we increased eye gaze accuracy in the vertical direction. Third, by attaching an infrared-passing filter and using an IR illuminator, we achieved robust gaze tracking performance irrespective of environmental lighting conditions. Fourth, we used a simple 2D eye gaze estimation method based on the detected pupil center and a geometric transform. Fifth, to prevent gaze positions from being moved unintentionally by natural eye blinking, we discriminated between kinds of eye blinking by measuring pupil size; this information was also used for button clicking and mode toggling. Sixth, the final gaze position was calculated as the vector sum of the face and eye gaze positions, allowing natural face and eye movement. Experimental results showed that the face and eye gaze estimation error was less than one degree.
UR - http://www.scopus.com/inward/record.url?scp=38149030983&partnerID=8YFLogxK
DO - 10.1007/978-3-540-73110-8_76
M3 - Conference contribution
AN - SCOPUS:38149030983
SN - 9783540731085
T3 - Lecture Notes in Computer Science
SP - 700
EP - 709
BT - Human-Computer Interaction
PB - Springer Verlag
T2 - 12th International Conference on Human-Computer Interaction, HCI International 2007
Y2 - 22 July 2007 through 27 July 2007
ER -