
Content-aware Eye Tracking for Autostereoscopic 3D Display

Journal
Sensors
Date
2020.08.25
Abstract

This study develops an eye tracking method for autostereoscopic three-dimensional (3D) display systems intended for use in various environments. An eye-tracking-based autostereoscopic 3D display provides a seamless, low-crosstalk, high-resolution 3D viewing experience without 3D glasses by overcoming the viewing-position restriction. However, accurate and fast eye position detection and tracking remain challenging owing to varying lighting conditions, camera control, thick eyeglasses, sunlight reflections on eyeglasses, and limited system resources. This study presents a robust, automated algorithm and the associated system for accurate, fast detection and tracking of eye pupil centers in 3D with a single visual camera and near-infrared (NIR) light-emitting diodes (LEDs). The proposed eye tracker consists of eye-nose detection, eye-nose shape keypoint alignment, a tracker-checker, and tracking with NIR LED on/off control. Eye-nose detection generates facial sub-region boxes containing the eyes and nose, using an Error-Based Learning (EBL) method to select the best-learned database (DB). After detection, eye-nose shape alignment is performed with the Supervised Descent Method (SDM) using Scale-Invariant Feature Transform (SIFT) features. The aligner is content-aware in the sense that a designated aligner is applied according to the classified image content, such as the lighting condition and whether the user wears eyeglasses. Experiments on real image DBs yield promising eye detection and tracking results even under challenging conditions.
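For readers who want a concrete picture of the pipeline structure described in the abstract (content classification, eye-nose detection, content-specific keypoint alignment, and 3D pupil output with NIR LED control), the following Python sketch mirrors those stages. All function bodies, thresholds, and numeric values are illustrative placeholders and assumptions, not the authors' implementation; the EBL-trained detector and the SDM/SIFT aligner are stubbed out.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

import numpy as np


@dataclass
class EyePositions:
    """3D pupil-center estimates (left, right) in camera coordinates (mm)."""
    left: Tuple[float, float, float]
    right: Tuple[float, float, float]


def classify_content(frame: np.ndarray, nir_led_on: bool) -> str:
    """Hypothetical content classifier: buckets the frame by lighting condition
    so that a condition-specific (content-aware) aligner can be selected."""
    brightness = float(frame.mean())
    if nir_led_on and brightness < 60:
        return "dark_nir"
    if brightness < 60:
        return "dark"
    return "bright"


def detect_eye_nose_box(frame: np.ndarray) -> Optional[Tuple[int, int, int, int]]:
    """Placeholder for the eye-nose detector (EBL-trained in the paper).
    Returns an (x, y, w, h) sub-region box, or None if nothing face-like is found."""
    h, w = frame.shape[:2]
    # Stub: assume a centered region of interest; a real detector replaces this.
    return (w // 4, h // 4, w // 2, h // 2)


def align_keypoints(frame: np.ndarray, box, aligner_id: str) -> np.ndarray:
    """Placeholder for SDM + SIFT shape alignment using the content-specific
    aligner selected by `aligner_id`. Returns N x 2 keypoint coordinates."""
    x, y, w, h = box
    # Stub: two pupil guesses at plausible offsets inside the box.
    return np.array([[x + 0.3 * w, y + 0.35 * h],
                     [x + 0.7 * w, y + 0.35 * h]])


def track_frame(frame: np.ndarray, nir_led_on: bool) -> Optional[EyePositions]:
    """One iteration of the sketched pipeline: classify, detect, align, then
    lift 2D pupil centers to 3D (stubbed with an assumed viewing distance)."""
    aligner_id = classify_content(frame, nir_led_on)
    box = detect_eye_nose_box(frame)
    if box is None:
        return None  # a tracker-checker would trigger re-detection here
    pupils_2d = align_keypoints(frame, box, aligner_id)
    z = 600.0  # assumed viewing distance in mm; the real system estimates depth
    (lx, ly), (rx, ry) = pupils_2d
    return EyePositions(left=(lx, ly, z), right=(rx, ry, z))


if __name__ == "__main__":
    dummy = np.full((480, 640), 40, dtype=np.uint8)  # dark synthetic frame
    print(track_frame(dummy, nir_led_on=True))
```

This is only a structural sketch; in the actual system each stub would be replaced by the trained detector, the per-condition SDM aligners, and a depth estimation step.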

Reference
Kang, D.; Heo, J. Content-Aware Eye Tracking for Autostereoscopic 3D Display. Sensors 2020, 20, 4787.
DOI
https://doi.org/10.3390/s20174787