Oculus | Eye tracking based on light polarization
Publication Number: 10108261
Publication Date: 2018-10-23
Applicants: Oculus VR, LLC
A head-mounted display (HMD) comprises an eye tracking system configured to enable eye tracking using polarization. The eye tracking system includes one or more illumination sources and an eye tracking unit comprising a polarization-sensitive optical detector. The one or more illumination sources are configured to illuminate an eye and generate reflections directed toward the optical detector. The eye tracking unit is configured to determine a 3D shape of the eye, including a depth map of the corneal surface, based on the polarization of the reflections. One or more model parameter values are extracted from the determined depth map and used to update a stored model of the eye. The eye tracking system determines eye tracking information based on the updated model in order to improve eye tracking performance.
The present disclosure generally relates to eye tracking, and specifically to eye tracking based on light polarization.
Eye tracking is an important feature for head-mounted display (HMD) systems, including systems used in virtual reality (VR) applications. Conventional eye tracking systems track features of the human eye and are typically limited by the quality of the optical path. These conventional systems do not provide the accuracy needed for eye tracking in an HMD system.
A head-mounted display (HMD) comprises an eye tracking system that tracks the eye(s) of a user of the HMD based in part on the polarization of light reflected from the eye(s). The eye tracking system can be used in a virtual reality (VR) system environment or in other system environments, such as an augmented reality (AR) system. The eye tracking system includes one or more illumination sources configured to illuminate a user’s eye and an optical detector configured to capture polarized light reflecting from the user’s cornea. The optical detector and the one or more illumination sources are positioned relative to each other such that the optical detector is able to capture light emitted by the one or more illumination sources and reflected from the user’s eye (hereinafter referred to as “eye reflections”).

In an embodiment, the optical detector comprises an array of polarization-sensitive pixels and is able to determine the polarization state (e.g., polarization angle) of the eye reflections. The polarization state determined at each pixel of the optical detector is used to determine a 3D shape of the eye, which may be represented as a depth map of the corneal surface. In various embodiments, the system additionally determines a new 3D shape of the eye as the user moves their eyes (e.g., as the user gazes at different objects on a display during normal operation).

The system retrieves a stored model of the eye comprising two or more spheres, wherein the spheres represent the overall eye and the cornea. In various embodiments, the retrieved model is parameterized by a radius and an origin for each of the spheres comprising the model. In an embodiment, the system extracts one or more model parameter values from the generated depth map. Responsive to determining a difference between at least one of the extracted model parameter values and the corresponding parameter value of the retrieved model, the parameters of the retrieved model are updated.
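The model-update step described above can be sketched in code. This is a minimal illustration under stated assumptions, not the patented method: it assumes the depth map of the corneal surface has already been back-projected into a 3D point cloud, fits a single corneal sphere to those points by linear least squares, and updates the stored radius and origin when the fitted values differ from the stored ones. The function names (`fit_sphere`, `update_model`) and the tolerance value are hypothetical.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit: returns (center, radius).

    points: (N, 3) array of 3D surface points, e.g. a depth map of the
    corneal surface back-projected into camera coordinates (assumed input).
    Solves |p|^2 = 2 c·p + (r^2 - |c|^2) as a linear system in (c, k).
    """
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def update_model(stored, points, tol=1e-4):
    """Update the stored corneal-sphere parameters when the freshly
    fitted radius/origin differ from the stored values by more than tol.

    stored: dict with 'center' (3-vector) and 'radius' (scalar),
    representing the corneal sphere of the retrieved eye model.
    """
    center, radius = fit_sphere(points)
    if (abs(radius - stored['radius']) > tol
            or np.linalg.norm(center - stored['center']) > tol):
        stored = {'center': center, 'radius': radius}
    return stored
```

A full implementation would fit both spheres of the model (eyeball and cornea) and would typically use a robust fit, since eye reflections only sample a spherical cap of the cornea.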
The system uses the eye model to determine various types of eye tracking information, such as a user’s gaze direction, vergence angle/depth, and accommodation depth.
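As an illustration of how eye tracking information can follow from the two-sphere model, the sketch below approximates the optical axis as the ray from the eyeball-sphere origin through the corneal-sphere origin, and the vergence angle as the angle between the two eyes' axes. This is assumed geometry for illustration, not the patent's algorithm; the function names are hypothetical.

```python
import numpy as np

def gaze_direction(eye_center, cornea_center):
    """Unit vector along the optical axis, approximated as the ray from
    the eyeball-sphere origin through the corneal-sphere origin."""
    d = np.asarray(cornea_center, float) - np.asarray(eye_center, float)
    return d / np.linalg.norm(d)

def vergence_angle(gaze_left, gaze_right):
    """Angle (radians) between the two eyes' gaze directions."""
    c = np.clip(np.dot(gaze_left, gaze_right), -1.0, 1.0)
    return np.arccos(c)
```

Given per-eye models updated as described above, vergence depth could then be estimated by intersecting (or finding the closest approach of) the two gaze rays.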