

Poster in Workshop: Gaze meets ML

Appearance-Based Gaze Estimation for Driver Monitoring

Soodeh Nikan · Devesh Upadhyay

Keywords: [ Driver state monitoring ] [ Gaze angle ] [ CNN-based gaze estimation ] [ L3 autonomy ] [ Takeover request ] [ Driver inattention ]


Abstract:

Driver inattention is a leading cause of road accidents through its impact on reaction time when incidents occur. In Level-3 (L3) vehicles, inattention adversely affects the quality of driver takeover and therefore the safe operation of the vehicle. A driver's visual attention is highly correlated with eye movement, and gaze angle is an excellent surrogate for assessing driver attention zones, both in the cabin interior and in on-road scenarios. We propose appearance-based gaze estimation approaches that use convolutional neural networks (CNNs) to estimate gaze angle directly from eye images as well as from eye landmark coordinates. The goal is to improve learning by utilizing synthetic data with more accurate annotations. Performance analysis shows that our proposed landmark-based model, trained on synthetic data, predicts gaze angle on real data with a reasonable angular error. In addition, we discuss how evaluation metrics are application specific: measuring the driver's gaze direction in L3 autonomy requires a more reliable assessment metric than the common mean angular error, so that a control takeover request can be issued at the proper time, corresponding to the driver's attention focus, and without ambiguity.
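As context for the abstract, the sketch below illustrates the general shape of a landmark-based gaze regressor and the mean angular error metric the authors refer to. It is a minimal, hypothetical example, not the authors' model: the number of eye landmarks, layer sizes, and the (yaw, pitch) output convention are all assumptions made for illustration.

```python
# Minimal sketch (assumptions: 34 eye landmarks, (yaw, pitch) output in radians).
import torch
import torch.nn as nn


class LandmarkGazeNet(nn.Module):
    """Maps 2D eye-landmark coordinates to a (yaw, pitch) gaze angle."""

    def __init__(self, num_landmarks: int = 34):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_landmarks * 2, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2),  # yaw, pitch
        )

    def forward(self, landmarks: torch.Tensor) -> torch.Tensor:
        # landmarks: (batch, num_landmarks, 2) -> flatten before the MLP
        return self.mlp(landmarks.flatten(start_dim=1))


def angles_to_vector(yaw_pitch: torch.Tensor) -> torch.Tensor:
    """Convert (yaw, pitch) angles to 3D unit gaze vectors."""
    yaw, pitch = yaw_pitch[:, 0], yaw_pitch[:, 1]
    x = torch.cos(pitch) * torch.sin(yaw)
    y = torch.sin(pitch)
    z = torch.cos(pitch) * torch.cos(yaw)
    return torch.stack([x, y, z], dim=1)


def mean_angular_error_deg(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Mean angle (degrees) between predicted and ground-truth gaze vectors."""
    cos = (angles_to_vector(pred) * angles_to_vector(target)).sum(dim=1)
    return torch.rad2deg(torch.acos(cos.clamp(-1.0, 1.0))).mean()


# Example usage with random stand-in data
model = LandmarkGazeNet()
landmarks = torch.randn(8, 34, 2)   # batch of 8 synthetic landmark sets
target = torch.zeros(8, 2)          # dummy ground-truth angles
print(mean_angular_error_deg(model(landmarks), target))
```

Mean angular error treats all gaze directions uniformly; the abstract's point is that for takeover-request timing, an attention-zone-aware metric may be more informative than this scalar average.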
