Generalizing Gaze Estimation with Outlier-guided Collaborative Adaptation

Abstract

Deep neural networks have significantly improved the accuracy of appearance-based gaze estimation. However, trained models still perform poorly when generalized to new domains, e.g., unseen environments or persons. In this paper, we propose a plug-and-play gaze adaptation framework (PnP-GA), an ensemble of networks that learn collaboratively under the guidance of outliers. Since the framework does not require ground-truth labels in the target domain, existing gaze estimation networks can be plugged directly into PnP-GA to generalize them to new domains. We evaluate PnP-GA on four gaze domain adaptation tasks: ETH-to-MPII, ETH-to-EyeDiap, Gaze360-to-MPII, and Gaze360-to-EyeDiap. The experimental results demonstrate that the PnP-GA framework achieves considerable performance improvements of 36.9%, 31.6%, 19.4%, and 11.8% over the baseline system on these tasks, respectively. The proposed framework also outperforms state-of-the-art domain adaptation approaches on gaze domain adaptation tasks.
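
To make the plug-and-play idea concrete, below is a minimal PyTorch sketch of one unsupervised adaptation step on unlabeled target-domain images: an ensemble of gaze networks predicts jointly, and predictions that deviate strongly from the group consensus (outliers) are down-weighted in a collaborative consistency loss. The placeholder network, the softmax-based outlier weighting, and the loss here are illustrative assumptions for exposition, not the exact components or loss used in PnP-GA.

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    """Placeholder gaze estimator: any pretrained network mapping face/eye
    images to a 2D gaze direction (pitch, yaw) could be plugged in instead."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, 2),
        )

    def forward(self, x):
        return self.backbone(x)

def outlier_guided_step(members, images, optimizer):
    """One adaptation step on unlabeled target-domain images.

    Each ensemble member predicts gaze; predictions far from the group
    mean are treated as outliers and down-weighted in a consistency loss
    that pulls members toward the collaborative consensus. This weighting
    scheme is an assumption for illustration."""
    preds = torch.stack([m(images) for m in members])   # (M, B, 2)
    consensus = preds.mean(dim=0, keepdim=True)         # (1, B, 2)
    deviation = (preds - consensus).norm(dim=-1)        # (M, B)
    # Soft outlier weights: larger deviation -> smaller weight.
    weights = torch.softmax(-deviation, dim=0).detach()
    loss = (weights * deviation.pow(2)).sum(dim=0).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    members = [GazeNet() for _ in range(3)]             # plugged-in ensemble
    params = [p for m in members for p in m.parameters()]
    optimizer = torch.optim.Adam(params, lr=1e-4)
    target_batch = torch.randn(8, 3, 64, 64)            # unlabeled target images
    print(outlier_guided_step(members, target_batch, optimizer))
```

In practice the ensemble members would be copies or variants of a source-pretrained gaze network, so adaptation requires only unlabeled target images, consistent with the label-free setting described above.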

Publication
IEEE/CVF International Conference on Computer Vision