Estimating Gaze Depth Using Multi-Layer Perceptron

Date

2017

Authors

Lee, Y.
Shin, C.
Plopski, A.
Itoh, Y.
Piumsomboon, T.
Dey, A.
Lee, G.
Kim, S.
Billinghurst, M.

Type

Conference paper

Citation

Proceedings of the 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR 2017), 2017, art. no. 7988648, pp. 26-29

Conference Name

2017 International Symposium on Ubiquitous Virtual Reality (ISUVR) (27 Jun 2017 - 29 Jun 2017 : Nara, Japan)

Abstract

In this paper we describe a new method for determining gaze depth with a head-mounted eye-tracker. Eye-trackers are increasingly being incorporated into head-mounted displays (HMDs), and eye gaze is being used for interaction in Virtual and Augmented Reality. For some interaction methods it is important to accurately measure not only the x- and y-direction of the eye gaze but especially its focal depth. Eye-tracking technology generally achieves high accuracy in the x- and y-directions, but not in depth. We used a binocular gaze tracker with two eye cameras, and fed the gaze vectors into a multi-layer perceptron (MLP) neural network for training and estimation. For the performance evaluation, data was collected from 13 people gazing at fixed points at distances from 1 m to 5 m. Classifying gaze into these fixed distances produced an average classification error of nearly 10% and an average distance error of 0.42 m. This is sufficient for some Augmented Reality applications, but more research is needed to estimate a user's gaze as it moves in continuous space.
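The approach described in the abstract can be illustrated with a small sketch: an MLP that maps a binocular gaze feature vector to one of five fixed depth classes (1 m to 5 m). This is not the authors' implementation; the architecture, hyperparameters, and the synthetic vergence-like training data below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over each row.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class GazeDepthMLP:
    """Tiny one-hidden-layer MLP classifying gaze depth into 5 bins (1-5 m)."""

    def __init__(self, n_in=6, n_hidden=16, n_out=5, lr=0.1):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)
        return softmax(self.h @ self.W2 + self.b2)

    def train_step(self, X, y):
        # One batch gradient-descent step on cross-entropy loss.
        p = self.forward(X)
        n = X.shape[0]
        d2 = p.copy()
        d2[np.arange(n), y] -= 1.0          # softmax + cross-entropy gradient
        d2 /= n
        dW2 = self.h.T @ d2
        db2 = d2.sum(axis=0)
        dh = (d2 @ self.W2.T) * (1.0 - self.h ** 2)  # tanh derivative
        dW1 = X.T @ dh
        db1 = dh.sum(axis=0)
        self.W2 -= self.lr * dW2; self.b2 -= self.lr * db2
        self.W1 -= self.lr * dW1; self.b1 -= self.lr * db1

    def predict(self, X):
        return self.forward(X).argmax(axis=1)

# Synthetic stand-in data: a vergence-like feature that shrinks as gaze
# depth grows (class 0..4 corresponds to 1..5 m), plus noise dimensions.
depths = rng.integers(0, 5, 600)
vergence = 1.0 / (depths + 1) + rng.normal(0.0, 0.02, 600)
X = np.column_stack([vergence, -vergence, rng.normal(0.0, 0.05, (600, 4))])

model = GazeDepthMLP()
for _ in range(3000):
    model.train_step(X, depths)

acc = (model.predict(X) == depths).mean()
est_distance_m = model.predict(X) + 1  # map class index back to metres
```

On this synthetic data, nearby depth classes (e.g. 4 m vs. 5 m) have the smallest vergence gap, which mirrors why eye-trackers struggle with depth accuracy at larger distances.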

Rights

Copyright 2017 IEEE
