EyemR-Talk: Using Speech to Visualise Shared MR Gaze Cues

Date

2021

Authors

Jing, A.
Matthews, B.
May, K.
Clarke, T.
Lee, G.
Billinghurst, M.

Editors

Shiota, S.J.
Kimura, A.
Ma, W.-C.A.

Type

Conference paper

Citation

Proceedings of SIGGRAPH Asia 2021 Posters (SA '21), 2021 / Shiota, S.J., Kimura, A., Ma, W.-C.A. (eds.), pp. 1-2

Conference Name

SA '21: SIGGRAPH Asia 2021 (14 Dec 2021 - 17 Dec 2021 : Tokyo, Japan)

Abstract

In this poster we present eyemR-Talk, a Mixed Reality (MR) collaboration system that uses speech input to trigger shared gaze visualisations between remote users. The system uses 360° panoramic video to support collaboration between a local user viewing the real world in Augmented Reality (AR) and a remote collaborator in Virtual Reality (VR). By using specific speech phrases to turn on virtual gaze visualisations, the system enables contextual speech-gaze interaction between collaborators. The overall goal is more natural gaze awareness, leading to clearer communication and more effective collaboration.

Description

Data source: Supplementary material, https://doi.org/10.1145/3476124.3488618

Rights

Copyright 2021 held by the owner/author(s).
