Free-hand gesture interfaces for an augmented exhibition podium

Date

2015

Authors

Bai, H.
Lee, G.
Billinghurst, M.

Editors

Carter, M.

Type

Conference paper

Citation

OzCHI 2015: Being Human, Conference Proceedings, 2015 / Carter, M. (ed.), pp. 182-186

Conference Name

OzCHI '15: The Annual Meeting of the Australian Special Interest Group for Computer Human Interaction (7 Dec 2015 - 10 Dec 2015 : Melbourne, Australia)

Abstract

In this paper we present an augmented exhibition podium that supports natural free-hand 3D interaction for visitors using their own mobile phones or Smart Glasses. Visitors can point the camera of their mobile phones or Smart Glasses at the podium to see Augmented Reality (AR) content overlaid on a physical exhibit, and can also use free-hand gestures to interact with the AR content. For instance, they can use pinching gestures to select different parts of the exhibit with their fingers to view augmented text descriptions, instead of touching the mobile phone screen. The prototype combines vision-based image tracking and free-hand gesture detection via a depth camera in a client-server framework, which enables users to interact with the augmented exhibition using their hands without requiring special hardware (e.g. a depth sensor) on their personal devices. Results from our pilot user study show that the prototype system is as intuitive to use as a traditional touch-based interface, and provides a more fun and engaging experience.
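The paper does not include code; as a minimal sketch of the split the abstract describes (server-side pinch detection over depth-camera data, with selection events serialised and pushed to the mobile client), here is an illustrative example. All names, coordinates, and the pinch threshold are assumptions for illustration, not values from the paper.

```python
import json
import math

# Assumed threshold (metres): a thumb-index fingertip distance below this
# counts as a pinch. The paper does not specify its detection parameters.
PINCH_THRESHOLD_M = 0.03

def detect_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Return True when the thumb and index fingertips are close enough
    together (in the depth camera's 3D frame) to count as a pinch."""
    return math.dist(thumb_tip, index_tip) < threshold

def make_selection_event(exhibit_part_id, pinch_point):
    """Serialise a selection event that the server could push to a mobile
    client, which would then overlay the matching text description."""
    return json.dumps({
        "type": "select",
        "part": exhibit_part_id,   # hypothetical exhibit-part identifier
        "point": list(pinch_point),
    })

# Example: fingertips 2 cm apart -> pinch detected, event built for the client.
thumb = (0.10, 0.20, 0.50)
index = (0.10, 0.22, 0.50)
if detect_pinch(thumb, index):
    event = make_selection_event("engine_block", thumb)
```

In the actual prototype the depth sensor sits at the podium rather than on the visitor's device, so a server-side check like this is what lets unmodified phones and Smart Glasses participate.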

Rights

Copyright 2015 ACM
