RemoteFusion: real time depth camera fusion for remote collaboration on physical tasks

Date

2013

Authors

Adcock, M.
Anderson, S.
Thomas, B.

Type

Conference paper

Citation

VRCAI '13: Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, 2013, pp. 235-242

Conference Name

12th ACM International Conference on Virtual Reality Continuum and Its Applications in Industry (17 Nov 2013 - 19 Nov 2013 : Hong Kong)

Abstract

Remote guidance systems allow humans to collaborate on physical tasks across large distances and have applications in fields such as medicine, maintenance, and working with hazardous substances. Existing systems typically provide two-dimensional video streams to remote participants, with viewpoints restricted to the locations of the physical cameras. Recent systems have added the ability for a remote expert to annotate their 2D view and for these annotations to be displayed in the physical workspace to the local worker. We present a prototype remote guidance system, called RemoteFusion, which is based on the volumetric fusion of commodity depth cameras. The system incorporates real-time 3D fusion with color, the ability to distinguish and render dynamic elements of a scene, whether human or non-human, a multi-touch driven free 3D viewpoint, and a Spatial Augmented Reality (SAR) light annotation mechanism. We provide a physical overview of the system, including the hardware and software configuration, and detail the implementation of each of the key features.
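The abstract does not detail RemoteFusion's fusion algorithm, so as background, the sketch below shows the truncated signed distance function (TSDF) update that is the standard technique for volumetric fusion of depth-camera frames. All names, the truncation distance, and the single-ray framing are illustrative assumptions, not the system's actual code.

```python
import numpy as np

# Illustrative TSDF fusion sketch (assumed, not RemoteFusion's implementation).
TRUNC = 0.05  # truncation band around the surface, in metres (assumed value)

def integrate(tsdf, weight, voxel_depth, measured_depth):
    """Fuse one depth measurement into the TSDF values of voxels on a ray.

    tsdf, weight   -- running weighted averages, one entry per voxel
    voxel_depth    -- distance of each voxel centre from the camera
    measured_depth -- depth the sensor reports for this ray
    """
    sdf = measured_depth - voxel_depth      # signed distance to the surface
    valid = sdf > -TRUNC                    # ignore voxels far behind it
    d = np.clip(sdf / TRUNC, -1.0, 1.0)     # truncate and normalise
    # Weighted running average: each new frame refines the estimate,
    # so per-frame sensor noise averages out over time.
    tsdf[valid] = (tsdf[valid] * weight[valid] + d[valid]) / (weight[valid] + 1.0)
    weight[valid] += 1.0
    return tsdf, weight

# Five voxels along one camera ray; the sensor sees a surface at 1.0 m.
voxel_depth = np.array([0.90, 0.97, 1.00, 1.03, 1.10])
tsdf, weight = integrate(np.zeros(5), np.zeros(5), voxel_depth, 1.0)
# The fused TSDF crosses zero at the voxel nearest the observed surface.
```

The zero crossing of the fused field marks the surface, which is how a mesh or rendered view is later extracted; fusing frames from multiple commodity depth cameras into one such volume is what gives a remote participant a free 3D viewpoint independent of any physical camera placement.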

Rights

Copyright 2013 ACM
