Tangible UI by object and material classification with radar

Date

2017

Authors

Yeo, H.S.
Ens, B.
Quigley, A.

Type

Conference paper

Citation

SIGGRAPH Asia 2017 Emerging Technologies, SA 2017, 2017, Article 14, pp. 1-2

Conference Name

SIGGRAPH Asia 2017 Emerging Technologies, SA 2017 (27 Nov 2017 - 30 Nov 2017 : Bangkok, Thailand)

Abstract

Radar signals penetrate, scatter, absorb and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system that combines a monostatic radar (Google Soli) with supervised machine learning to support object- and material-classification-based UIs. Building on RadarCat techniques, we explore the development of tangible user interfaces without modification of the objects or complex infrastructure. This affords new forms of interaction with digital devices, proximate objects and micro-gestures.
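
As a rough illustration of the classification stage described above (not the authors' implementation: the abstract does not specify the classifier, and the feature dimensions, material labels, and random-forest choice below are assumptions), a supervised material classifier over per-frame radar feature vectors might look like the following Python sketch, with synthetic placeholder data standing in for real Soli features.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Hypothetical setup: each radar frame is summarized as a fixed-length
    # feature vector (e.g., per-channel amplitude statistics). Real Soli
    # features and the RadarCat pipeline differ in detail.
    N_FRAMES, N_FEATURES = 1000, 64
    MATERIALS = ["wood", "glass", "metal", "plastic"]  # assumed label set

    rng = np.random.default_rng(0)
    X = rng.normal(size=(N_FRAMES, N_FEATURES))      # placeholder features
    y = rng.integers(len(MATERIALS), size=N_FRAMES)  # placeholder labels

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Train a random forest on labelled frames, then classify held-out frames.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    pred = clf.predict(X_test)
    print("held-out accuracy:", accuracy_score(y_test, pred))
    print("frame 0 classified as:", MATERIALS[clf.predict(X_test[:1])[0]])

With real labelled radar recordings in place of the synthetic arrays, per-frame predictions like these could drive the object- and material-dependent UI behaviours the abstract describes.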

Rights

Copyright 2017 held by the owner/author(s).
