Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/110474
Full metadata record
DC Field: Value
dc.contributor.author: Ruan, W.
dc.contributor.author: Sheng, Q.
dc.contributor.author: Yang, L.
dc.contributor.author: Gu, T.
dc.contributor.author: Xu, P.
dc.contributor.author: Shangguan, L.
dc.date.issued: 2016
dc.identifier.citation: UbiComp 2016: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 2016, pp. 474-485
dc.identifier.isbn: 9781450344616
dc.identifier.uri: http://hdl.handle.net/2440/110474
dc.description.abstract: Hand gesture is becoming an increasingly popular means of interacting with consumer electronic devices, such as mobile phones, tablets and laptops. In this paper, we present AudioGest, a device-free gesture recognition system that can accurately sense in-air hand movement around the user's devices. Compared to the state of the art, AudioGest uses only one pair of built-in speaker and microphone, without any extra hardware, infrastructure support, or training, to achieve fine-grained hand detection. Our system is able to accurately recognize various hand gestures and estimate the hand's in-air time, as well as its average moving speed and waving range. We achieve this by transforming the device into an active sonar system that transmits an inaudible audio signal and decodes the echoes of the hand at its microphone. We address various challenges, including cleaning the noisy reflected sound signal, interpreting the echo spectrogram into hand gestures, decoding the Doppler frequency shifts into hand waving speed and range, and remaining robust to environmental motion and signal drifting. We implement a proof-of-concept prototype on three different electronic devices and extensively evaluate the system in four real-world scenarios using 3,900 hand gestures collected by five users over more than two weeks. Our results show that AudioGest can detect six hand gestures with an accuracy of up to 96%, and, by distinguishing gesture attributes, it can provide up to 162 control commands for various applications.
dc.language.iso: en
dc.publisher: ACM
dc.rights: © 2016 ACM
dc.source.uri: http://dx.doi.org/10.1145/2971648.2971736
dc.subject: Hand gestures; audio; microphone; FFT; Doppler Effect
dc.title: AudioGest: enabling fine-grained hand gesture detection by decoding echo signal
dc.type: Conference paper
dc.contributor.conference: 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2016) (12 Sep 2016 - 16 Sep 2016 : Heidelberg)
dc.identifier.doi: 10.1145/2971648.2971736
dc.publisher.place: New York, USA
pubs.publication-status: Published
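
Note on the sensing principle described in the abstract: the device emits an inaudible pilot tone and the Doppler shift of the echo reflected off a moving hand encodes its speed. The sketch below is a minimal illustration of that relation only, not the authors' implementation; it assumes a co-located speaker and microphone, a 19 kHz pilot tone, a 48 kHz recording, and the approximation delta_f = 2 * v * f0 / c for the round-trip echo.

    import numpy as np

    C = 343.0      # speed of sound in air (m/s)
    F0 = 19000.0   # assumed inaudible pilot-tone frequency (Hz)
    FS = 48000.0   # assumed microphone sampling rate (Hz)

    def doppler_speed_per_frame(rx, frame_len=4096):
        """Estimate hand speed (m/s) in each frame from the Doppler shift of
        the echo component nearest the pilot tone. rx is the recorded mic signal."""
        speeds = []
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / FS)
        window = np.hanning(frame_len)
        for start in range(0, len(rx) - frame_len + 1, frame_len):
            spectrum = np.abs(np.fft.rfft(rx[start:start + frame_len] * window))
            # Search within +/-500 Hz of the tone, skipping a small guard band
            # around F0 itself; in a real system the static direct-path tone
            # dominates and must be suppressed first (the denoising step the
            # abstract mentions addresses this).
            band = (np.abs(freqs - F0) < 500) & (np.abs(freqs - F0) > 20)
            peak_freq = freqs[band][np.argmax(spectrum[band])]
            delta_f = peak_freq - F0
            # Echo from a hand moving at speed v toward a co-located
            # speaker/mic is shifted by roughly delta_f = 2 * v * F0 / C.
            speeds.append(delta_f * C / (2.0 * F0))
        return np.array(speeds)

The sign of each estimate indicates whether the hand moves toward (positive) or away from (negative) the device; integrating the per-frame speeds over the hand's in-air time gives a rough waving range, in the spirit of the attributes the abstract describes.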
Appears in Collections:Aurora harvest 8
Computer Science publications

Files in This Item:
File: RA_hdl_110474.pdf
Description: Restricted Access
Size: 2.53 MB
Format: Adobe PDF

