Push the Limit of Acoustic Gesture Recognition

Yanwen WANG, Jiaxing SHEN, Yuanqing ZHENG

Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › Research › peer-review

76 Citations (Scopus)

Abstract

With the proliferation of smart devices and their applications, controlling devices with gestures has attracted increasing attention for ubiquitous sensing and interaction. Recent works use acoustic signals to track hand movements and recognize gestures, but they suffer from low robustness due to frequency-selective fading, interference, and insufficient training data. In this work, we propose RobuCIR, a robust contact-free gesture recognition system that works under different usage scenarios with high accuracy and robustness. RobuCIR adopts a frequency-hopping mechanism to mitigate frequency-selective fading and avoid signal interference. To further increase system robustness, we investigate a series of data augmentation techniques that expand a small volume of collected data to emulate different usage scenarios. The augmented data is used to effectively train neural network models and cope with various influential factors (e.g., gesture speed and distance to the transceiver). Our experimental results show that RobuCIR can recognize 15 gestures and outperforms state-of-the-art works in terms of accuracy and robustness.
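To make the two ideas named in the abstract concrete, the following minimal Python sketch illustrates (a) a frequency-hopping transmit tone that avoids relying on a single carrier affected by frequency-selective fading, and (b) simple augmentations that emulate a different gesture speed and a different distance to the transceiver from one collected sample. This is not the authors' implementation; the hop frequencies, dwell time, sampling rate, and augmentation factors are illustrative assumptions.

```python
# Illustrative sketch only -- not RobuCIR's actual code.
import numpy as np

FS = 48_000                                     # assumed audio sampling rate (Hz)
HOP_FREQS = [17_000, 18_000, 19_000, 20_000]    # assumed near-inaudible hop set (Hz)
HOP_DURATION = 0.01                             # assumed dwell time per hop (s)

def frequency_hopping_signal(n_hops: int) -> np.ndarray:
    """Concatenate sine tones that hop across HOP_FREQS so that no single
    carrier frequency dominates, reducing the impact of frequency-selective
    fading and narrowband interference."""
    t = np.arange(int(FS * HOP_DURATION)) / FS
    segments = [np.sin(2 * np.pi * HOP_FREQS[i % len(HOP_FREQS)] * t)
                for i in range(n_hops)]
    return np.concatenate(segments)

def augment(recording: np.ndarray, speed: float, gain: float) -> np.ndarray:
    """Emulate a faster/slower gesture via naive resampling and a different
    distance to the transceiver via amplitude scaling."""
    idx = np.arange(0, len(recording), speed)
    resampled = np.interp(idx, np.arange(len(recording)), recording)
    return gain * resampled

if __name__ == "__main__":
    tx = frequency_hopping_signal(n_hops=100)
    faster_and_farther = augment(tx, speed=1.2, gain=0.5)
    print(tx.shape, faster_and_farther.shape)
```

In practice, such augmented recordings would be fed to the neural network classifier alongside the originally collected samples, so that the model sees variation in gesture speed and distance without requiring additional data collection.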

Original language: English
Title of host publication: INFOCOM 2020 - IEEE Conference on Computer Communications
Publisher: IEEE
Pages: 566-575
Number of pages: 10
ISBN (Electronic): 9781728164120
DOIs
Publication status: Published - 2020
Externally published: Yes
Event: 38th IEEE Conference on Computer Communications, INFOCOM 2020 - Toronto, Canada
Duration: 6 Jul 2020 - 9 Jul 2020

Publication series

Name: IEEE Annual Joint Conference: INFOCOM, IEEE Computer and Communications Societies

Conference

Conference: 38th IEEE Conference on Computer Communications, INFOCOM 2020
Country/Territory: Canada
City: Toronto
Period: 6/07/20 - 9/07/20

Funding

This work is supported in part by the National Natural Science Foundation of China under grant 61702437 and the Hong Kong GRF under grant PolyU 152165/19E. Yuanqing Zheng is the corresponding author.
