SPECIAL SESSION #24
Exploring Challenges and Opportunities to Detect Fine-Grained Finger Microgestures in Everyday Contexts
ORGANIZED BY
Dr. Shelly Vishwakarma
University of Southampton, UK
Dr. Adwait Sharma
University of Bath, UK
ABSTRACT
Recent research has unveiled the potential of subtle finger microgestures, such as taps or swipes, as intuitive alternatives to traditional wake words such as "Alexa" or "Hey Google" in virtual assistant platforms. These microgestures not only offer versatility across diverse applications but also promise hands-free operation in critical contexts like healthcare. However, effectively detecting these fine-grained gestures presents a multifaceted challenge, demanding a delicate balance between precision and user comfort.
In this workshop, we aim to delve into techniques for improving gesture recognition systems while prioritizing user experience. Through discussions and demonstrations, we will explore advances in recognition algorithms, showcase practical implementations of gesture integration into daily routines, and highlight the resulting improvements in user interaction and comfort.
ABOUT THE ORGANIZERS
Dr. Shelly Vishwakarma is a Lecturer in the Digital Health & Biomedical Engineering Group at the University of Southampton. Her current research focuses on designing and developing hardware and software frameworks to advance state-of-the-art opportunistic sensing using radio frequency (RF) signals from WiFi transmissions for contextual sensing applications, including concurrent physical activity recognition and indoor localization. She received her PhD from the Indraprastha Institute of Information Technology, Delhi, India, in 2020, where her research investigated advanced signal processing techniques for human activity detection, classification, and imaging in indoor environments. Before joining ECS, she worked as a Research Fellow at University College London on OPERA (Opportunistic Passive RADAR for Non-Cooperative Contextual Sensing), an EPSRC-funded project investigating a novel unobtrusive RADAR sensing technology for contextual sensing to facilitate healthcare and Ambient Assisted Living. During the global pandemic, she developed an animation-data-driven human RF scattering simulator, SimHumalator, to generate realistic RADAR signatures for activities relevant to healthcare, ranging from sitting and standing to falling over. The simulator has been used across the globe to alleviate the well-known 'cold-start' problem in RADAR, where usable data for training machine learning networks is scarce (https://uwsl.co.uk/). Dr Vishwakarma won the Best Student Paper Award at the IEEE International Radar Conference (Atlanta, USA, 2020) and was nominated for the Best Paper Award at the IEEE International Radar Conference (Toulon, France, 2019). More recently, she won the second and third best paper awards in the IEEE Radar Challenge (New York, 2022) for her work on developing an ML-assisted radar signal processing framework and building a hardware prototype using commercially available off-the-shelf components.
Dr. Adwait Sharma is a Lecturer in Computer Science at the University of Bath. His research lies at the intersection of Human-Computer Interaction and Machine Learning, with a focus on developing novel methods to enable always-available input. His work leverages interaction design, large-scale datasets, and machine learning techniques for real-time recognition. Adwait holds a Ph.D. from Saarland University (Max Planck Institute) in Germany. He has served on the program committees for ACM CHI and UIST. Previously, he worked with several other prominent HCI groups worldwide, including the National University of Singapore, the Media Interaction Lab in Austria, and Meta Reality Labs in Toronto. Find out more at: https://adwaitsharma.com.