New research has shown future wearable devices, such as smartwatches, could use ultrasound imaging to sense hand gestures.
The research team, led by Professor Mike Fraser with Asier Marzo and Jess McIntosh from the Bristol Interaction Group (BIG) at the University of Bristol, worked together with University Hospitals Bristol NHS Foundation Trust (UH Bristol).
Hand gestures have been suggested as an intuitive and easy way of interacting with and controlling smart devices in different surroundings. For instance, a gesture could be used to dim the lights in the living room, or to open or close a window.
Hand gesture recognition can be achieved in many ways, but the placement of the sensor is a major restriction and often rules out certain techniques. However, with smartwatches becoming the leading wearable device, sensors can be placed in the watch itself to sense hand movement.
Image processing algorithms and machine learning
The team proposed that ultrasonic imaging of the forearm could be used to recognise hand gestures. Ultrasonic imaging is already used in medicine, for example in pregnancy scans and for observing muscle and tendon movement. The researchers saw its potential as a way of understanding hand movement, and applied image processing algorithms and machine learning to classify muscle movement as gestures. They also carried out a user study to find the best sensor placement for this technique.
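The classification step described above can be illustrated in miniature. The sketch below is not the team's method: it uses synthetic feature vectors standing in for processed ultrasound frames, and a simple nearest-centroid classifier standing in for the machine-learning stage; all names and parameters are hypothetical.

```python
import numpy as np

# Hypothetical sketch: each ultrasound frame is assumed to have been
# reduced to a fixed-length feature vector by an image-processing step.
# Synthetic data stands in for real frames here.
rng = np.random.default_rng(0)
N_GESTURES, SAMPLES, FEATURES = 3, 20, 16

# Synthetic training data: frames for each gesture cluster around a
# gesture-specific centroid in feature space.
centroids_true = rng.normal(0, 5, size=(N_GESTURES, FEATURES))
X = np.vstack([c + rng.normal(0, 1, size=(SAMPLES, FEATURES))
               for c in centroids_true])
y = np.repeat(np.arange(N_GESTURES), SAMPLES)

# "Training": compute the mean feature vector per gesture label.
centroids = np.array([X[y == g].mean(axis=0) for g in range(N_GESTURES)])

def classify(frame_features):
    """Assign a frame to the gesture whose centroid is nearest."""
    dists = np.linalg.norm(centroids - frame_features, axis=1)
    return int(np.argmin(dists))

# Classify a new frame drawn from gesture 1's distribution.
test_frame = centroids_true[1] + rng.normal(0, 1, size=FEATURES)
print(classify(test_frame))
```

A real system would replace the synthetic vectors with features extracted from ultrasound images and would likely use a stronger classifier, but the train-then-classify structure is the same.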
The team’s findings showed very high recognition accuracy, and the sensing method worked well at the wrist. This would allow future wearable devices, such as smartwatches, to incorporate this ultrasonic technique to sense gestures.
Jess McIntosh, PhD student in the Department of Computer Science and BIG Group, said: “With current technologies, there are many practical issues that prevent a small, portable ultrasonic imaging sensor from being integrated into a smartwatch.
This research is a first step towards what could be the most accurate method for detecting hand gestures in smartwatches.”