Sep 22 2014
With almost all of the U.S. population carrying cellphones – and close to 80 percent carrying a smartphone – using a mobile phone has become second nature for most people.
What’s coming next, say University of Washington researchers, is the ability to interact with our devices not just with touchscreens, but through gestures in the space around the phone. Some smartphones are starting to incorporate 3-D gesture sensing based on cameras, for example, but cameras consume significant battery power and require a clear view of the user’s hands.
UW engineers have developed a new form of low-power wireless sensing technology that could soon contribute to this growing field by letting users “train” their smartphones to recognize and respond to specific hand gestures near the phone.
The technology – developed in the labs of Matt Reynolds and Shwetak Patel, UW associate professors of electrical engineering and of computer science and engineering – uses the phone’s wireless transmissions to sense nearby gestures, so it works when a device is out of sight in a pocket or bag and could easily be built into future smartphones and tablets.
“Today’s smartphones have many different sensors built in, ranging from cameras to accelerometers and gyroscopes that can track the motion of the phone itself,” Reynolds said. “We have developed a new type of sensor that uses the reflection of the phone’s own wireless transmissions to sense nearby gestures, enabling users to interact with their phones even when they are not holding the phone, looking at the display or touching the screen.”
SideSwipe
Team members will present their project, called SideSwipe, and a related paper Oct. 8 at the Association for Computing Machinery’s Symposium on User Interface Software and Technology in Honolulu.
When a person makes a call or an app exchanges data with the Internet, a phone transmits radio signals on a 2G, 3G or 4G cellular network to communicate with a cellular base station. When a user’s hand moves through space near the phone, the user’s body reflects some of the transmitted signal back toward the phone.
The new system uses multiple small antennas to capture the changes in the reflected signal and classify the changes to detect the type of gesture performed. In this way, tapping, hovering and sliding gestures could correspond to various commands for the phone, such as silencing a ring, changing which song is playing or muting the speakerphone. Because the phone’s wireless transmissions pass easily through the fabric of clothing or a handbag, the system works even when the phone is stowed away.
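For readers who want a concrete picture of that pipeline, here is a minimal sketch in Python of how changes in the reflected signal at several antennas could be turned into a gesture label. The sampling interface, the feature choices and the simple nearest-template classifier are illustrative assumptions for this article, not the team's actual hardware or signal processing.

```python
# Illustrative sketch only: names and methods here are assumptions,
# not the SideSwipe team's actual implementation.
import numpy as np

def extract_features(samples: np.ndarray) -> np.ndarray:
    """Summarize how the reflected signal changes at each antenna.

    `samples` is a (num_antennas, num_time_steps) array of received
    signal amplitudes recorded while the phone is transmitting.
    """
    baseline = samples.mean(axis=1, keepdims=True)   # steady-state reflection with no gesture
    deviation = samples - baseline                    # change caused by the moving hand
    return np.concatenate([
        deviation.std(axis=1),                        # how strongly each antenna sees motion
        np.abs(deviation).argmax(axis=1) / samples.shape[1],  # when the reflection peaks
    ])

def classify(samples: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Label a recording with the stored gesture template closest to its features."""
    features = extract_features(samples)
    return min(templates, key=lambda g: np.linalg.norm(features - templates[g]))
```

In a scheme like this, templates for gestures such as "tap," "hover" and "slide" would each be tied to a phone action, so that recognizing the nearest template silences the ring, skips a song or mutes the speakerphone.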
“This approach allows us to make the entire space around the phone an interaction space, going beyond a typical touchscreen interface,” Patel said. “You can interact with the phone without even seeing the display by using gestures in the 3-D space around the phone.”
A group of 10 study participants tested the technology by performing 14 different hand gestures – including hovering, sliding and tapping – in various positions around a smartphone. For each participant, the phone was first calibrated on that user's hand movements and then trained to recognize and respond to them. The team found the smartphone recognized gestures with about 87 percent accuracy.
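The article does not detail the study's protocol or classifier, but a calibrate-then-evaluate loop of the kind described could look roughly like the following Python sketch, where each user's example recordings are averaged into per-gesture templates and accuracy is measured on held-out trials. The functions below are hypothetical and shown only to make the idea concrete.

```python
# Hypothetical calibrate-and-evaluate loop; not the study's actual procedure.
import numpy as np

def calibrate(example_features: dict[str, list[np.ndarray]]) -> dict[str, np.ndarray]:
    """Average a user's example feature vectors into one template per gesture."""
    return {gesture: np.mean(examples, axis=0)
            for gesture, examples in example_features.items()}

def nearest_template(features: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Label a new trial with the closest stored gesture template."""
    return min(templates, key=lambda g: np.linalg.norm(features - templates[g]))

def accuracy(test_trials: list[tuple[np.ndarray, str]],
             templates: dict[str, np.ndarray]) -> float:
    """Fraction of held-out trials whose predicted gesture matches its label."""
    correct = sum(nearest_template(f, templates) == label for f, label in test_trials)
    return correct / len(test_trials)
```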
Other gesture-sensing technologies, such as "AllSee" and "WiSee," have recently been developed at the UW, but the researchers say the new approach has important advantages.
“SideSwipe’s directional antenna approach makes interaction with the phone completely self-contained, because you’re not depending on anything in the environment other than the phone’s own transmissions,” Reynolds said. “Because the SideSwipe sensor is based only on low-power receivers and relatively simple signal processing compared with video from a camera, we expect SideSwipe would have a minimal impact on battery life.”
The team has filed patents on the technology and will continue developing SideSwipe, integrating the hardware and making a “plug and play” device that could be built into smartphones, said Chen Zhao, project lead and a UW doctoral student in electrical engineering.
Other co-authors are Ke-Yu Chen, a UW doctoral student in electrical engineering, and Md Tanvir Islam Aumi, a doctoral student in computer science and engineering.
This research was funded by the UW.