At Mobile World Congress, I experienced Elliptic Labs' new way to control devices with gestures alone, using ultrasound, a technology you may know better from medical imaging, though it works quite differently here. Elliptic's system uses a device's speaker to emit an ultrasonic wave; the wave bounces off your hand, and the resulting echo is picked up by the device's microphone and translated into an action.
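The emit-and-listen idea can be illustrated with a simple time-of-flight calculation. The sketch below is a hypothetical illustration, not Elliptic's actual algorithm: it assumes the device plays a known ultrasonic chirp, cross-correlates the microphone recording against it to find the echo delay, and converts that round-trip delay into a hand distance. The 192kHz sample rate is an assumption (the hardware must sample well above the 50kHz signals involved).

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
SAMPLE_RATE = 192_000    # Hz; assumed, must be well above the ultrasonic band

def echo_delay_samples(emitted: np.ndarray, recorded: np.ndarray) -> int:
    """Estimate the round-trip delay (in samples) of the strongest echo
    by cross-correlating the emitted chirp with the microphone signal."""
    corr = np.correlate(recorded, emitted, mode="full")
    # For mode="full", zero lag sits at index len(emitted) - 1.
    lag = int(np.argmax(np.abs(corr))) - (len(emitted) - 1)
    return max(lag, 0)

def hand_distance_m(delay_samples: int) -> float:
    """Round-trip time multiplied by the speed of sound, halved,
    gives the distance to the reflecting hand."""
    return (delay_samples / SAMPLE_RATE) * SPEED_OF_SOUND / 2.0
```

Tracking how that estimated distance changes over successive chirps is one plausible way a wave or swipe could be recognized; in practice a real system would also contend with noise, multiple reflections, and Doppler shifts.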
Existing devices, such as smartphones and smoke alarms, may use infrared to recognize gestures, but that technology is limited in viewing range and is sensitive to varying light conditions. Ultrasound is unaffected by ambient light, and Elliptic's system works at a range of up to seven feet.
The prototype I tested consisted of a speaker and microphone attached to an existing tablet, and I was able to control a game and the tablet's lock screen by gently waving over the device. A specially designed lamp with an integrated speaker and mic turned on and off easily with a double-wave gesture.
Elliptic's technology operates in the 25kHz to 50kHz frequency range, well above the roughly 20kHz ceiling of human hearing, though a dog, a cat, or certainly a bat could still pick up some of it.
There's something relaxing about waving at devices as if they were friends, rather than jabbing at screens. The technology is new, however, and some tuning would be needed if multiple ultrasound-enabled devices sat near each other: each device would have to emit and listen on a different frequency so that they wouldn't all react to the same gesture.
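That multi-device scenario amounts to frequency-division: carving the 25kHz to 50kHz band into non-overlapping slices, one per device. The helper below is a hypothetical sketch of that idea, not Elliptic's actual scheme; the 1kHz guard gap between slices is an assumed value.

```python
def assign_subbands(n_devices: int,
                    band: tuple[float, float] = (25_000.0, 50_000.0),
                    guard: float = 1_000.0) -> list[tuple[float, float]]:
    """Split the ultrasonic band into equal sub-bands, separated by a
    guard gap, so nearby devices never emit or listen on overlapping
    frequencies (hypothetical frequency-division sketch)."""
    lo, hi = band
    width = (hi - lo - guard * (n_devices - 1)) / n_devices
    bands = []
    for i in range(n_devices):
        start = lo + i * (width + guard)
        bands.append((start, start + width))
    return bands
```

With three devices, each would get roughly a 7.7kHz slice of the band, and a wave aimed at the lamp would not trip the tablet.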
The benefit to device makers is cost: they can drop the infrared camera or other gesture sensor they used before. Elliptic Labs says the technology will appear in a smartphone due to launch this year, and in other devices after that.