The aim is to replace all of the physical commands needed to control a wearable with hand gestures, said Poupyrev. Radar can capture subtle 3D movements and works equally well in daylight or darkness, among other advantages over optical sensors, he said. For Project Soli, Google and its partners shrank the radar hardware down to a chip small enough to fit inside a wearable.
The gesture radar can distinguish distance as well as gestures. During a demonstration, Poupyrev changed the time on a digital watch face by holding his hand at different heights: gesturing as if turning a dial close to the chip changed the hours, while making the same gesture higher up changed the minutes.
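The demo's behavior could be sketched roughly as follows. This is a hypothetical illustration of distance-based gesture mapping, not Soli's actual API: the `RadarEvent` type, its fields, and the distance threshold are all assumptions for the sake of the example.

```python
# Hypothetical sketch: a dial-turning gesture near the chip adjusts hours,
# while the same gesture farther away adjusts minutes.
# RadarEvent and NEAR_THRESHOLD_MM are invented here, not part of Soli.

from dataclasses import dataclass


@dataclass
class RadarEvent:
    distance_mm: float  # assumed hand height above the sensor
    dial_ticks: int     # signed ticks from the "turning a dial" gesture


NEAR_THRESHOLD_MM = 80.0  # assumed boundary between "close" and "higher up"


def apply_gesture(hours: int, minutes: int, event: RadarEvent) -> tuple[int, int]:
    """Map a dial gesture to hours when the hand is close, minutes when higher up."""
    if event.distance_mm < NEAR_THRESHOLD_MM:
        hours = (hours + event.dial_ticks) % 24
    else:
        minutes = (minutes + event.dial_ticks) % 60
    return hours, minutes
```

For example, a two-tick dial gesture close to the chip would advance the hours, while the same gesture higher up would only move the minutes.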
Google is building an API (application programming interface) for the gesture radar and will release it later this year.