Google’s Project Soli to Bring Radar-Based Gesture Recognition to Wearables

One of the big problems with wearable devices right now is input – there’s no simple way to control these devices. At Google I/O 2015 the company unveiled Project Soli, a radar-based gesture-sensing technology that can be used to control all kinds of devices. Developed by Google’s Advanced Technology and Projects (ATAP) team, Project Soli can be incorporated into a range of different devices.

It’s a gesture-based system that can track small movements like waving your fingers – it could be an easy way to control wearables, or even give you hands-free control of your phone. It could also let you enter text on a smartwatch without being restricted to the small screen.

Essentially, Project Soli is a radar system that’s small enough to fit into a smartwatch. It picks up movements in real time, and the movements you make alter its signal. It can detect gestures such as swiping, making a fist, or crossing your fingers.
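To make the idea concrete, here is a purely illustrative sketch – not Google’s actual pipeline – of how a gesture recognizer on top of a radar sensor might work. The assumption is that each radar frame has already been reduced to a small feature vector (e.g. Doppler shift, range, signal energy), and a simple nearest-centroid lookup maps it to a gesture label. All names and numbers below are invented for illustration:

```python
import math

# Hypothetical per-frame radar features: (doppler_shift, range_cm, energy).
# The centroids are invented for illustration; a real system like Soli
# would learn its gesture models from large amounts of radar data.
GESTURE_CENTROIDS = {
    "swipe":        (8.0, 12.0, 0.6),
    "button_press": (1.5,  5.0, 0.9),
    "dial_turn":    (4.0,  8.0, 0.7),
}

def classify_gesture(features):
    """Return the gesture whose centroid is closest to the feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_CENTROIDS, key=lambda g: dist(features, GESTURE_CENTROIDS[g]))

# A frame close to the "swipe" centroid is classified as a swipe.
print(classify_gesture((7.5, 11.0, 0.55)))  # prints "swipe"
```

In practice the hard part is the signal processing that produces such features from raw radar returns, and real systems use learned models rather than fixed centroids – this sketch only shows the final classification step.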

You can see the full explanation in the video below:

Using your hand to interact with a device is typically much more accurate than voice recognition, and according to the video, Project Soli is sensitive enough to track “micro-motions” – something camera-based systems like Microsoft’s Kinect cannot do as precisely. Because Soli uses radar rather than cameras, it has much higher sensitivity, so it could support fine-grained gestures like pressing a button, moving a slider, or turning a knob.

We’re still in the early days of this technology, and it will take a couple of years before it is available commercially, but it could eventually replace voice commands as the best way of interacting with devices.
