Google unveils Project Soli: gesture control like you’ve never seen before
Posted On September 14, 2015
Google’s ATAP – the company’s skunkworks division responsible for turning unusual ideas into new technology – caught everyone’s attention at the Google I/O conference. There it introduced Project Soli: a radar-based technology that lets you control your devices with gestures, without touching them.
You’ve probably noticed that the smaller a device’s screen, the harder it is to use touch commands. If you touch the screen of a smartwatch, for example, your finger blocks much of the visual information. Project Soli could be the solution to this problem. The idea centers on a tiny chip with radar-based sensors: it emits signals and reads their reflections to detect volume, distance and motion. This means the sensor can also recognize gestures.
According to project leader Ivan Poupyrev, Project Soli is being developed to recognize movements people already make when operating everyday devices, which should make the technology quite intuitive. If you need to adjust the time, for example, you mimic the motion of turning the knob of a conventional wristwatch by rubbing your index finger against your thumb. Project Soli detects the gesture and performs the corresponding action as if you were indeed turning the watch’s knob.
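To make the idea concrete, here is a minimal sketch of how radar-derived features might be mapped to the “virtual knob” gesture described above. All names here (`RadarFrame`, `detect_dial_gesture`, the velocity threshold) are hypothetical illustrations, not part of Google’s Soli API:

```python
# Hypothetical sketch -- these names are NOT from Google's Soli API.
# It shows one way radar features (range, radial velocity) could be
# turned into a "turn the knob" gesture decision.

from dataclasses import dataclass

@dataclass
class RadarFrame:
    range_mm: float       # distance from sensor to the hand
    velocity_mm_s: float  # radial velocity of the dominant reflection

def detect_dial_gesture(frames, velocity_threshold=20.0):
    """Classify a window of frames as 'dial-up', 'dial-down', or None.

    Sustained positive radial velocity is treated as rubbing the
    finger one way (turning the knob up); negative, the other way.
    """
    if not frames:
        return None
    avg_velocity = sum(f.velocity_mm_s for f in frames) / len(frames)
    if avg_velocity > velocity_threshold:
        return "dial-up"
    if avg_velocity < -velocity_threshold:
        return "dial-down"
    return None

# Rubbing index finger against thumb yields small moving reflections;
# here we fake a window of frames drifting one way.
window = [RadarFrame(range_mm=150.0, velocity_mm_s=v) for v in (30.0, 35.0, 28.0)]
print(detect_dial_gesture(window))  # -> dial-up
```

A real pipeline would of course work on raw radar returns and use a trained classifier rather than a fixed threshold; the sketch only illustrates the mapping from sensed motion to an intuitive gesture.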
This technology is interesting because, in addition to freeing the user from touching screens, it can be useful in wearable devices that have no display at all, such as smart glasses or bracelets. Because the chip is very small – and may eventually become smaller – implementing it in wearables of all kinds should not be a challenge.
Unfortunately, the technology is not yet ready to leave ATAP’s labs. Google is expected to release more information about Project Soli as the research progresses.