Google Unveils Project Soli, a Radar-Based Wearable for Everything

During Google’s most recent annual developer conference, the company made a string of big announcements, including Android improvements, the Daydream VR ecosystem, Android app support on Chrome OS, Allo and Duo, Google Home, Google Assistant, Android TV, Android Auto, and more.

The search engine giant, however, had more in store for the last day of I/O 2016, when Google’s ATAP (Advanced Technology and Projects) group took the stage and showed off improvements to its gesture-tracking technology, Project Soli, and its touch-sensitive fabric technology, Project Jacquard.

To recall, Google had briefly touched upon Project Soli at last year’s I/O conference but had made no further announcements about the project since then. This time, the company showed off a new Soli chip that incorporates the sensor and antenna array into an ultra-compact 8x10mm package.

Soli is a sensing technology that uses miniature radar to detect touchless gesture interactions, and its sensor can track sub-millimeter motion at high speed with great accuracy. The Soli sensor works by emitting electromagnetic waves in a broad beam. Objects within the beam scatter this energy, reflecting some portion of it back toward the radar antenna.
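
To make the sensing principle concrete, here is a toy Python sketch of how a millimeter-wave radar can resolve sub-millimeter motion from the phase of the returning echo. The 60 GHz carrier, the single point target, and all the numbers are simplifying assumptions for illustration only; Soli’s actual hardware and signal processing are far more sophisticated.

```python
import numpy as np

# Toy model: a continuous-wave radar watching a single point target.
# Assumed (not from the article): a 60 GHz carrier and ideal echoes.

C = 3e8                      # speed of light (m/s)
F_CARRIER = 60e9             # assumed carrier frequency (Hz)
WAVELENGTH = C / F_CARRIER   # ~5 mm at 60 GHz

def received_phase(distance_m):
    """Round-trip phase of the echo from a target at `distance_m`."""
    return 4 * np.pi * distance_m / WAVELENGTH

# A fingertip drifting 0.5 mm closer over 10 ms, from 10 cm away.
t = np.linspace(0, 0.01, 1000)
distance = 0.10 - 0.0005 * (t / t[-1])

phase = np.unwrap(received_phase(distance))
# Doppler relation: v = -(wavelength / 4*pi) * dphi/dt
velocity = -(WAVELENGTH / (4 * np.pi)) * np.gradient(phase, t)

print(f"estimated approach speed: {velocity.mean() * 1000:.2f} mm/s")
# A 0.5 mm shift changes the echo phase by ~72 degrees at 60 GHz,
# which is why millimeter-wave radar can sense such tiny motions.
```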

“Imagine an invisible button between your thumb and index fingers – you can press it by tapping your fingers together. Or a Virtual Dial that you turn by rubbing thumb against index finger. Imagine grabbing and pulling a Virtual Slider in thin air. These are the kinds of interactions we are developing and imagining,” details Google’s Project Soli page.

Basically, Project Soli is a new approach to touchless interaction, designed to prove that tiny radar chips can be embedded in electronics so that minute hand gestures can control the digital world around us. Devices where Project Soli could be embedded include wearables, phones, computers, cars, and IoT devices.
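
As a rough illustration of how an application might consume the virtual button, dial, and slider gestures described above, here is a minimal Python sketch. The GestureEvent type, its fields, and the event kinds are all invented for this example; Google has not published a public API for these virtual-tool gestures.

```python
from dataclasses import dataclass

# Hypothetical event model for Soli-style micro-gestures.
@dataclass
class GestureEvent:
    kind: str      # "button_tap", "dial_turn", or "slider_pull"
    value: float   # e.g. dial rotation in degrees, slider displacement

volume = 5

def on_gesture(event: GestureEvent) -> None:
    """Map micro-gestures onto app controls."""
    global volume
    if event.kind == "dial_turn":       # thumb rubbed against index finger
        volume = max(0, min(10, volume + round(event.value / 30)))
        print(f"volume -> {volume}")
    elif event.kind == "button_tap":    # thumb and finger tapped together
        print("play/pause toggled")
    elif event.kind == "slider_pull":   # grab-and-pull in thin air
        print(f"slider moved by {event.value:+.1f}")

# Simulated event stream, as a radar gesture recognizer might emit it.
for ev in [GestureEvent("dial_turn", 90.0), GestureEvent("button_tap", 0.0)]:
    on_gesture(ev)
```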

During the conference, the company actually showed a modified LG Watch Urbane, barely bigger than a standard one, with the radar built in. The demo showed how the watch reacts when you move your hand close to the radar sensor, letting you scroll through a message by flicking your fingers inches away from the screen. Obviously this frees up the screen to show more content, but it also allows very precise control on a device that is otherwise quite hard to get around.

Another example was the Soli-enabled speaker the company showed. Soli can work from meters away, which means a snap of the fingers or a flick of the wrist could control my Bluetooth speakers from across the room.

It’s still early days for Project Soli, but I think it’s going to be a game-changer. Rather than going hands-free, Project Soli makes your hands the user interface, which may already be more interesting than voice control ever was.

Photo source: https://atap.google.com/soli/

Comments (1)

  • tim carrico

    June 2, 2016

    Good article Rick. The tactile connection to the VR world will be a new frontier. The challenge remains to connect these technologies to practical, useful applications beyond games and novelties. I believe it will happen, but will require the just starting revolution in AI to supplement the impressive UX/UI emerging.
