
A look inside Google's Project Soli

Google's Project Soli, which envisions a future where we will be able to control and manipulate wearables and other tech with simple hand gestures, has been causing excitement in the tech industry and beyond since it was revealed at the recent Google I/O conference. Here AllofUs's Orlando Mathias, who has worked on the project, talks about how it might change our lives...

Last year, writes Orlando Mathias, ATAP (Advanced Technology and Projects), Google's maverick team of engineers, asked us to partner with them on a revolutionary new piece of technology.

We were brought in to work with them on Project Soli. Much of what we did in partnership with Google was and remains top secret. Nonetheless, it was difficult to contain our excitement and enthusiasm once the technology was unveiled to us. Soli uses the unique capabilities of radar to track sub-millimetre gestures at high speed and with high accuracy, and the sensor can capture this data while concealed behind a range of different materials.

Soli has the potential to connect our bodies with the virtual and unify our movements with the physical environment. The industry’s excitement over the announcement of Soli at Google I/O shows that there is a real desire for a new and practical approach to the way we interact with the virtual world.

We at AllofUs have been discovering how people can use their bodies to engage with and explore technology in the physical world since the late 90s. Much has changed since those early pioneering days. Our eyes and fingers are now shackled to small portable screens that feed our addiction to an always-on, extremely demanding connected world. We reportedly check our phones on average 1,500 times a week, each time dropping what we’re doing or shifting our attention away from the task at hand. Soon wearables and the Internet of Things will be commonplace and we will have even more digital mechanisms embedded into our physical environments to learn from and interact with.

We have seen camera-based gesture recognition in the likes of Microsoft’s Kinect and Samsung’s Smart TVs. But problems have arisen from their reliance on infrared, the limited fidelity of their motion capture and the lack of a concrete solution for haptic feedback. There are also privacy concerns: a connected camera that watches you can be hacked.

Radar technology could be the solution we’ve been waiting for. Radar has properties that no other sensing technology shares: it is extremely reliable and robust, and with no moving parts and no lenses there is essentially nothing to break. Google has shrunk it down to a chip tiny enough to be embedded into almost anything imaginable.

A gesture-detection device this small and this accurate, without the privacy issues of camera-based technologies, frees us from many of the limitations that come with designing digital products. Suddenly, we have a way of interacting that can become invisible. It can be in your pocket, embedded in the arm of your sofa, hidden in the dashboard of your car, in the seat of your bicycle or even mounted in the walls of your house.

The high-fidelity nature of Soli, which can sense the tiniest motion, could solve the haptic deficiencies of gestural recognition. Carsten Schwesig, design lead for Project Soli, explains it thus: “The hand can both embody the virtual tool and it can also be acting on that virtual tool at the same time." The fact that radar can inhabit such a wide range of places could give rise to a standard language for gestural interaction. Wouldn’t it be wonderful to have a set of simple, standard gestures that can be applied to all your needs?

This simple breakthrough could be the thing that unshackles our hands and eyes from the screen. Using your own body both as the tool for interaction and as the source of haptic feedback creates a new type of input that is fast, intuitive and extremely versatile.

Most interactions with technology fall into two basic categories – a click and a slide. For your hands this translates as a press and a stroke. Using radar technology, we can now simply tap two fingers together for a click, and slide one finger along another for a scroll.
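To make those two gestures concrete, here is a minimal sketch of how they might map onto familiar input events. It is purely illustrative: Soli’s SDK had not been released at the time of writing, so the gesture names, the GestureEvent structure and the dispatch logic below are assumptions, not a real API.

```python
# A hypothetical mapping from the two core Soli gestures to familiar
# UI actions. None of these names comes from Google; they are
# illustrative assumptions standing in for an unreleased SDK.

from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    FINGER_TAP = auto()    # two fingertips tapped together -> "click"
    FINGER_SLIDE = auto()  # one finger sliding along another -> "scroll"


@dataclass
class GestureEvent:
    gesture: Gesture
    # Signed displacement for slides, in arbitrary sensor units;
    # zero for discrete taps.
    delta: float = 0.0


def handle(event: GestureEvent) -> str:
    """Translate a recognised gesture into a familiar UI action."""
    if event.gesture is Gesture.FINGER_TAP:
        return "click"
    if event.gesture is Gesture.FINGER_SLIDE:
        direction = "up" if event.delta > 0 else "down"
        return f"scroll {direction} by {abs(event.delta):.1f}"
    return "ignored"


if __name__ == "__main__":
    # A tap followed by a short slide, as a user might press a control
    # and then adjust a volume dial.
    for e in (GestureEvent(Gesture.FINGER_TAP),
              GestureEvent(Gesture.FINGER_SLIDE, delta=3.5)):
        print(handle(e))
```

The point of the sketch is the economy of the vocabulary: two recognisable events are enough to drive a click and a scroll, whatever object the sensor happens to be hidden inside.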

These simple interactions could be used for almost all aspects of daily life. Take for instance the array of buttons and controls you need to negotiate in the car whilst also concentrating on the road. You have to reach across to adjust the temperature or the music volume, find the right switch to put the windows up or down, and even adjust the fiddly levers to alter your wing mirrors. Even touch-screen controls in cars demand considerable attention, forcing us to avert our eyes from the road. These dials, screens, knobs and buttons require you to take a hand off the wheel, fumble around to find them and then know how to use them. Simply pressing or rubbing two fingers together could circumvent the need for this confusing and complex array of tools.

Project Soli may not be the complete package just yet and radar technology has its own limitations, but these investigations into simplifying our relationship with technology and re-engaging with our physical world are an exciting and necessary endeavour. With Google’s promise to release an SDK to the developer community, we could soon be a step closer to freeing our hands and eyes from the draw of the screen.

allofus.com