Google's radar-based gesture sensor given the go-ahead

Tiny chips use radar signals to create virtual controls for devices

Google has been given the green light by the FCC to push forward with a radar-based sensor that can recognise hand gestures, with the technology being pegged as a new way to control smartphones and IoT devices. 

Project Soli, a form of sensor technology that works by emitting electromagnetic waves in a broad beam, was initially blocked due to concerns it would disrupt existing technology.

[Image: Radar beam interpreting hand gestures, courtesy of Google]

Objects within the beam scatter energy, reflecting some portion back towards a radar antenna. The reflected signals capture information about the object's characteristics and dynamics, including its size, shape, orientation, material, distance and velocity.
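As an illustration of the last two properties in that list, here is a minimal sketch (not Google's implementation) of the standard radar relations a Soli-style sensor relies on: distance follows from the echo's round-trip time, and radial velocity from the Doppler shift of the returned signal. The 60 GHz carrier and the sample values below are assumptions for the example.

```python
# Illustrative radar arithmetic: how a reflected signal encodes distance
# and velocity. Constants and inputs are assumed, not from the article.
C = 3.0e8  # speed of light, m/s

def range_from_delay(round_trip_s: float) -> float:
    """Distance to the reflecting object from the echo's round-trip time."""
    return C * round_trip_s / 2.0  # halved: the signal travels out and back

def velocity_from_doppler(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial velocity of the object from the Doppler shift it imparts."""
    return C * doppler_shift_hz / (2.0 * carrier_hz)

# A hand 30 cm away returns an echo after ~2 nanoseconds:
distance = range_from_delay(2e-9)            # -> 0.3 m
# A 60 GHz carrier shifted by 400 Hz implies 1 m/s of radial motion:
speed = velocity_from_doppler(400.0, 60e9)   # -> 1.0 m/s
```

Resolving hand-scale gestures from these quantities is exactly where transmit power matters, which is the problem the FCC waiver discussed below addresses.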


However, it has taken Google a number of years to get Project Soli off the ground, as its radar system was initially unable to accurately pick up user gestures and had trouble isolating individual motions. Google attributed these problems to the low power levels the smartwatch had to operate at due to FCC restrictions.

The tech giant applied for a waiver from the FCC to operate at higher power levels, a move initially protested by Facebook, which claimed higher power levels could interfere with existing technology. The two companies have since settled the dispute, and Google was granted the waiver after the FCC determined that Project Soli could serve the public interest and had little potential for causing harm.

The approval means that Soli can move forward and create a new way to interact with technology. Due to the small size of the chips, the sensor can be fitted into wearables, smartphones and many IoT devices.

Carsten Schwesig, the design lead of Project Soli, said his team wanted to create virtual tools because they recognised that certain control actions, such as a pinch movement, can be read fairly easily.

"Imagine a button between your thumb and index finger the button's not there, but pressing it is a very clear action and there is a very natural haptic feedback that occurs as you perform that action," he said.

"The hand can both embody a virtual tool and it can also be acting on the virtual tool at the same time. So if we can recognise that action, we have an interesting direction for interacting with technology."


There is currently no indication as to when the company plans to roll out the new technology.

