Elliptic Labs has launched an ultrasound touchless gesturing software development kit (SDK) for Android smartphones as it looks to push devices that respond to natural hand movements.
Norway-based Elliptic, which launched the SDK at the CEATEC conference in Japan this week, has been working on touchless gesturing technology for around eight years.
CEO Laila Danielsen told Mobile Europe that the technology is able to work with any ARM-based smartphone.
“It’s a revolutionising technology – place your phone on a table and place your hand under the table, and with a scroll gesture you can access your emails. You can watch videos with a single swipe movement in mid-air,” she explained.
Touchless operation in smartphones is not new – a number of smartphone companies including Samsung and Sony have released phones in the past with mid-air gesture support.
However, there are not many such phones on the market currently and Danielsen claims Elliptic’s ultrasound-based technology is “far superior” to Samsung’s touchless technology, which uses infrared (IR).
According to Danielsen, IR is “limited” in comparison to ultrasound – it requires the user to make a gesture directly in front of the IR sensor, for example – and consumes more power.
“With ultrasound you [can] make gestures all around your device and it will still respond – not just in front of the screen – and it’s not as heavy duty as IR,” she said.
Ultrasound technology also has an advantage over camera-based gesture technologies like Leap Motion, according to Danielsen. Leap Motion requires the user to place their hand within the tight viewing range of the phone’s in-built camera.
“Ultrasound technology consumes 95 percent less power than [Leap Motion does] and works great in low-light conditions,” explained Danielsen.
“Our technology works with a sensor hidden within the screen of the phone and it can detect motion up to a distance of 50cm.”
Apart from the motion sensor, touchless gesturing relies on a mobile device’s standard built-in microphone. The ultrasonic transducer first emits ultrasound waves that detect the user’s hand.
The waves then bounce off the hand back to the microphone, which picks up the signal; the vendor’s software converts it into gestures.
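The echo-ranging principle described above can be sketched in a few lines. This is a toy illustration under stated assumptions, not Elliptic Labs’ actual algorithm: it finds the round-trip delay of an emitted pulse in the microphone signal by brute-force cross-correlation, converts that delay into a hand distance using the speed of sound, and labels a simple push/pull gesture from the distance trend. The function names, thresholds, and sample values are all hypothetical.

```python
SPEED_OF_SOUND = 343.0   # m/s in air
SAMPLE_RATE = 48_000     # Hz; a typical phone microphone sampling rate (assumed)

def echo_delay_samples(emitted, recorded):
    """Find the lag (in samples) at which the recorded signal best
    matches the emitted pulse, via brute-force cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(recorded) - len(emitted) + 1):
        score = sum(e * recorded[lag + i] for i, e in enumerate(emitted))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def distance_metres(delay_samples):
    """Convert a round-trip echo delay into a one-way distance to the hand."""
    return (delay_samples / SAMPLE_RATE) * SPEED_OF_SOUND / 2

def classify(distances, threshold=0.05):
    """Label a sequence of distance estimates as a push or pull gesture."""
    change = distances[-1] - distances[0]
    if change < -threshold:
        return "push"   # hand moving toward the device
    if change > threshold:
        return "pull"   # hand moving away from the device
    return "hold"

# Example: a short pulse whose echo arrives after 100 samples of silence.
pulse = [0.0, 1.0, -1.0, 0.5]
recording = [0.0] * 100 + pulse + [0.0] * 20
delay = echo_delay_samples(pulse, recording)
print(round(distance_metres(delay), 3))   # 0.357 (metres, for a 100-sample delay)
print(classify([0.40, 0.30, 0.20]))       # push
```

A production system would of course work on real transducer and microphone samples and far more robust signal processing; the sketch only shows why a standard microphone suffices on the receive side.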
The sensor can also distinguish foreground motions from background movement, ignoring the latter. “This way, it wouldn’t confuse the user and no-one else can use the device apart from the user who is in front of it,” explained Danielsen.
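One simple way to realise the foreground/background filtering Danielsen describes, sketched here as an assumption rather than the vendor’s implementation, is to discard any echo whose estimated distance falls outside the sensor’s stated 50cm operating range before gestures are classified:

```python
MAX_RANGE_M = 0.5  # the article states motion is detected up to 50cm

def foreground_only(distance_estimates):
    """Keep only echoes close enough to be the primary user's hand,
    dropping reflections from people or objects in the background."""
    return [d for d in distance_estimates if d <= MAX_RANGE_M]

print(foreground_only([0.2, 0.45, 1.8, 0.3]))  # [0.2, 0.45, 0.3]
```

The echo at 1.8m is dropped, so a bystander behind the user cannot trigger gestures.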
Moving beyond mobile, the CEO said that the technology can be used on PCs, navigation systems and wearable devices such as watches, as well as iPods.
Elliptic says it can already demonstrate how the technology can be incorporated into the Windows 8 Gesture Suite, enabling a touchless version of all touchscreen gestures in the operating system.
“The possibilities are endless,” said the CEO. “We are working on cross-device gesturing where two ultrasound-enabled phones can detect each other and [enable] users [to] send pictures and share information by gestures, and then there is proximity sensing where you can turn your phone on when you start walking in your room.”
While Elliptic may have the lead by releasing a touchless SDK for smartphones, competitors in the ultrasound tech field are coming – most notably chipset manufacturer Qualcomm, which acquired digital ultrasound company EPOS last November.