When you arrive home from Mobile World Congress people tend to ask the same question.
What impressed you most? What’s the next big thing?
It’s always a tricky one, but last year there was one definite winner for me – Elliptic Labs. I actually used the words “The Next Big Thing” in the article I wrote about them.
Simply put, they have produced a touchless interface which uses hardly any power, runs on existing hardware and "sees" where your hand is. This year things have improved: it's faster, and the sensors detecting your hands have moved to the side of the device so that they can be integrated without any real hassle.
I spoke to Haakon Bryhni, CTO of Elliptic Labs. He holds a PhD and told me that a large company is already using the system. He wouldn't be drawn on who it is, but you can place bets on a fairly sizeable smartphone manufacturer releasing a phone with this technology in the next 12 months.
Haakon told me how they've made the technology faster and moved to the newly positioned sensors. They had a couple of devices on display with the clever hand-sensing technology. The first here shows the selfie-snapping tech, which relies on a very fluid, natural and almost care-free movement of your hand. It doesn't matter where your hand is, and it doesn't rely on cameras or any battery-sucking system like that. Check out this video, where a selfie shot is initiated with a brief off-screen wave…
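To give a rough idea of how a "brief off-screen wave" might be picked out in software, here's a tiny Python sketch. It treats the sensor as a stream of hand-present/hand-absent readings and calls it a wave when there's exactly one short burst of presence. The function name and threshold are my own illustrative guesses, not Elliptic Labs' actual API.

```python
# Hypothetical sketch: spotting a brief "wave" in a stream of presence
# readings (True when a hand is in range). A wave is one short run of
# presence bracketed by absence; a long run is a hover, not a wave.
# All names and thresholds here are illustrative assumptions.

def is_wave(samples, max_run=5):
    """True if samples contain exactly one short run of hand presence."""
    runs = []   # lengths of each completed presence run
    run = 0     # length of the current presence run
    for present in samples:
        if present:
            run += 1
        elif run:
            runs.append(run)
            run = 0
    if run:
        return False  # hand still present at the end: a hover, not a wave
    return len(runs) == 1 and runs[0] <= max_run

print(is_wave([False, True, True, False]))  # a quick pass of the hand
```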
The X, Y and Z axes are all covered, and the phone can respond to the proximity of your hand to the screen in a very analogue manner. This means you can slowly move your hand closer to the screen and have it interact with your device in different ways. Move your hand fairly near and you can see some notifications; move it closer still and you can open the notification, and so on.
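That "fairly near shows notifications, closer still opens them" behaviour boils down to mapping hand distance onto UI states. Here's a minimal Python sketch of the idea; the distances and state names are my own illustrative assumptions, not anything from Elliptic Labs.

```python
# Hypothetical sketch: mapping hand-to-screen distance (in cm) onto UI
# states, as described above. Thresholds and state names are made up
# for illustration only.

def ui_state_for_distance(distance_cm: float) -> str:
    """Return a UI state for a given hand-to-screen distance."""
    if distance_cm < 2.0:
        return "open_notification"   # almost touching: open it
    elif distance_cm < 10.0:
        return "peek_notifications"  # fairly near: show a preview
    else:
        return "idle"                # far away: nothing happens

# A hand moving steadily closer steps through the states:
for d in (25.0, 8.0, 1.0):
    print(d, ui_state_for_distance(d))
```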
Here’s Haakon playing a game using just hand proximity…
Yes, other phones will have solutions involving their existing proximity sensors, but this uses super battery-friendly technology which doesn’t even need the screen awake to get going.
It’s not just photos that you can take with a gesture; you can also stop or start music…
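Once a gesture is recognised, wiring it to different actions (snap a selfie, toggle music) is just a dispatch table. A quick Python sketch of that idea follows; the class, handler table and gesture name are all hypothetical, not the vendor's API.

```python
# Hypothetical sketch: dispatching a recognised gesture to an action,
# e.g. a wave toggling music playback. Names are illustrative only.

class MediaPlayer:
    def __init__(self):
        self.playing = False

    def toggle(self):
        """Start playback if stopped, stop it if playing."""
        self.playing = not self.playing

player = MediaPlayer()
handlers = {
    "wave": player.toggle,  # the same gesture could snap a selfie instead
}

def on_gesture(name: str) -> None:
    """Look up and run the handler for a gesture, ignoring unknown ones."""
    handler = handlers.get(name)
    if handler:
        handler()

on_gesture("wave")  # music starts
```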
Laila Danielsen (CEO) and the Elliptic Labs team admit that they’re hard-core software geeks at heart, and they’ve needed a hardware partner to take them to the next level. It appears that this has now happened, and the applications for this technology are practically endless, whether on a smartphone, tablet, LED panel or smartwatch. Full hand tracking and granular, analogue proximity sensing are possible.
We should just mention that the case shown on these devices is a temporary solution; following the partnership with the un-named hardware partner it will vanish completely, and the solution will be built into future devices without adding any bulk to the phone. You won’t notice it’s there. It’ll just work.
The system works using ultrasound, and its field of view is far wider than anything a camera could cope with.
As I said last year: mark my words, this technology will be the next big thing. Expect it in a major manufacturer’s big launch soon.