Eonite’s inside-out tracking tech could be VR’s next big breakthrough
I'm squatting down in someone else's living room. None of it is real. How do I know? Because small drones are circling me like flies, and every time I look at one it gets shot out of the air — no controller in my hands.
It may be one of the stranger virtual reality experiences I've had, but it's a technically significant one. I'm wearing an HTC Vive, yet there are no trackers set up around the room — just a sensor attached to the front of the headset. If it weren't for the cable, I could have wandered wherever I pleased. Eonite thinks the sensor strapped to my face is a breakthrough, and it may be right.
Inside-out positional tracking — tracking a headset's motion without any external sensors — is one of the big challenges in virtual reality right now. It's harder to build than outside-in tracking, but it's a hell of a lot more intuitive: put on the headset and go. "We humans are designed with inside-out tracking," jokes Eonite founder Youssri Helmy.
We've tried inside-out VR before, most recently with the Pico Neo. But most systems on the market, including the headsets of the "big three", use outside-in tracking, with its limitations: even without wires, your play space is confined.
"We're about making machines see the world," says Helmy. Eonite's other founder is mathematician and roboticist Anna Petrovskaya, who was part of the core team that built Junior, the Stanford autonomous car whose lineage led to the Google self-driving car. Her algorithms were built to extract more signal from noisy sensors, and it's this work that laid the foundation for Eonite, which she co-founded with Helmy.
What makes Eonite's technology such a breakthrough, according to Helmy, is that it balances power consumption, cost and accuracy in a way that hasn't been done before. It also works well in low light, as I discover when using it in a poorly lit hotel room.
As I lean down to peek at the underside of the coffee table in this virtual living room, I appreciate that this is the most precise inside-out tracking I've tried. Eonite's sensor provides sub-millimetre accuracy and six degrees of freedom (6DOF), and latency is very good compared to other inside-out systems I've tried. Aside from the odd judder, which I'm told is down to it dropping from 6DOF to 3DOF (something they said would not be a problem in the finished product), the experience is mighty impressive.
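To make the 6DOF/3DOF distinction concrete: a 6DOF pose tracks both where your head is and which way it's facing, while 3DOF keeps only the orientation. The sketch below is a generic illustration of that fallback idea — not Eonite's actual implementation, whose details aren't public — assuming a hypothetical confidence score from the visual tracker.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Quat = Tuple[float, float, float, float]  # orientation as (w, x, y, z)
Vec3 = Tuple[float, float, float]         # position in metres

@dataclass
class Pose:
    orientation: Quat            # always available (e.g. from an IMU)
    position: Optional[Vec3]     # None when positional tracking is lost

def fused_pose(orientation: Quat, position: Vec3,
               confidence: float, threshold: float = 0.5) -> Pose:
    """Return a 6DOF pose while positional tracking is trusted,
    otherwise fall back to 3DOF (orientation only)."""
    if confidence >= threshold:
        return Pose(orientation, position)   # full 6DOF
    return Pose(orientation, None)           # 3DOF fallback

# A confident frame keeps position; a low-confidence frame drops it.
good = fused_pose((1.0, 0.0, 0.0, 0.0), (0.1, 1.6, -0.3), confidence=0.9)
bad = fused_pose((1.0, 0.0, 0.0, 0.0), (0.1, 1.6, -0.3), confidence=0.2)
```

The judder I saw would correspond to the renderer suddenly losing `position` between frames — which is why shipping headsets try hard never to cross that threshold.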
There's potential for augmented reality here too, although there's an inherent latency in many camera passthrough systems that would need to be solved first.
Eonite is now ready to show the world what it's been building, and Helmy tells me that in the first quarter of the year it will announce a partner it's working with to bring a headset to market. Eonite, however, won't be making any headsets of its own.
It's targeting tethered PC headsets first, but plans to move to mobile in the near future — and is even looking at the potential of this vision tech for drones and robots. Screw the robots — they can wait — VR needs this right now.