Working with an IMU for Visual-Inertial Simultaneous Localization and Mapping

I am close to releasing my Augmented Reality headset to the general public. Before that, though, the system needed to provide amazing experiences. One of the cool experiences you can get with AR happens when the device knows its own motion in 3D space: it lets you rotate or move around a virtual object that stays fixed in your reality.

This is the major feature of both ARKit and ARCore, which bring AR experiences to iOS and Android phones respectively.

There are still some challenges to tackle in AR, like occlusion, but we will talk about that later.

I developed the same feature for Lynx. It uses visual information from one of the cameras and inertial information from the same kind of sensor shown in the video. By combining visual and inertial data, we get a very robust tracking system, ready to provide great experiences.
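To give a feel for why the two sensors complement each other, here is a minimal sketch (not the actual Lynx pipeline, and deliberately simplified to a single yaw angle): the gyroscope gives smooth, high-rate motion but drifts over time, while the camera gives occasional drift-free fixes. A complementary filter blends the two; the function name, rates, and blending factor below are all illustrative assumptions.

```python
def fuse_yaw(gyro_rates, visual_yaws, dt=0.01, alpha=0.98):
    """Toy 1-D visual-inertial fusion via a complementary filter.

    gyro_rates  -- angular rate (rad/s) at each time step
    visual_yaws -- yaw fix (rad) at each step, or None when the
                   camera has no measurement
    alpha       -- weight on the inertial prediction; (1 - alpha)
                   goes to the visual correction
    """
    yaw = 0.0
    estimates = []
    for rate, vis in zip(gyro_rates, visual_yaws):
        yaw += rate * dt                      # inertial prediction (drifts)
        if vis is not None:                   # visual correction (drift-free)
            yaw = alpha * yaw + (1 - alpha) * vis
        estimates.append(yaw)
    return estimates
```

With gyro data alone the estimate accumulates any bias forever; even sparse visual fixes pull it back toward the truth. Real VI-SLAM systems do this in 6 degrees of freedom, typically with an extended Kalman filter or nonlinear optimization rather than a fixed blend.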


© 2019 Stan Larroque. All rights reserved.