Abstract:
In this paper we present an augmented reality binocular system that allows long-range, high-precision augmentation of live telescopic imagery with aerial and terrain-based synthetic objects, vehicles, people, and effects. The inserted objects must appear stable in the display and must not jitter or drift as the user pans around and examines the scene with the binoculars. The design of the system is based on two cameras, one with a wide-field-of-view lens and one with a narrow-field-of-view lens, enclosed in a binocular-shaped shell. The wide field of view provides context and enables the 3D position and orientation of the binoculars to be recovered much more robustly, whereas the narrow field of view is used for the actual augmentation as well as to increase tracking precision. We present our navigation algorithm, which combines the two cameras with an IMU and GPS in an Extended Kalman Filter (EKF) to provide jitter-free, robust, real-time pose estimation for precise augmentation. We have demonstrated successful use of our system as part of a live simulated training system for observer training, in which fixed- and rotary-wing aircraft, ground vehicles, and weapon effects are combined with real-world scenes.
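
To make the fusion scheme concrete, the following is a minimal sketch of an EKF predict/update loop in Python. It assumes a simplified state of 3D position and velocity with a position-only measurement; the filter described in the paper additionally carries orientation and fuses the wide- and narrow-field-of-view cameras, the IMU, and GPS. All matrix values and names below are illustrative placeholders, not the paper's actual implementation.

    import numpy as np

    class ExtendedKalmanFilter:
        def __init__(self, x0, P0):
            self.x = x0          # state estimate
            self.P = P0          # state covariance

        def predict(self, f, F, Q):
            """Propagate the state with a motion model f (e.g. IMU integration)."""
            self.x = f(self.x)
            self.P = F @ self.P @ F.T + Q

        def update(self, z, h, H, R):
            """Correct the state with a measurement z (e.g. GPS or a camera-derived pose)."""
            y = z - h(self.x)                       # innovation
            S = H @ self.P @ H.T + R                # innovation covariance
            K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(len(self.x)) - K @ H) @ self.P

    # Example: constant-velocity prediction followed by a GPS-style position update.
    dt = 0.01
    F = np.block([[np.eye(3), dt * np.eye(3)],
                  [np.zeros((3, 3)), np.eye(3)]])
    H = np.hstack([np.eye(3), np.zeros((3, 3))])    # measurement observes position only

    ekf = ExtendedKalmanFilter(x0=np.zeros(6), P0=np.eye(6))
    ekf.predict(f=lambda x: F @ x, F=F, Q=1e-3 * np.eye(6))
    ekf.update(z=np.array([1.0, 2.0, 0.5]), h=lambda x: H @ x,
               H=H, R=1e-2 * np.eye(3))

In the full system, each sensor (wide camera, narrow camera, IMU, GPS) would contribute its own measurement model h and noise covariance R, with the narrow-field-of-view camera supplying the high-precision corrections that keep inserted objects stable in the display.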