Wide-Area Scene Mapping for Mobile Visual Tracking
Jonathan Ventura and Tobias Höllerer
University of California, Santa Barbara
Abstract
We propose a system for modeling an outdoor environment with an omnidirectional camera and then continuously estimating a camera phone's position with respect to that model. Panoramas captured from several viewpoints are combined to create a point cloud model offline. Live camera pose tracking is then initialized by feature point matching and continuously updated by aligning the point cloud model to the camera image. Our evaluation shows that minimal user effort is required to initialize a tracking session in an unprepared urban environment. In contrast to camera-based simultaneous localization and mapping (SLAM) systems, our method is suited to handheld use in large outdoor spaces. The vision-based system enables mobile augmented reality applications with better positional accuracy and a higher update rate than satellite-based positioning (GPS) can provide.
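To make the initialization step concrete: matching features in the live camera image against descriptors stored with the point cloud yields 2D-3D correspondences, from which the camera pose can be recovered robustly. The sketch below, in Python with OpenCV, illustrates this general approach only; the ORB features, the model layout (`model_points`, `model_descriptors`), and all parameter values are illustrative assumptions, not the actual implementation described in the publications.

```python
import numpy as np
import cv2

# Hypothetical point-cloud model layout (assumption, for illustration):
#   model_points:      (N, 3) float32 array of 3D points in world coordinates
#   model_descriptors: (N, 32) uint8 array of binary descriptors, one per point

def localize_frame(frame_gray, model_points, model_descriptors, K, dist_coeffs):
    """Estimate camera pose for one image via 2D-3D feature matching + PnP."""
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None

    # Match image descriptors against the model (Hamming distance for ORB).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, model_descriptors)
    if len(matches) < 6:
        return None  # too few correspondences for a reliable pose

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    world_pts = np.float32([model_points[m.trainIdx] for m in matches])

    # Robust pose estimation: RANSAC rejects outlier matches.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        world_pts, image_pts, K, dist_coeffs,
        iterationsCount=200, reprojectionError=4.0)
    if not ok or inliers is None or len(inliers) < 6:
        return None
    return rvec, tvec, inliers
```

The continuous tracking stage would then refine this pose on each subsequent frame by re-aligning the model points to the image, rather than re-running the full matching step.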
Publications
- Ventura, J., and T. Höllerer, Outdoor Mobile Localization from Panoramic Imagery, International Symposium on Mixed and Augmented Reality, 2011.
- Ventura, J., and T. Höllerer, Wide-Area Scene Mapping for Mobile Visual Tracking, International Symposium on Mixed and Augmented Reality, 2012.
- Ventura, J., Wide-Area Visual Modeling and Tracking for Mobile Augmented Reality, Ph.D. Dissertation, Computer Science, UC Santa Barbara, 2012.
Code
Some of the code from this project is available under the BSD license on GitHub.
Dataset
- Village dataset: images