Computer Vision Group
Faculty of Informatics
Technical University of Munich


Large-Scale Direct SLAM for Omnidirectional Cameras

We propose a real-time, direct monocular SLAM method for omnidirectional or wide field-of-view fisheye cameras. Both tracking (direct image alignment) and mapping (pixel-wise distance filtering) are formulated directly for the unified omnidirectional model, which can describe central imaging devices with a field of view well above 150°. This is in stark contrast to existing direct mono-SLAM approaches like DTAM or LSD-SLAM, which operate on rectified images, limiting the field of view to well below 180°. Not only does this allow the system to observe, and reconstruct, a larger portion of the surrounding environment, it also makes it more robust to degenerate (rotation-only) movement. The two main contributions are (1) the formulation of direct image alignment for the unified omnidirectional model, and (2) a fast yet accurate approach to incremental stereo directly on distorted images. We evaluate our framework on real-world sequences taken with a 185° fisheye lens and compare it to a rectified and a piecewise-rectified approach.
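To make the unified omnidirectional model concrete, the following is a minimal NumPy sketch of its projection and unprojection functions. This is an illustration under common conventions, not the authors' implementation; fx, fy, cx, cy denote the usual intrinsics and xi the model's mirror parameter.

    import numpy as np

    def project(X, fx, fy, cx, cy, xi):
        # Unified omnidirectional model: the 3D point X = (x, y, z) is first
        # projected onto the unit sphere, then through a pinhole whose center
        # is shifted by xi along the optical axis.
        d = np.linalg.norm(X)          # distance to the camera center
        z = X[2] + xi * d              # shifted pinhole denominator
        return np.array([fx * X[0] / z + cx,
                         fy * X[1] / z + cy])

    def unproject(u, v, fx, fy, cx, cy, xi):
        # Inverse mapping: pixel (u, v) to the unit-length viewing ray on the
        # sphere. For xi > 0 the ray may point more than 90 degrees off-axis,
        # which is what enables fields of view above 180 degrees.
        mx = (u - cx) / fx
        my = (v - cy) / fy
        r2 = mx * mx + my * my
        eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (1.0 + r2)
        return np.array([eta * mx, eta * my, eta - xi])

For xi = 0 the model reduces to an ordinary pinhole camera. In direct image alignment as described above, pixels are warped between frames via unproject, a rigid-body motion, and project, taking the place of the pinhole warp used by LSD-SLAM.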

Video

Dataset

License

Unless stated otherwise, all data in the SLAM for Omnidirectional Cameras Dataset is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).

Related publications

Conference and Workshop Papers
2015
Large-Scale Direct SLAM for Omnidirectional Cameras (D. Caruso, J. Engel, D. Cremers), In International Conference on Intelligent Robots and Systems (IROS), 2015.


Informatik IX
Chair for Computer Vision & Artificial Intelligence

Boltzmannstrasse 3
85748 Garching

info@vision.in.tum.de