Computer Vision Group
TUM Department of Informatics
Technical University of Munich


Large-Scale Direct SLAM for Omnidirectional Cameras

We propose a real-time, direct monocular SLAM method for omnidirectional or wide field-of-view fisheye cameras. Both tracking (direct image alignment) and mapping (pixel-wise distance filtering) are directly formulated for the unified omnidirectional model, which can model central imaging devices with a field of view well above 150°. This is in stark contrast to existing direct mono-SLAM approaches like DTAM or LSD-SLAM, which operate on rectified images, limiting the field of view to well below 180°. Not only does this allow us to observe – and reconstruct – a larger portion of the surrounding environment, but it also makes the system more robust to degenerate (rotation-only) movement. The two main contributions are (1) the formulation of direct image alignment for the unified omnidirectional model, and (2) a fast yet accurate approach to incremental stereo directly on distorted images. We evaluate our framework on real-world sequences taken with a 185° fisheye lens, and compare it to a rectified and a piecewise rectified approach.
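As a rough illustration of why the unified omnidirectional model admits fields of view beyond 180°, the sketch below implements its well-known projection: a 3D point is first mapped onto the unit sphere, the projection centre is then shifted by the mirror parameter ξ along the optical axis, and a standard pinhole projection follows. The function name and the intrinsic values are hypothetical, chosen only for this example; they are not taken from the paper or the dataset.

```python
import numpy as np

def project_unified(X, fx, fy, cx, cy, xi):
    """Project a 3D point with the unified omnidirectional camera model.

    Steps: map X onto the unit sphere, shift the projection centre by xi
    along the optical axis, then apply a pinhole projection. Points whose
    shifted depth is non-positive fall outside the model's valid range.
    All parameter values used here are illustrative, not calibrated ones.
    """
    Xs = X / np.linalg.norm(X)   # map onto the unit sphere
    denom = Xs[2] + xi           # depth after shifting the centre by xi
    if denom <= 0:               # outside the model's valid field of view
        return None
    u = fx * Xs[0] / denom + cx
    v = fy * Xs[1] / denom + cy
    return np.array([u, v])
```

Note that for ξ > 0, a point with slightly negative z (i.e. behind the image plane, more than 90° off-axis) still has a positive shifted depth and projects to a finite pixel, which is exactly the property a pinhole (rectified) model cannot reproduce.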




Unless stated otherwise, all data in the SLAM for Omnidirectional Cameras Dataset is licensed under a Creative Commons 4.0 Attribution License (CC BY 4.0).

Related publications


Conference and Workshop Papers
Large-Scale Direct SLAM for Omnidirectional Cameras (D. Caruso, J. Engel and D. Cremers), In International Conference on Intelligent Robots and Systems (IROS), 2015. [bibtex] [pdf] [video]


Informatik IX
Chair of Computer Vision & Artificial Intelligence

Boltzmannstrasse 3
85748 Garching