We propose a novel edge-based visual–inertial fusion approach for tracking aggressive motions with real-time state estimates. At the front end, our system performs edge alignment, which estimates relative poses in the distance-transform domain; compared to the popular direct dense tracking, this yields a larger convergence basin and stronger resistance to changing lighting conditions or camera exposures. At the back end, a sliding-window optimization-based framework fuses visual and inertial measurements. We utilize efficient inertial measurement unit (IMU) preintegration and two-way marginalization to generate accurate and smooth estimates with limited computational resources. To increase robustness, we add an edge-alignment self-check and an IMU-aided external check. Extensive statistical analysis and comparisons verify the performance of our approach and its usability on resource-constrained platforms. Compared to state-of-the-art point-feature-based visual–inertial fusion methods, our approach achieves better robustness under extreme motions or low frame rates, at the expense of slightly lower accuracy in general scenarios. We release our implementation as open-source ROS packages.
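The core idea of edge alignment in the distance-transform domain can be sketched as follows: precompute the distance transform of the reference image's edge map (each pixel stores the distance to the nearest edge), then score a candidate pose by summing the distance-transform values at the warped locations of the current frame's edge pixels; minimizing this sum pulls edges into alignment. The sketch below is illustrative only — it uses a 2D pixel shift as a stand-in for the full SE(3) warp used in the paper, and the function name `edge_alignment_cost` is our own.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def edge_alignment_cost(edge_map_ref, edge_pixels_cur, shift):
    """Sum of distance-transform values at shifted edge pixels.

    edge_map_ref    : bool array, edge map of the reference image
    edge_pixels_cur : (N, 2) int array of (row, col) edge coordinates
                      detected in the current image
    shift           : (row, col) translation; a toy stand-in for the
                      full camera-pose warp
    """
    # Distance transform of the *non-edge* mask: each pixel holds the
    # Euclidean distance to the nearest reference edge (0 on edges).
    dt = distance_transform_edt(~edge_map_ref)

    # Warp current edge pixels by the candidate motion and clamp to bounds.
    warped = edge_pixels_cur + np.asarray(shift)
    h, w = edge_map_ref.shape
    warped = np.clip(warped, [0, 0], [h - 1, w - 1]).astype(int)

    # Smooth, wide-basin cost: far-off edges still receive a useful
    # gradient because dt grows with distance to the nearest edge.
    return dt[warped[:, 0], warped[:, 1]].sum()
```

Because the distance transform is defined everywhere (not just near edges), the cost decreases smoothly even for large initial misalignments, which is the source of the larger convergence basin relative to photometric dense tracking; it also depends only on edge geometry, not pixel intensities, hence the robustness to lighting and exposure changes.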
Autonomous Robots – Springer Journals
Published: Jul 1, 2017