
Improving multi-modal data fusion by anomaly detection



 


Publisher
Springer Journals
Copyright
Copyright © 2015 by Springer Science+Business Media New York
Subject
Engineering; Robotics and Automation; Artificial Intelligence (incl. Robotics); Computer Imaging, Vision, Pattern Recognition and Graphics; Control, Robotics, Mechatronics
ISSN
0929-5593
eISSN
1573-7527
DOI
10.1007/s10514-015-9431-6

Abstract

If we aim for autonomous navigation of a mobile robot, proper state estimation of its position and orientation is essential. We previously designed a multi-modal data fusion algorithm that combines visual, laser-based, inertial, and odometric modalities in order to achieve a robust solution to the general localization problem in challenging Urban Search and Rescue environments. Since different sensory modalities are prone to errors of a different nature, and their reliability varies greatly as the environment changes dynamically, we investigated further means of improving the localization. The common practice in EKF-based solutions such as ours is a standard statistical test of the observations, or of their corresponding filter residuals, performed to reject anomalous data that deteriorate the filter performance. In this paper we show how important it is to handle anomalous visual and laser residuals properly, especially in multi-modal data fusion systems where the frequency of incoming observations varies significantly across the modalities. In practice, the most complicated part is to correctly identify the actual anomalies that are to be rejected, and this is where our major contribution lies. We go beyond the standard statistical tests by exploring different state-of-the-art machine learning approaches and exploiting our rich dataset, which we share with the robotics community. We demonstrate the implications of our research both indoors (with a precise reference from a Vicon system) and in a challenging outdoor environment. Finally, we show that monitoring the health of the observations in Kalman filtering is something that is often overlooked, but it definitely should not be.
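
The "standard statistical test" of EKF residuals mentioned in the abstract is, in common practice, a chi-square gate on the squared Mahalanobis distance of the measurement innovation. The sketch below illustrates that generic gating technique only; it is not the paper's implementation, and the function and variable names (passes_innovation_gate, innovation, S, alpha) are hypothetical.

    # Minimal sketch of chi-square gating of an EKF innovation (generic
    # technique, not the authors' code). Names here are illustrative only.
    import numpy as np
    from scipy.stats import chi2

    def passes_innovation_gate(innovation, S, alpha=0.01):
        """Return True if the measurement residual is statistically plausible.

        innovation : residual z - H @ x_pred, shape (m,)
        S          : innovation covariance H @ P @ H.T + R, shape (m, m)
        alpha      : significance level of the chi-square test
        """
        m = innovation.shape[0]
        # Squared Mahalanobis distance of the residual
        d2 = float(innovation.T @ np.linalg.solve(S, innovation))
        # Reject the observation if d2 exceeds the chi-square quantile
        gate = chi2.ppf(1.0 - alpha, df=m)
        return d2 <= gate

In a multi-modal setup like the one described, where visual and laser observations arrive at very different rates, such a gate would typically be evaluated per modality, with the significance level tuned separately for each sensor stream; the paper's machine-learning approaches go beyond this simple threshold test.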

Journal

Autonomous Robots (Springer Journals)

Published: Jan 21, 2015
