We aim to develop autonomous miniature hovering robots capable of navigating in unstructured, GPS-denied environments. A major challenge is the miniaturization of the embedded sensors and processors that allow such platforms to fly by themselves. In this paper, we propose a novel ego-motion estimation algorithm for hovering robots equipped with inertial and optic-flow sensors that runs in real time on a microcontroller and enables autonomous flight. Unlike many vision-based methods, this algorithm does not rely on feature tracking, structure estimation, additional distance sensors, or assumptions about the environment. We introduce the translational optic-flow direction constraint, which uses the direction of the optic flow, but not its scale, to correct for inertial sensor drift during changes of direction. This solution requires much simpler electronics and sensors than comparable methods and works in environments of any geometry. We describe the implementation and performance of the method on a hovering robot equipped with eight 0.65 g optic-flow sensors, and show that it can be used for closed-loop control of various motions.
Autonomous Robots – Springer Journals
Published: Sep 15, 2015
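The core idea in the abstract — correcting inertial drift from the *direction* of translational optic flow, whose magnitude depends on unknown scene distance and is therefore discarded — can be sketched as follows. This is a minimal illustration, not the paper's actual filter: the sensor geometry, gain, and update rule are illustrative assumptions. It uses the standard fact that a sensor looking along unit vector `d` on a translating (non-rotating) platform sees flow proportional to `-(v - (v·d)d) / D`, so the flow direction is independent of the distance `D`.

```python
# Hedged sketch of a direction-only optic-flow correction: the measured
# flow direction (never its magnitude) nudges a drifting velocity estimate.
# All geometry and gains below are illustrative assumptions, not the
# paper's implementation.
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def scale(a, s): return [x * s for x in a]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return math.sqrt(dot(a, a))

def normalize(a):
    n = norm(a)
    return [x / n for x in a] if n > 1e-9 else [0.0, 0.0, 0.0]

def predicted_flow_direction(v, d):
    """Translational flow seen along unit viewing direction d is
    proportional to -(v - (v.d) d); its direction is distance-free."""
    return normalize(scale(sub(v, scale(d, dot(v, d))), -1.0))

def direction_update(v, sensor_dirs, measured_dirs, gain=0.1):
    """Correct the velocity estimate v using only flow *directions*.
    The predicted direction varies as -v_perp, so stepping v against the
    direction mismatch reduces it; |v| keeps the step scale-free."""
    for d, u_meas in zip(sensor_dirs, measured_dirs):
        err = sub(u_meas, predicted_flow_direction(v, d))
        v = sub(v, scale(err, gain * norm(v)))
    return v
```

Iterating `direction_update` with a downward-looking sensor and a drifted estimate pulls the estimate's direction toward the true velocity direction; the speed itself stays unobservable from direction alone, which is why the paper pairs this constraint with inertial integration.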