
Spartacus attending the 2005 AAAI conference




Publisher
Springer Journals
Copyright
Copyright © 2006 by Springer Science+Business Media, LLC
Subject
Engineering; Robotics and Automation; Artificial Intelligence (incl. Robotics); Computer Imaging, Vision, Pattern Recognition and Graphics; Control, Robotics, Mechatronics
ISSN
0929-5593
eISSN
1573-7527
DOI
10.1007/s10514-006-9014-7

Abstract

Spartacus is our robot entry in the 2005 AAAI Mobile Robot Challenge, in which a robot attends the National Conference on Artificial Intelligence. Designing robots capable of interacting with humans in real-life settings can be considered the ultimate challenge for intelligent autonomous systems. One key issue is the integration of multiple modalities (e.g., mobility, physical structure, navigation, vision, audition, dialogue, reasoning). Such integration increases both the diversity and the complexity of the interactions the robot can generate. It also makes it difficult to monitor how these increased capabilities are used in unconstrained conditions, whether while the robot is in operation or afterwards. This paper reports solutions and findings resulting from our hardware, software, and decisional integration work on Spartacus. It also outlines perspectives on making the intelligence and interaction capabilities of autonomous robots evolve.

Journal

Autonomous Robots (Springer Journals)

Published: Dec 27, 2006
