V. Ng-Thow-Hing, K. Thórisson, Ravi Sarvadevabhatla, J. Wormer, T. List (2009)
Cognitive map architecture. IEEE Robotics & Automation Magazine, 16
Sandra Okita, V. Ng-Thow-Hing, Ravi Sarvadevabhatla (2009)
Learning together: ASIMO developing an interactive learning partnership with children. RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication
Bilge Mutlu, Steven Osman, J. Forlizzi, J. Hodgins, S. Kiesler (2006)
Task Structure and User Attributes as Elements of Human-Robot Interaction Design. ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication
C. Bartneck, J. Forlizzi (2004)
Shaping human-robot interaction: understanding the social aspects of intelligent robotic products
S. Gelman, G. Gottfried (1996)
Children's causal explanations of animate and inanimate motion. Child Development, 67
V. Ng-Thow-Hing, K. Thórisson, Ravi Sarvadevabhatla, J. Wormer, T. List (2009)
Cognitive Map Architecture: Facilitation of Human–Robot Interaction in Humanoid Robots
Ravi Sarvadevabhatla, V. Ng-Thow-Hing, Sandra Okita (2010)
Extended duration human-robot interaction: Tools and analysis. 19th International Symposium in Robot and Human Interactive Communication
P. Andersen (1998)
Nonverbal Communication: Forms and Functions
A. Andrew (2003)
Humans and Automation: System Design and Research Issues. Kybernetes, 32
George Ray, Kory Floyd (2006)
Nonverbal Expressions of Liking and Disliking in Initial Interaction: Encoding and Decoding Perspectives. Southern Communication Journal, 71
K. Wada, T. Shibata, Tomoko Saito, K. Tanie (2002)
Analysis of factors that bring mental effects to elderly people in robot assisted activity. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2
T. Sheridan (2002)
Humans and Automation: System Design and Research Issues
Hanako Yoshida, Linda Smith (2008)
What's in View for Toddlers? Using a Head Camera to Study Visual Experience. Infancy: The Official Journal of the International Society on Infant Studies, 13(3)
T. Kanda, Takayuki Hirano, Daniel Eaton, H. Ishiguro (2004)
Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial. Human–Computer Interaction, 19
K. Dautenhahn, I. Werry (2002)
A quantitative technique for analysing robot-human interactions. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2
M. Poh, N. Swenson, Rosalind Picard (2010)
A Wearable Sensor for Unobtrusive, Long-Term Assessment of Electrodermal Activity. IEEE Transactions on Biomedical Engineering, 57
Ravi Sarvadevabhatla, V. Ng-Thow-Hing (2009)
Panoramic attention for humanoid robots. 2009 9th IEEE-RAS International Conference on Humanoid Robots
Sandra Okita, Daniel Schwartz (2006)
Young Children's Understanding of Animacy and Entertainment Robots. Int. J. Humanoid Robotics, 3
(2000)
ASIMO year 2000 model
Kris Hauser, V. Ng-Thow-Hing (2011)
Randomized multi-modal motion planning for a humanoid robot manipulation task. The International Journal of Robotics Research, 30
K. Scherer, James Oshinsky (1977)
Cue utilization in emotion attribution from auditory stimuli. Motivation and Emotion, 1
A. Arsénio (2004)
Children, Humanoid Robots and Caregivers
B. Robins, K. Dautenhahn, R. Boekhorst, A. Billard (2004)
Effects of repeated exposure to a humanoid robot on children with autism
Kris Hauser, V. Ng-Thow-Hing, H. González-Baños (2007)
Multi-modal Motion Planning for a Humanoid Robot Manipulation Task
(2010)
Received December
V. Ng-Thow-Hing, Pengcheng Luo, Sandra Okita (2010)
Synchronized gesture and speech production for humanoid robots. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems
Multimodal Approach to Affective Human-Robot Interaction Design with Children
SANDRA Y. OKITA, Teachers College, Columbia University
VICTOR NG-THOW-HING and RAVI K. SARVADEVABHATLA, Honda Research Institute

Two studies examined different features of humanoid robots and their influence on children's affective behavior. The first study looked at interaction styles and general features of robots. The second study looked at how the robot's attention influences children's behavior and engagement. Through activities familiar to young children (e.g., table setting, storytelling), the first study found that a cooperative interaction style elicited more oculesic behavior and social engagement. The second study found that the quality of attention, the type of attention, and the length of interaction influence affective behavior and engagement. For quality of attention, Wizard-of-Oz (WOZ) control elicited the most affective behavior, but automatic attention worked as well as WOZ when the interaction was short. Shifting the type of attention from nonverbal to verbal increased children's oculesic behavior, utterances, and physiological response. Affective interactions did not seem to depend on a single mechanism, but on a well-chosen confluence of technical features.

Categories and Subject Descriptors: H.5.2 [Information Interfaces and Presentation]: User Interfaces—Evaluation/methodology; theory and methods
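The abstract's "quality of attention" manipulation contrasts a Wizard-of-Oz condition (a hidden human operator steers the robot's attention) with fully automatic attention. A minimal sketch of how such a mode switch might be wired is shown below; this is a hypothetical illustration, not the authors' implementation, and all names (`AttentionMode`, `choose_gaze_target`, the target strings) are invented for clarity.

```python
from enum import Enum

class AttentionMode(Enum):
    AUTOMATIC = "automatic"     # on-board perception drives the robot's gaze
    WIZARD_OF_OZ = "woz"        # hidden human operator drives the gaze

def choose_gaze_target(mode, operator_target, detected_target):
    """Pick the robot's attention target under the current experimental mode.

    In WOZ mode the operator's choice overrides perception when one is
    given; otherwise the robot falls back to what it detected itself.
    """
    if mode is AttentionMode.WIZARD_OF_OZ and operator_target is not None:
        return operator_target
    return detected_target

# Usage: the operator directs attention to the child even though the
# robot's own perception locked onto the table.
print(choose_gaze_target(AttentionMode.WIZARD_OF_OZ, "child", "table"))  # child
print(choose_gaze_target(AttentionMode.AUTOMATIC, "child", "table"))     # table
```

The design choice the study probes is exactly this fallback: when interactions are short, the automatic branch performed comparably to the operator-driven one.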
ACM Transactions on Interactive Intelligent Systems (TiiS) – Association for Computing Machinery
Published: Oct 1, 2011