This paper introduces a daily-partner robot that is aware of the user’s situation through gaze and utterance detection. For appropriate anthropomorphic interaction, the robot should talk to the user at the proper time without interrupting her/his task. The proposed robot 1) estimates the user’s context (the target of her/his speech) by detecting her/his gaze and utterances, 2) expresses its need to speak through silent gaze turns towards the user and the object of joint attention (speech-implying behaviour), and 3) delivers its message when the user talks to the robot. Based on preliminary results showing that humans are sufficiently sensitive to the robot’s speech-implying behaviours, we evaluate the proposed behavioural model. The results show that this crossmodal awareness enables respectful communication: silent behaviours effectively convey the robot’s intention to speak and draw the user’s attention without disturbing the user’s ongoing task.
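The three-step behavioural model described in the abstract can be read as a simple decision rule: stay silent when there is nothing to say, imply speech silently while the user has not addressed the robot, and deliver the message only once the user talks to the robot. The sketch below is a minimal illustration of that reading, not the paper’s implementation; the function and parameter names are hypothetical, and the upstream gaze/utterance detection that produces the inputs is assumed.

```python
from enum import Enum, auto

class Action(Enum):
    STAY_SILENT = auto()      # no pending message
    IMPLY_SPEECH = auto()     # silent gaze turns between user and object
    DELIVER_MESSAGE = auto()  # speak the pending message

def decide_action(has_message: bool, user_addresses_robot: bool) -> Action:
    """Hypothetical decision rule for the abstract's three-step model.

    The user's context (whether her/his speech targets the robot) is
    assumed to be estimated upstream from gaze and utterance detection.
    """
    if not has_message:
        return Action.STAY_SILENT
    if user_addresses_robot:
        # Step 3: the message is told only when the user talks to the robot.
        return Action.DELIVER_MESSAGE
    # Step 2: express the need to speak without interrupting the task.
    return Action.IMPLY_SPEECH
```

For example, a pending message while the user is still absorbed in the task yields `IMPLY_SPEECH`, so the ongoing task is never interrupted by speech.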
International Journal of Autonomous and Adaptive Communications Systems – Inderscience Publishers
Published: Jan 1, 2012