REFERENCES
S. Stellmach, R. Dachselt (2012). Investigating gaze-supported multimodal pan and zoom. Proceedings of the Symposium on Eye Tracking Research and Applications.
J. Turner, A. Bulling, J. Alexander, H. Gellersen (2014). Cross-device gaze-supported point-to-point content transfer. Proceedings of the Symposium on Eye Tracking Research and Applications.
M. Vidal, A. Bulling, H. Gellersen (2013). Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets. Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing.
J.-D. Fekete, N. Elmqvist, Y. Guiard (2009). Motion-pointing: target selection using elliptical motions. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
D. Mould, C. Gutwin (2004). The effects of feedback on targeting with multiple moving targets.
H. Drewes, A. Schmidt (2007). Interacting with the computer using gaze gestures.
PURSUITS
Mélodie Vidal, Lancaster University
Andreas Bulling, Max Planck Institute for Informatics
Hans Gellersen, Lancaster University

Although gaze is an attractive modality for pervasive interaction, real-world implementation of eye-based interfaces poses significant challenges. In particular, user calibration is tedious and time consuming. Pursuits is an innovative interaction technique that enables truly spontaneous interaction with eye-based interfaces. A user can simply walk up to the screen and readily interact with moving targets. Instead of being based on gaze location, Pursuits correlates eye pursuit movements with objects dynamically moving on the interface.

EYES FOR INTERACTION

[Figure 1. Pursuits matches the user's eye movement with the movement of on-screen objects.]

Gaze holds great promise as an input modality because it indicates where our visual attention is directed. It is particularly promising for interacting with the increasing number of out-of-reach displays, because our eyes naturally point at what we are interested in. As a result, eye tracking has attracted growing interest for interaction. For example, Stellmach and Dachselt researched the use of the eyes to pan and zoom on maps [4], and Turner et al. looked into using the eyes to select out-of-reach content on a display and move
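The core idea of correlating eye movement with object movement can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes gaze samples and target positions recorded at the same instants, uses Pearson correlation on the horizontal and vertical components independently, and the function names, window handling, and the 0.8 threshold are illustrative choices.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    if var_a == 0 or var_b == 0:
        return 0.0
    return cov / math.sqrt(var_a * var_b)

def select_target(gaze, targets, threshold=0.8):
    """Return the name of the moving target whose trajectory best
    correlates with the gaze trajectory, or None if nothing matches.

    gaze:    list of (x, y) gaze samples over a time window
    targets: dict mapping target name -> list of (x, y) positions
             sampled at the same instants as the gaze data
    """
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best, best_score = None, threshold
    for name, traj in targets.items():
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Correlate horizontal and vertical components separately;
        # the eyes must follow the object on both axes for a match.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best, best_score = name, score
    return best
```

Because Pearson correlation is invariant to offset and scale, no gaze-to-screen calibration is needed: even a shifted, uncalibrated gaze signal that follows a circling target correlates strongly with that target's trajectory and weakly with every other.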
ACM SIGMOBILE Mobile Computing and Communications Review – Association for Computing Machinery
Published: Jan 13, 2015