Physical Activity Recognition using Multiple Sensors Embedded in a Wearable Device

YUNYOUNG NAM, Ajou University, Suwon, South Korea
SEUNGMIN RHO and CHULUNG LEE, Korea University

In this article, we present a wearable intelligent device for activity-monitoring applications. We developed and evaluated algorithms that recognize physical activities from data acquired by a 3-axis accelerometer and a single body-worn camera. Recognition proceeds in two steps: first, features characterizing a human activity are measured by the 3-axis accelerometer and the image sensor embedded in the wearable device; then, the physical activity corresponding to the measured features is determined by applying an SVM classifier. From the accelerometer, the correlation between axes and the magnitude of the FFT are computed as additional features of an activity, and the acceleration data are classified into nine activity labels. From the image sensor, multiple optical flow vectors computed on each grid image patch are extracted as features for defining an activity. In our experiments, the overall activity-recognition accuracy of this method was 92.78%.

Categories and Subject Descriptors: G.3 [Probability and Statistics]: reliability and life testing; J.3 [Life and Medical Sciences]: health
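The accelerometer features named in the abstract (correlation between axes and FFT magnitude) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function name `accel_features`, the window length, and the number of FFT bins are all illustrative choices.

```python
import numpy as np

def accel_features(window, n_fft_bins=8):
    """Build a feature vector for one window of 3-axis accelerometer samples.

    window: (N, 3) array of (x, y, z) readings.
    Returns pairwise axis correlations plus per-axis FFT magnitude bins,
    mirroring the feature types described in the abstract. The bin count
    and layout are assumptions, not taken from the paper.
    """
    window = np.asarray(window, dtype=float)

    # Pairwise Pearson correlation between the three axes (xy, xz, yz).
    corr = np.corrcoef(window.T)                # 3x3 symmetric matrix
    corr_feats = corr[np.triu_indices(3, k=1)]  # upper triangle, 3 values

    # Magnitude of the FFT of each axis, averaged into coarse bins.
    mags = np.abs(np.fft.rfft(window, axis=0))         # (N//2 + 1, 3)
    bins = np.array_split(mags, n_fft_bins, axis=0)
    fft_feats = np.concatenate([b.mean(axis=0) for b in bins])

    return np.concatenate([corr_feats, fft_feats])
```

Such vectors, one per sliding window, would then be fed to an SVM classifier (the paper reports nine activity labels) alongside the grid-patch optical-flow features from the camera.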
ACM Transactions on Embedded Computing Systems (TECS) – Association for Computing Machinery
Published: Feb 1, 2013