Access the full text.
Sign up today, get DeepDyve free for 14 days.
J. Susskind, G. Littlewort, M. Bartlett, J. Movellan, A. Anderson (2007)
Human and computer recognition of facial expressions of emotionNeuropsychologia, 45
Kuan-Yu Lin, Seraphina Yong, Shuo-Ping Wang, Chien-Tung Lai, Hao-Chuan Wang (2016)
HandVis: Visualized Gesture Support for Remote Cross-Lingual CommunicationProceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
Lam Phi, Hung Nguyen, T. Bui, T. Vu (2015)
A glove-based gesture recognition system for Vietnamese sign language2015 15th International Conference on Control, Automation and Systems (ICCAS)
S. Mitra, T. Acharya (2007)
Gesture Recognition: A SurveyIEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 37
David Kim, Otmar Hilliges, S. Izadi, Alex Butler, Jiawen Chen, I. Oikonomidis, P. Olivier (2012)
Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensorProceedings of the 25th annual ACM symposium on User interface software and technology
Yannick Bernaerts, Matthias Druwé, Sebastiaan Steensels, Jo Vermeulen, Johannes Schöning (2014)
The office smartwatch: development and design of a smartwatch app to digitally augment interactions in an office environment
Aya Onishi, Chiemi Watanabe (2011)
Event Detection using Archived Smart House Sensor Data obtained using Symbolic Aggregate Approximation
Jessica Lin, Eamonn Keogh, Li Wei, S. Lonardi (2007)
Experiencing SAX: a novel symbolic representation of time seriesData Mining and Knowledge Discovery, 15
Y. Rabhi, M. Mrabet, F. Fnaiech (2018)
Intelligent Control Wheelchair Using a New Visual JoystickJournal of Healthcare Engineering, 2018
Xin Dang, Bingbing Kang, Xu-Yang Liu, Guangyu Cui (2017)
An Interactive Care System Based on a Depth Image and EEG for Aged Patients with DementiaJournal of Healthcare Engineering, 2017
A. Queirós, A. Silva, Joaquim Alvarelhão, N. Rocha, António Teixeira (2013)
Usability, accessibility and ambient-assisted living: a systematic literature reviewUniversal Access in the Information Society, 14
Enrique Garcia-Ceja, R. Brena, J. Carrasco-Jiménez, Leonardo Garrido (2014)
Long-Term Activity Recognition from Wristwatch Accelerometer DataSensors (Basel, Switzerland), 14
Wonkyu Park, S. Han (2014)
An Analytical Approach to Creating Multitouch Gesture Vocabularies in Mobile Devices: A Case Study for Mobile Web Browsing GesturesInternational Journal of Human–Computer Interaction, 30
M. Shoaib, S. Bosch, H. Scholten, P. Havinga, Özlem Incel (2015)
Towards detection of bad habits by fusing smartphone and smartwatch sensors2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)
L. Zhong J. Liu (2009)
uWave: accelerometer-based personalized gesture recognition and its applications,Pervasive and Mobile Computing, J. Liu, L. Zhong, J. Wickramasuriya, and V. Vasudevan, “uWave: accelerometer-based personalized gesture recognition and its applications,” Pervasive and Mobile Computing, vol. 5, no. 6, pp. 657–675, 2009. View at Publisher · View at Google Scholar · View at Scopus
Lisa Anthony, J. Wobbrock (2010)
A lightweight multistroke recognizer for user interface prototypes
Juan Colomer, D. Salvi, M. Cabrera-Umpiérrez, M. Arredondo, Patricia Abril-Jiménez, Viveca Jimenez-Mixco, R. García-Betances, A. Fioravanti, M. Pastorino, J. Cancela, Alejandro Gil (2014)
Experience in Evaluating AAL Solutions in Living LabsSensors (Basel, Switzerland), 14
Andreas Holzinger, Gig Searle, Michaela Wernbacher (2011)
The effect of previous exposure to technology on acceptance and its importance in usability and accessibility engineeringUniversal Access in the Information Society, 10
D. Salvi, Juan Colomer, M. Arredondo, Barbara Prazak-Aram, Christopher Mayer (2015)
A framework for evaluating Ambient Assisted Living technologies and the experience of the universAAL projectJ. Ambient Intell. Smart Environ., 7
A. Clark, Deshendran Moodley (2016)
A System for a Hand Gesture-Manipulated Virtual Reality Environment
Romina Fernandez, C. Lücken (2015)
Using the Kinect sensor with open source tools for the development of educational games for kids in pre-school age2015 Latin American Computing Conference (CLEI)
Y. Li (2010)
Protractor: a fast and accurate gesture recognizerProceedings of the SIGCHI Conference on Human Factors in Computing Systems
Johanna Höysniemi, Perttu Hämäläinen, Laura Turkki, Teppo Rouvi (2005)
Children's intuitive gestures in vision-based action gamesCommun. ACM, 48
B. Mourad, A. Tarik, Afdel Karim, E. Pascal (2015)
System Interactive Cyber Presence for E learning to Break Down Learner IsolationInternational Journal of Computer Applications, 111
C. Rishikanth, H. Sekar, Gautham Rajagopal, R. Rajesh, Vineeth Vijayaraghavan (2014)
Low-cost intelligent gesture recognition engine for audio-vocally impaired individualsIEEE Global Humanitarian Technology Conference (GHTC 2014)
L. Dipietro, A. Sabatini, P. Dario (2008)
A Survey of Glove-Based Systems and Their ApplicationsIEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 38
Jessica Lin, Eamonn Keogh, S. Lonardi, B. Chiu (2003)
A symbolic representation of time series, with implications for streaming algorithms
Lisa Anthony, J. Wobbrock (2012)
$N-protractor: a fast and accurate multistroke recognizer
Ralph Mbouna, S. Kong, Myung-Geun Chun (2013)
Visual Analysis of Eye State and Head Pose for Driver Alertness MonitoringIEEE Transactions on Intelligent Transportation Systems, 14
Radu-Daniel Vatavu, Lisa Anthony, J. Wobbrock (2012)
Gestures as point clouds: a $P recognizer for user interface prototypes
Mathias Wilhelm, Daniel Krakowczyk, Frank Trollmann, S. Albayrak (2015)
eRing: multiple finger gesture recognition with one ring using an electric fieldProceedings of the 2nd international Workshop on Sensor-based Activity Recognition and Interaction
Lei Jing, Zixue Cheng, Yinghui Zhou, Junbo Wang, Tongjun Huang (2013)
Magic Ring: a self-contained gesture input device on fingerProceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia
Sven Kratz, M. Rohs (2011)
Protractor3D: a closed-form solution to rotation-invariant 3D gestures
Ilias Maglogiannis, C. Ioannou, P. Tsanakas (2016)
Fall detection and activity identification using wearable and hand-held devicesIntegr. Comput. Aided Eng., 23
Xu Zhang, Xiang Chen, Yun Li, V. Lantz, Kongqiao Wang, Jihai Yang (2011)
A Framework for Hand Gesture Recognition Based on Accelerometer and EMG SensorsIEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 41
Sven Kratz, M. Rohs (2010)
A $3 gesture recognizer: simple gesture recognition for devices equipped with 3D acceleration sensors
J. Wobbrock, Andrew Wilson, Yang Li (2007)
Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypesProceedings of the 20th annual ACM symposium on User interface software and technology
Sen Zhang, M. Ang, Wendong Xiao, C. Tham (2009)
Detection of Activities by Wireless Sensors for Daily Life Surveillance: Eating and DrinkingSensors (Basel, Switzerland), 9
H. Sekar C. Rishikanth
Low-cost intelligent gesture recognition engine for audio-vocally impaired individuals,Proceedings of the 2014 IEEE Global Humanitarian Technology Conference (GHTC), C. Rishikanth, H. Sekar, G. Rajagopal, R. Rajesh, and V. Vijayaraghavan, “Low-cost intelligent gesture recognition engine for audio-vocally impaired individuals,” in Proceedings of the 2014 IEEE Global Humanitarian Technology Conference (GHTC), pp. 628–634, IEEE, San Jose, CA, USA, October 2014. View at Publisher · View at Google Scholar · View at Scopus
C. Dodd, R. Athauda, M. Adam (2017)
Designing User Interfaces for the Elderly: A Systematic Literature Review
L. Baraldi, F. Paci, G. Serra, L. Benini, R. Cucchiara (2015)
Gesture Recognition Using Wearable Vision Sensors to Enhance Visitors’ Museum ExperiencesIEEE Sensors Journal, 15
J. Brooke (1996)
SUS: A 'Quick and Dirty' Usability Scale
Jiayang Liu, Lin Zhong, Jehan Wickramasuriya, V. Vasudevan (2009)
uWave: Accelerometer-based personalized gesture recognition and its applications2009 IEEE International Conference on Pervasive Computing and Communications
Hindawi Journal of Healthcare Engineering Volume 2018, Article ID 3180652, 12 pages https://doi.org/10.1155/2018/3180652 Research Article An Easily Customized Gesture Recognizer for Assisted Living Using Commodity Mobile Devices Antigoni Mezari and Ilias Maglogiannis Department of Digital Systems, University of Piraeus, Piraeus, Greece Correspondence should be addressed to Ilias Maglogiannis; imaglo@gmail.com Received 31 March 2018; Revised 12 June 2018; Accepted 28 June 2018; Published 19 July 2018 Academic Editor: Jesus Fontecha Copyright © 2018 Antigoni Mezari and Ilias Maglogiannis. 'is is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Automatic gesture recognition is an important field in the area of human-computer interaction. Until recently, the main approach to gesture recognition was based mainly on real time video processing. 'e objective of this work is to propose the utilization of commodity smartwatches for such purpose. Smartwatches embed accelerometer sensors, and they are endowed with wireless communication capabilities (primarily Bluetooth), so as to connect with mobile phones on which gesture recognition algorithms may be executed. 'e algorithmic approach proposed in this paper accepts as the input readings from the smartwatch accel- erometer sensors and processes them on the mobile phone. As a case study, the gesture recognition application was developed for Android devices and the Pebble smartwatch. 'is application allows the user to define the set of gestures and to train the system to recognize them. 'ree alternative methodologies were implemented and evaluated using a set of six 3-D natural gestures. All the reported results are quite satisfactory, while the method based on SAX (Symbolic Aggregate approXimation) was proven the most efficient. track the motion. Recognition based on the visual channel is 1. Introduction currently the most widespread method of recognizing Gesture recognition refers to recognizing meaningful body gestures. 'e visual recording devices are usually installed at motions involving movements of the fingers, hands, arms, a fixed location and the gesture recognition is restricted in head, face, or body performed with the intent to convey confined space. Wearable devices used for visual recognition meaningful information or to interact with the environment include glasses camera [13] and wrist-worn device with [1]. 'is topic is considered extremely important in appli- infrared spectral camera (IR) [14]. cations based on smart and efficient human computer in- Recognizing the motion of the fingers is a special topic in terfaces. More specifically, gesture recognition applies to gesture recognition. It is used in sign language [4, 5] as well several computer applications, such as those involving as in virtual reality and robotics. Sensor devices such as young children interaction [2, 3], sign language recognition gloves with sensors [15, 16] and electromyogram sensors [4, 5], monitoring of physical activity or events involving (EMG) [4] are also used to capture finger movements. disabled persons or the elderly [6, 7], the medical monitoring A sensor ring is another wearable device that has been of the emotional state or level of stress [8], the navigation or proposed for recognizing finger gestures [17, 18]. 'e cre- manipulation in virtual environments [9], the communi- ation of gesture vocabularies to manipulate devices is also an cating in a teleconference [10], the distance learning interesting topic. Park and Han [19] propose an analytical [11], and the monitoring of driver alertness/drowsiness approach to the creation of multitouch control-gesture status [12]. vocabularies applicable to mobile devices. 'e implementation of computerized gesture recogni- A smartwatch equipped with accelerometer can provide tion requires the use of various visual or sensor devices to information about the movement of the hand and may be 2 Journal of Healthcare Engineering Table 1: Summary of gesture recognizers. used for recognizing gestures. 'e main advantage of using a smartwatch is that it does not impose restrictions to the Gesture recognizer Data dimensions Scaling Metric user. In this context, its use does not impose any restriction $1 2D Yes Euclidean in space, and the user is not forced to use a special-purpose Protractor 2D No Cosine sensor device, which would probably cause him some kind of $3 3D Yes Euclidean discomfort. Protractor3D 3D Yes Euclidean A number of prototype (proof-of-concept) gesture $N 2D Yes Euclidean recognition applications based on smartwatches or wrist- $N-protractor 2D Yes Cosine worn devices may be found in the literature. For instance, $P 2D Yes Other uWave 3D No DTW Bernaerts et al. in [20] propose a smartwatch application to allow a person to physically and virtually lock and unlock doors, to acquire room information, and to send virtual knocks. Zhang et al. [21] use wearable accelerometer sensors recognizer to manipulate 3-D data. Kratz and Rohs [27] attached to both wrists to detect eating and drinking ac- improved $3 recognizer by computing the fine-tune rotation tivities. Shoaib et al. [22] use two mobile phones to recognize angle using a method similar to the one used by Protractor. the user’s activity. One of the phones is attached on the user 'ey named the new recognizer Protractor3D. $N recog- wrist to simulate a smartwatch. 'e accelerometer and nizer [28] is built upon the $1 and is intended for the gyroscope measurements are utilized for the recognition of recognition of multistroke 2-D gestures. $N-protractor [29] 13 activities. Garcia-Ceja et al. [23] use acceleration data is a variation of $N that embeds the technique of Protractor. from a wristwatch in order to identify long-term activities. $P [30] belongs to the same recognizer family with $1 and 'e results may be used as an indicator of how independent $N. $P does not represent gestures as ordered series of points a person is and as a source of information to healthcare but as unordered point-clouds. Another gesture recognition intervention applications. method is uWave [31]. It uses the data of a three-axis ac- 'e aim of this work is to examine whether simple and celerometer. 'e time series of the accelerometer data is compressed by an averaging window and the new values are natural gestures can be reliably recognized using commodity devices like a smartwatch and a smartphone and propose nonlinearly quantized. uWave employs dynamic time specific methodologies to improve performance and accu- warping (DTW) to match two time series and the Euclidean racy. 'e rest of the paper is structured as following: in distance as the distance function. uWave adapts its templates Section 2, we provide background information on gesture to deal with the gesture variations through the time. Xie and recognition methods and we present the Pebble smartwatch Pan [32] aim to improve the accuracy of gesture recognition. used in this work. In Section 3, the proposed methodology is 'ey employ a low-pass filter to smooth the data and use described for developing the gesture recognition system, dynamic-threshold truncation to remove data recorded while in Section 4 use cases and the system evaluation with before the gesture actually starts and after the gesture ac- corresponding experimental results are presented. Finally, tually ends. To the produced time series, they append the Section 5 presents future plans and concludes the paper. amplitudes of its fast Fourier transform (the 21 first values). Table 1 summarizes the most prominent gesture recognizers. In this work, we propose the utilization of a commodity 2. Related Work and Background Information smartwatch as the motion sensor. Smartwatches are Several techniques for gesture recognition using motion equipped with a microdisplay, integrated sensors, and network connectivity. 'e Pebble smartwatch features sensors exist in the literature. Wobbrock et al. [24] de- veloped $1 recognizer, a 2-D unistroke recognizer designed a 3-axis accelerometer that produces integer data measured for rapid prototyping of gesture-based user interfaces. 'ey in milli-Gs. It is calibrated to measure a maximum accel- proposed a 4-step processing of the recorded path of the eration of ±4G. Accelerometer data can be received in gesture: (i) resample into a fixed number of points evenly batches, to save CPU time and battery life. 'e programmer spaced along the path, (ii) rotate the path so that the line can set the accelerometer sampling rate in one of the four from the centroid of the path to the first point of the path to valid values (10Hz, 25Hz, 50Hz, and 100Hz) and also the number of samples per batch. 'e communication with an be parallel to thex-axis, (iii) scale the path (nonuniformly) to a reference square, and (iv) move the path so that its centroid Android or iOs device is implemented using Bluetooth 4.0 (Bluetooth low energy) protocol. 'e programmer can re- is at (0,0). To compare the gesture with the templates, the mean value of the Euclidean distance between the corre- duce the sniff interval, the period during which the Blue- sponding points is computed. 'e authors propose the use of tooth module may not exchange (ACL) packets, if an app an iterative method to fine-tune the rotation angle. 'e requires reduced latency when sending messages. An open gesture is recognized as the template having the minimum software development kit (SDK) is available to pro- distance. Protractor [25] is another recognizer quite similar grammers. 'e SDK, referred to as PebbleKit, is available for to $1. 'e gesture is not scaled in this case, and the fine-tune smartphones running iOs or Android and allows the two- rotation angle is calculated so that the cosine distance be- way communication between the Pebble watch and the tween the two gestures minimizes. Protractor uses the in- smartphone. Adding to the above characteristics that the Pebble watch is available at low cost, less than 100€, it is best verse cosine distance as the similarity score. Kratz and Rohs [26] proposed $3 recognizer, a modification of the $1 suited for this research effort. Journal of Healthcare Engineering 3 hh hu (a) (b) hud hh2 (c) (d) ud hu2 (e) (f) Figure 1: e selected gestures; a dot depicts a gesture start. recognition and if possible contributes to their dierentia- tion, and they can be related to commands for the ma- nipulation of devices. e selected set is shown in Figure 1. e gesture “hh2” is twice the gesture “hh”, and the gesture “hu2” is twice the gesture “hu.” Nevertheless, the proposed system is �exible, and it can be trained to any kind of gesture dataset. e three axes along which acceleration is measured are bound to the watch. e accelerometer data received during a gesture include gravity and are also aected by the change of the orientation of the smartwatch during the movement. Data also contain noise. e use of tap event at –5 the beginning and the end of the gesture aects the ac- celerometer data measurements at those points. Since the –10 duration of a gesture varies and the accelerometer collects data at a constant rate, the number of samples for each –15 0 50 100 150 gesture diers. Figure 2 illustrates the raw accelerometer Sample measurements along the x-axis of the smartwatch. e €ve curves in the €gure correspond to 5 repetitions of the same Figure 2: Raw measurements of a gesture performed by the same gesture performed by one user. A heuristic algorithm to user 5 times. eliminate the eect of the tap event was used. Starting from the thirtieth measurement from the start and till the 3. Gesture Recognition Methodology thirtieth measurement to the end, the maximum dier- 3.1. Selected Gestures, Measurements, and Recognition ence between two successive measurements was de- Methods. e gestures used for the evaluation of the system termined, in order to be used as a threshold. Starting from were selected according to the following criteria: they are the thirtieth measurement from the start towards the €rst characterized by the wrist movement, they are simple and measurement, the dierence between two successive natural gestures and they are easily repeated, they are dif- measurements is calculated and if it exceeds the threshold ferent from each other, the gravity does not complicate their value, the part of the gesture before that point is excluded. Acceleration 4 Journal of Healthcare Engineering 7 10 0 0 0 5101520 0 5101520 Order number Order number hh hh hu hu hh2 hh2 (a) (b) 0 5101520 Order number hh hu hh2 (c) Figure 3: e FFT method: the €nal time series of 3 gestures performed by the same user 5 times each. (a) x-axis, (b) y-axis, and (c) z-axis. A similar procedure is applied to exclude a part of the time e distance between two €nal time series was computed as the mean Euclidean distance of their points. Figure 4 il- series near the end of it. In our work, we examined three alternative techniques for gesture recognition, which are lustrates the preprocessing procedures implemented by the discussed in the following subsections. FFT method and the other two employed methods. 3.2. e Fast Fourier Transformation (FFT) Method. FFT 3.3. e Geometric Method. In the geometric method, coe’cients of motion signals can be used for gesture rec- captured motion and data were transformed to new time ognition according to the literature [32]. In the proposed series by replacing a sample by the sum of its value and all the method, a simple low-pass €lter is initially applied as in [32]. previous values. As previously, in order to maintain for all e €lter is recursively de€ned as following: the time series the same length, data were resampled using linear interpolation. As a next step, the series was scaled to €t s a · x +(1 − a) · s , (1) t t−1 t−1 in a normalized cube of edge of 100 units as in [26], which where x is the t order point of the time series and s the means that after the scaling the maximum value minus the t t computed value at the t order point. For the constant α the minimum value of samples at each axis was equal to 100. value 0.3 was selected with trial and error during evaluation. Figure 5 illustrates the €nal time series of 3 gestures per- In order that all-time series have the same length, data were formed by the same user 5 times each. e distance between resampled to 512 points using linear interpolation. We set two €nal time series was computed as the mean Euclidean the number of points to 512 so that it is a power of 2, as distance of their points. needed by the employed FFT implementation, and so that the original time series were not subsampled (given that these, in our datasets, had up to approximately 450 points). 3.4. e SAX Method. SAX (Symbolic Aggregate approXi- FFT was applied to the new time series, and the 21 €rst mation) [33–35] transforms a time series of arbitrary length coe’cients for each axis were kept. e latter value was n to a string of arbitrary length w using an alphabet of chosen according to the €ndings of [32] and our preliminary arbitrary size a. e time series is initially normalized to experiments. Figure 3 illustrates the €nal time series of have a mean of zero and a standard deviation of one. 3 gestures performed by the same user 5 times each. Piecewise aggregate approximation is used to reduce the Amplitude Amplitude Amplitude Journal of Healthcare Engineering 5 Time series of accelerometer measurements Remove tap effect FFT Geometric SAX New time series produced by Normalize to mean of zero and Apply a low-pass filter summation standard deviation of one Resample to a selected Resample to a selected Apply PAA to produce a new number of samples number of samples time series of a selected length Apply FFT and keep the Scale to fit a cube of Discretize using a selected first N coefficients predefined size alphabet Figure 4: e preprocessing steps of the 3 alternative proposed methods. 150 120 –50 –100 –20 0 50 100 150 200 250 0 50 100 150 200 250 Sample Sample hh hh hu hu hu2 hu2 (a) (b) –20 –40 –60 –80 –100 –120 0 50 100 150 200 250 Sample hh hu hu2 (c) Figure 5: e geometric method: the €nal time series of 3 gestures performed by the same user 5 times each. dimensionality from n to w. e “breakpoints” to be used for belong to a division with the same symbol of the alphabet. the discretization of the transformed time series are selected e distance between two symbols is de€ned to be zero if so that they divide the Gaussian distribution to equal they dier at most one step. If they dier more steps, the probability areas. e time series is discretized to the selected distance is computed as the dierence between the lower alphabet by dividing the value space according to the de- limit of the largest symbol minus the upper limit of the termined “breakpoints” and replacing all the values that smallest symbol. Value Value Value 6 Journal of Healthcare Engineering 6 6 4 4 2 2 0 20 40 60 80 0 20 40 60 80 Sample Sample (a) (b) Figure 6: e SAX method: the €nal strings of 2 gestures, the hh gesture (a) and the hu gesture (b), performed by the same user 5 times each. Android 4. The Proposed Gesture Recognition System e implemented system consists of two companion ap- Pebble plications. e €rst application runs on the smartwatch (Pebble) and is responsible for capturing the accelerometer measurements and sending them to the Android applica- tion. e second application runs on the companion An- droid device. Android application provides the interface to the user, receives the motion data from the smartwatch, updates the database (template library) with the training gestures, and runs the recognition methods. Pebble and the Application Android device communicate with each other using Blue- Application tooth. Figure 7 illustrates the basic system architecture and Accelerometer the resources used. Figure 8 illustrates the communication Screen Database between the Pebble and the Android modules. Bluetooth Touch screen Bluetooth 4.1. e Pebble Application. A signi€cant design choice for the smartwatch application is the initiation. In our system, Figure 7: System architecture. the tap event was selected to delimit the gesture so that the user can start and stop the gesture in a spontaneous and natural way. An abrupt movement of the wrist is perceived In order to apply the SAX method, we initially com- as tap event by Pebble so the user can start and stop the bined the data of the three axes in one time series, €rst all gesture using only the hand the smartwatch is worn on. the x-axis data followed by all the y-axis data and the z-axis Another important aspect is energy consumption. In order data. We then normalized each time series to have a mean to save energy, the Pebble application is not constantly in of zero and a standard deviation of one, thus jointly operation but it is activated by the Android application. For normalizing the x, y, and z-axis data. e parameters of the the same reason accelerometer data are sent to Pebble ap- SAX method we used were w 96 and a 7, which means plication in batches. As soon as the Pebble application is we selected to represent a gesture with a string of 96 activated, a message is displayed on its screen to inform the symbols (32 for each axis) using an alphabet of 7 symbols. user for the Android application operation mode (training e parameters were chosen so as to preserve the char- or recognizing). e Pebble application is waiting for a tap acteristics of the time series of the selected gestures. We event after which it repeatedly sends accelerometer data to transformed the normalized time series into the piecewise the Android application and receives the corresponding aggregate approximation (PAA) representation and after acknowledgment. A new tap event signi€es the gesture end. that we symbolized the PAA representation into a discrete string. Figure 6(a) illustrates one gesture performed by the same user 5 times, and Figure 6(b) illustrates another 4.2. e Android Application. e Android application gesture. e distance between two strings was computed as features a menu with commands where the user can manage the mean distance of their symbols according to the SAX the gesture library. e user can see how many templates are method. stored in the library and manage them (i.e., delete a template e complexity of the preprocessing algorithm is O(n), or clear the library). Additional commands refer to training where n is the length of the time series of the gesture (i.e., set the application in training mode) and recognizing measurements. e complexity of the recognition algorithm (i.e., set the application in recognizing mode and let the user is O(n), where n is the parameter w of the SAX method. to key in the gesture name). e user can start the Pebble Symbol Symbol Training Clear Journal of Healthcare Engineering 7 Pebble smart watch Android application Accelerometer Start/stop Gesture gesture Tap event preprocessing AppMessage service Collect data of Learn Recognize one gesture Data Dictionary batches Update Gesture library recognition Figure 8: e communication between the Pebble and the Android modules. Start Α Menu Gesture Measurements Library Training Recognizing End from Pebble Input gesture Set mode to Gesture List library name recognizing processing Recognizing Compare to Set mode to No templates training Report Α results Update Clear library Α library Figure 9: e Android application structure. application, using a button, in order to receive accelerometer the previous channel. Additional commands may be cor- measurements. In recognizing mode, after the end of the related for controlling with gestures wheelchairs [36] and gesture, the application answers with the best match ob- robotic platforms intended for assisted living. tained by each recognition method. Figure 9 illustrates the Android application structure. 5.2. Evaluation. In order to evaluate the application and the utilized methods, we asked four persons to perform the 5. The System in Practice—Use Case and gestures illustrated in Figure 1. Each set included at least 5 Experimental Results repetitions of the same gesture. A few days later, the same persons performed additional gesture sets. Finally, we col- 5.1. Use Case. Gestures may be considered as a means of lected 8 sets with a total of 304 gestures. Two types of multimodal interaction. An older person may have health evaluation were performed; one involving training by the problems pertaining to his or her €ngers. Using the small same person and a second utilizing training by dierent users, buttons of a remote controller is an issue and an alternative in order to assess the robustness of the proposed methods. way to control the television, the air conditioner, the window blinds, and so on is required. A set of gestures that takes into consideration the mobility decline is the perfect solution. As 5.2.1. User-Dependent Recognition (Personalized Gesture an example, the older person can use “hu” gesture to change Recognition). In this case, the €rst gesture set of user A was the tv program to the next channel and “hh” to change it to used to train the system and the second set of the same user 8 Journal of Healthcare Engineering Table 2: Results of personalized gesture recognition. Geometric FFT SAX Library-recognized Correct False % Correct False % Correct False % A1-A2 30 6 83 32 4 89 35 1 97 A2-A1 29 1 97 29 1 97 30 0 100 B1-B2 47 3 94 44 6 88 50 0 100 B2-B1 29 1 97 29 1 97 30 0 100 C1-C2 33 3 92 36 0 100 34 2 94 C2-C1 29 6 83 34 1 97 35 0 100 D1-D2 50 0 100 49 1 98 50 0 100 D2-D1 37 0 100 36 1 97 37 0 100 Totals 284 20 93 289 15 95 301 3 99 Table 3: Geometric method: confusion matrix. Table 6: User independent recognition. Geometric hh hu hud ud hh2 hu2 Method Correct False % Min % hh 49 1 Geometric 636 123 84 69 hu 43 4 3 FFT 542 217 71 42 hud 1 46 5 1 SAX 732 27 96 86 ud 2 42 hh2 52 1 hu2 1 1 46 5.2.2.UserIndependentRecognition. In order to evaluate the system in the case of the user independent recognition the Table 4: FFT method: confusion matrix. following process was applied: the first set of user A was used to train the system. All the gestures of the other users (B, C, FFT hh hu hud ud hh2 hu2 and D) were used as the gestures to be recognized. 'e same hh 47 1 6 process was repeated three times using the first set for the hu 43 1 other users (B, C, and D) as the training set. 'e results are hud 2 52 3 summarized in Table 6. 'e geometric method produced ud 44 hh2 2 44 correct results in the 84% of the occasions in average with the hu2 51 lowest success rate to be 69%. 'e FFT method produced correct answer in the 71% of the occasions in average with the lowest success rate to be 42%. 'e SAX method produced correct answer in the 95% of the experiments in average with Table 5: SAX method: confusion matrix. the lowest success rate to be 86%. SAX outperforms in the SAX hh hu hud ud hh2 hu2 user independent recognition case as well achieving quite satisfactory accuracy. 'us, it can be assumed that the SAX hh 49 hu 43 method can be utilized in user independent recognition in hud 2 52 real applications. ud 1 48 hh2 52 hu2 51 5.3. Battery Consumption. 'e gesture duration or equiva- lently the time series length varies considerably depending on gesture type and user. 'is has the effect that Pebble battery consumption caused by the collection and trans- for testing. 'is process was repeated using the second set to mission of the motion data will be different depending on train the system and the first for testing. 'e above- gesture type and user. Pebble reports battery drop in steps of mentioned process was repeated for all users, and the av- 10%. In order to estimate how the use of the gesture rec- erage values are reported in Table 2 for the three recognition ognition application affects the Pebble autonomy we asked methods. More specifically, the geometric method produced two of the users to perform repeatedly the gestures of correct results in the 93% of the experiments with the success Figure 1 until the battery indicator dropped by 10%. A drop rate ranging from 83% to 100%. 'e FFT method produced from 100% to 80% battery resulted after 780 gestures. Taking correct answer in the 95% of the tests with the success rate into consideration that the Pebble battery duration for ranging from 88% to 100%, while the SAX method out- normal use is 7 days, the expected battery life D, measured in performs producing correct response in the 99% of the days, is given by the expression (2). In this expression, w occasions with the success rate ranging from 94% to 100%. stands for the average fraction of the battery consumed per Tables 3–5 present the confusion matrices of the methods. gesture and N stands for the number of gestures performed On the first row, there are the names of the gestures the user per day. performed and on the first column the system response. Journal of Healthcare Engineering 9 7 and adapted. In this context, we modi€ed the SUS state- ments to clearly describe the system we evaluate. More speci€cally, SUS is a Likert scale. It is composed of 6.8 ten statements, €ve positive statements and €ve negative statements which alternate. e respondent indicates the degree of disagreement or agreement with the statement on a 5-point scale varying from 1, strongly disagree, to 5, 6.6 strongly agree. For statements 1, 3, 5, 7, and 9 that are positively worded statements the score is calculated by the 6.4 scale position minus 1. For statements 2, 4, 6, 8, and 10 that are negatively worded the score is 5 minus the scale position. Each statement’s score ranges from 0 to 4. e sum of the 6.2 scores of the ten statements is multiplied by 2.5 to produce the overall value of SUS on the scale 0 to 100. e level of education and the previous exposure to technology as proposed by Holzinger et al. [41] were not included in the 10 20 30 40 50 60 70 80 questionnaire, since the majority of the participants share Gestures per day the same characteristics. However, we intend to include Figure 10: Estimated battery life. Horizontal axis is the number of them in a future more thorough evaluation of the proposed detected gestures per day. Vertical axis shows the eect on battery application. life. e results of the evaluation are depicted in Table 7, which presents the statements used and the mean values of the collected answers and scores. e total result of 78/100 is D . (2) quite encouraging and it demonstrates the potential of the (1/7)+ w · N proposed system. In the participants’ responses, no gender- related dierences were observed. ese two parameters e estimated battery duration is illustrated in Figure 10. must also be investigated for the proposed system. e autonomy remains above 6 days even under the high use of 80 gestures per day. 6. Conclusions 5.4. Usability Assessment. e challenges that elderly users experience with user interfaces are related to physical issues, In this work, we examined the feasibility of a gestured technology experience, and cognitive issues [37]. For ex- recognition system based on a smartwatch and a connected ample, deteriorating physical condition results in impaired smartphone. e motion measurements obtained by the eyesight, haptic deterioration, or reduced hearing. Designing accelerometer of the smartwatch during a gesture were for the elderly, we have to take into consideration the speci€c captured and relayed to the smartphone. ree alternative challenges they face. Although the main focus of this work is recognition methods were implemented. e SAX method on technical innovation and not on usability assessment, we outperformed the rest and produced high accurate results evaluated the implemented system from this perspective as even in the case of user independent training. well. ere are many assessment techniques to evaluate Although the gestures used to evaluate our methods were Ambient Assisted Living (AAL) technologies. Salvi et al. [38] chosen due to their simplicity, the application we developed propose a methodological framework for evaluating AAL can be utilized with dierent gesture sets. e critical solutions. ey present both subjective and objective eval- condition for a gesture set to be suitable is that the gestures uation techniques. Colomer et al. [39] present their expe- must be quite dierent from each other in accordance with rience in evaluating AAL solutions in the Smart House the combination of the orientation of the watch and the Living Lab. ey describe the evaluation methodologies used direction of its movement. Gestures following similar trace in their projects and propose a set of guidelines to conduct with dierent orientation of the watch are considered to be evaluations. Queiros ´ et al. [40] present a systematic literature dierent gestures. In addition the accelerometer measure- review of AAL technologies, products, and services re- ments contain gravity, and a suitable set of gestures must garding usability and accessibility. Finally, Holzinger et al. exploit the contribution of the gravity to the diversi€cation [41] concluded in their research that the level of education of the gestures. e reported results are quite encouraging and the previous exposure to technology are relevant to the and prove the e’ciency of a plain smartwatch to detect acceptability of new technology by the elderly. simple gestures. In this work, in order to investigate the acceptance and ere are several interesting issues related to the im- usability of the proposed system, we collected the opinion of provement of the result reliability to achieve the goal of 20 users, 10 males and 10 females, of age 63 to 72. We used absolutely reliable recognition. If the gesture set is known in questionnaires based on a modi€ed System Usability Scale advance, the recognition method can be adapted to make use (SUS) [42]. SUS is a mature questionnaire, it is considered to of those characteristics of the gestures that dierentiate from be very robust and �exible, and it has been extensively used each other. Battery duration (days) 10 Journal of Healthcare Engineering Table 7: Questionnaires’ results. Modified SUS statements Answer Score I think that I would like to use a smartwatch to 1. 3,7 2,7 control devices frequently I found the system of the smartwatch and the 2. 1,6 3,4 smartphone to control devices unnecessarily complex I thought that performing gestures to control devices 3. 3,8 2,8 was easy I think that I would need the support of a technical 4. 1,3 3,7 person to be able to use this system I found the various functions in this system (i.e., the 5. training with user-selected gestures and the 4,1 3,1 recognition of gestures) were well integrated I thought there was too much inconsistency in this 6. 1,8 3,2 system and I expect it to fail I would imagine that most people would learn to use 7. 4,1 3,1 such a system very quickly I found that the system of performing gestures to 8. 1,7 3,3 control devices was very cumbersome to use I feel very confident that I will be able to use the 9. 3,9 2,9 system I think I need to learn a lot of things before I can use 10. 2,0 3,0 this system Total score — 31,2 SUS score — 78,0 An extension of the training process could be the choice References of the most successful gesture recognition method between [1] S. Mitra and T. Acharya, “Gesture recognition: a survey,” those implemented by the application. During an initial IEEE Transactions on Systems, Man, and Cybernetics, training phase, more than one recognition method will be PartC(ApplicationsandReviews), vol. 37, no. 3, pp. 311–324, used, and the user will confirm or reject the application answers. Depending on the correct results, the most suc- [2] J. Hoysniemi, ¨ P. Ham ¨ al¨ ainen, ¨ L. Turkki, and T. Rouvi, cessful method will be selected by the application for future “Children’s intuitive gestures in vision-based action games,” use. Communications of the ACM, vol. 48, no. 1, pp. 44–50, 2005. 'e way a user makes a gesture may change over the [3] R. Fernandez ´ and C. von Lucken, ¨ “Using the Kinect sensor time. An open issue is whether a dynamic update of the with open source tools for the development of educational library could be applied. games for kids in pre-school age,” in Proceedings of the 2015 Latin American Computing Conference (CLEI), pp. 1–12, 'e development of a universal vocabulary, which is an IEEE, Arequipa, Peru, October 2015. intermediate level between the gesture recognition appli- [4] X. Zhang, X. Chen, Y. Li, V. Lantz, K. Wang, and J. Yang, “A cation and the device control, is also of interest. Some framework for hand gesture recognition based on acceler- commands like increasing, reducing, continuing, stop, redo, ometer and EMG sensors,” IEEE Transactions on Systems, and so on could compose a standard vocabulary for handling Man, and Cybernetics-Part A: Systems and Humans, vol. 41, everyday devices. A user could have his own set of gestures to no. 6, pp. 1064–1076, 2011. implement the commands that are included in the [5] L. T. Phi, H. D. Nguyen, T. Q. Bui, and T. T. Vu, “A glove- vocabulary. based gesture recognition system for Vietnamese sign lan- In a future work, we intend to incorporate the above- guage,” in Proceedings of the 2015 15th International mentioned ideas to create an application to facilitate people Conference on Control, Automation and Systems (ICCAS), with disabilities in daily life activities. pp.1555–1559, IEEE, Busan, Republic of Korea, October 2015. [6] I. Maglogiannis, C. Ioannou, and P. Tsanakas, “Fall detection and activity identification using wearable and hand-held Data Availability devices,” Integrated Computer-Aided Engineering, vol. 23, no. 2, pp. 161–172, 2016. 'e dataset used in this research is available to the research [7] X. Dang, B. Kang, X. Liu, and G. Cui, “An interactive care community upon e-mail request to the corresponding system based on a depth image and eeg for aged patients with author. dementia,” Journal of Healthcare Engineering, vol. 2017, Ar- ticle ID 4128183, 8 pages, 2017. Conflicts of Interest [8] J. M. Susskind, G. Littlewort, M. S. Bartlett, J. Movellan, and A. K. Anderson, “Human and computer recognition of facial 'e authors declare that there are no conflicts of interest expressions of emotion,” Neuropsychologia, vol. 45, no. 1, regarding the publication of this paper. pp. 152–162, 2007. Journal of Healthcare Engineering 11 [9] A. Clark and D. Moodley, “A system for a hand gesture- 2015 IEEE International Conference on Pervasive Computing manipulated virtual reality environment,” in Proceedings of and Communication Workshops (PerCom Workshops), the Annual Conference of the South African Institute of pp. 591–596, IEEE, Seattle, WA, USA, March 2015. [23] E. Garcia-Ceja, R. F. Brena, J. C. Carrasco-Jimenez, and Computer Scientists and Information Technologists, p. 10, ACM, Toronto, ON, Canada, September 2016. L. Garrido, “Long-term activity recognition from wristwatch [10] K. Y. Lin, S. Yong, S. P. Wang, C. T. Lai, and H. C. Wang, accelerometer data,”Sensors, vol.14, no.12, pp. 22500–22524, “HandVis: visualized gesture support for remote cross-lingual 2014. communication,” in Proceedings of the 2016 CHI Conference [24] J. O. Wobbrock, A. D. Wilson, and Y. Li, “Gestures without Extended Abstracts on Human Factors in Computing Systems, libraries, toolkits or training: a $1 recognizer for user interface pp. 1236–1242, ACM, San Jose, CA, USA, May 2016. prototypes,” in Proceedings of the 20th Annual ACM Sym- [11] B. Mourad, A. Tarik, A. Karim, and E. Pascal, “System in- posium on User Interface Software and Technology, pp. 159– teractive cyber presence for E-learning to break down learner 168, ACM, Newport, RI, USA, October 2007. isolation,” International Journal of Computer Applications, [25] Y. Li, “Protractor: a fast and accurate gesture recognizer,” in vol. 111, no. 16, pp. 35–40, 2015. Proceedings of the SIGCHI Conference on Human Factors in [12] R. O. Mbouna, S. G. Kong, and M. G. Chun, “Visual analysis Computing Systems, pp. 2169–2172, ACM, Atlanta, GA, USA, of eye state and head pose for driver alertness monitoring,” April 2010. [26] S. Kratz and M. Rohs, “A $3 gesture recognizer: simple gesture IEEE transactions on Intelligent Transportation Systems, vol. 14, no. 3, pp. 1462–1469, 2013. recognition for devices equipped with 3D acceleration sen- [13] L. Baraldi, F. Paci, G. Serra, L. Benini, and R. Cucchiara, sors,” in Proceedings of the 15th International Conference on “Gesture recognition using wearable vision sensors to en- Intelligent User Interfaces, pp. 341–344, ACM, Miami, FL, hance visitors’ museum experiences,” IEEE Sensors Journal, USA, February 2010. vol. 15, no. 5, pp. 2705–2714, 2015. [27] S. Kratz and M. Rohs, “Protractor3D: a closed-form solution [14] D. Kim, O. Hilliges, S. Izadi et al., “Digits: freehand 3D in- to rotation-invariant 3D gestures,” in Proceedings of the 16th teractions anywhere using a wrist-worn gloveless sensor,” in International Conference on Intelligent User Interfaces, Proceedings of the 25th Annual ACM Symposium on User pp. 371–374, ACM, New York, NY, USA, February 2011. Interface Software and Technology, pp. 167–176, ACM, New [28] L. Anthony and J. O. Wobbrock, “A lightweight multistroke York, NY, USA, October 2012. recognizer for user interface prototypes,” in Proceedings of [15] C. Rishikanth, H. Sekar, G. Rajagopal, R. Rajesh, and Graphics Interface 2010, pp. 245–252, Canadian Information V. Vijayaraghavan, “Low-cost intelligent gesture recognition Processing Society, Ottawa, ON, Canada, May 2010. [29] L. Anthony and J. O. Wobbrock, “$ N-protractor: a fast and engine for audio-vocally impaired individuals,” inProceedings of the 2014 IEEE Global Humanitarian Technology Conference accurate multistroke recognizer,” in Proceedings of Graphics (GHTC), pp. 628–634, IEEE, San Jose, CA, USA, October Interface2012, pp.117–120, Canadian Information Processing 2014. Society, Ottawa, ON, Canada, May 2012. [16] L. Dipietro, A. M. Sabatini, and P. Dario, “A survey of glove- [30] R. D. Vatavu, L. Anthony, and J. O. Wobbrock, “Gestures as based systems and their applications,” IEEE Transactions on point clouds: a $ P recognizer for user interface prototypes,” in Systems, Man, and Cybernetics, Part C (Applications and Proceedings of the 14th ACM International Conference on Reviews), vol. 38, no. 4, pp. 461–482, 2008. Multimodal Interaction, pp. 273–280, ACM, October 2012. [17] M. Wilhelm, D. Krakowczyk, F. Trollmann, and S. Albayrak, [31] J. Liu, L. Zhong, J. Wickramasuriya, and V. Vasudevan, “eRing: multiple finger gesture recognition with one ring “uWave: accelerometer-based personalized gesture recogni- using an electric field,” in Proceedings of the 2nd International tion and its applications,” Pervasive and Mobile Computing, vol. 5, no. 6, pp. 657–675, 2009. Workshop on Sensor-ased Activity Recognition and In- teraction, p. 7, ACM, Rostock, Germany, June 2015. [32] M. Xie and D. Pan, Accelerometer Gesture Recognition, 2014. [18] L. Jing, Z. Cheng, Y. Zhou, J. Wang, and T. Huang, “Magic [33] J. Lin, E. Keogh, S. Lonardi, and B. Chiu, “A symbolic rep- ring: a self-contained gesture input device on finger,” in resentation of time series, with implications for streaming Proceedings of the 12th International Conference on Mobile algorithms,” in Proceedings of the 8th ACM SIGMOD and Ubiquitous Multimedia, p. 39, ACM, Lulea,˚ Sweden, Workshop on Research Issues in Data Mining and Knowledge December 2013. Discovery, pp. 2–11, ACM, San Diego, CA, USA, June 2003. [19] W. Park and S. H. Han, “An analytical approach to creating [34] J. Lin, E. Keogh, L. Wei, and S. Lonardi, “Experiencing SAX: multitouch gesture vocabularies in mobile devices: a case a novel symbolic representation of time series,” Data Mining study for mobile web browsing gestures,” International and Knowledge Discovery, vol. 15, no. 2, pp. 107–144, 2007. Journal of Human-Computer Interaction, vol. 30, no. 2, [35] A. Onishi and C. Watanabe, “Event detection using archived pp. 126–141, 2014. smart house sensor data obtained using symbolic aggregate [20] Y. Bernaerts, M. Druwe,´ S. Steensels, J. Vermeulen, and approximation,” in Proceedings of the 2011 International ¨ Conference on Parallel and Distributed Processing Techniques J. Schoning, “'e office smartwatch: development and design of a smartwatch app to digitally augment interactions in an and Applications, Las Vegas, NV, USA, July 2011. office environment,” in Proceedings of the 2014 Companion [36] Y. Rabhi, M. Mrabet, and F. Fnaiech, “Intelligent control PublicationonDesigningInteractiveSystems, pp. 41–44, ACM, wheelchair using a new visual joystick,” Journal of Healthcare Vancouver, BC, Canada, June 2014. Engineering, vol. 2018, Article ID 6083565, 20 pages, 2018. [21] S. Zhang, M. H. Ang, W. Xiao, and C. K. 'am, “Detection of [37] C. Dodd, R. Athauda, and M. T. Adam, “Designing user activities by wireless sensors for daily life surveillance: eating interfaces for the elderly: a systematic literature review,” in and drinking,” Sensors, vol. 9, no. 3, pp. 1499–1517, 2009. Proceedings of the Australasian Conference on Information [22] M. Shoaib, S. Bosch, H. Scholten, P. J. Havinga, and Systems, Hobart, Australia, December 2017. O. D. Incel, “Towards detection of bad habits by fusing [38] D. Salvi, J. B. Montalva Colomer, M. T. Arredondo, B. Prazak- smartphone and smartwatch sensors,” in Proceedings of the Aram, and C Mayer, “A framework for evaluating Ambient 12 Journal of Healthcare Engineering Assisted Living technologies and the experience of the uni- versAAL project,” Journal of Ambient Intelligence and Smart Environments, vol. 7, no. 3, pp. 329–352, 2015. [39] J. B. M. Colomer, D. Salvi, M. F. Cabrera-Umpierrez et al., “Experience in evaluating AAL solutions in living labs,” Sensors, vol. 14, no. 4, pp. 7277–7311, 2014. [40] A. Queiros, ´ A. Silva, J. Alvarelhão, N. P. Rocha, and A Teixeira, “Usability, accessibility and ambient-assisted living: a systematic literature review,” Universal Access in the Information Society, vol. 14, no. 1, pp. 57–66, 2013. [41] A. Holzinger, G. Searle, and M. Wernbacher, “'e effect of previous exposure to technology on acceptance and its im- portance in usability and accessibility engineering,” Universal Access in the Information Society, vol. 10, no. 3, pp. 245–260, [42] J. Brooke, “SUS-A quick and dirty usability scale,” Usability Evaluation in Industry, vol. 189, no. 194, pp. 4–7, 1996. International Journal of Advances in Rotating Machinery Multimedia Journal of The Scientific Journal of Engineering World Journal Sensors Hindawi Hindawi Publishing Corporation Hindawi Hindawi Hindawi Hindawi www.hindawi.com Volume 2018 http://www www.hindawi.com .hindawi.com V Volume 2018 olume 2013 www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 Journal of Control Science and Engineering Advances in Civil Engineering Hindawi Hindawi www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 Submit your manuscripts at www.hindawi.com Journal of Journal of Electrical and Computer Robotics Engineering Hindawi Hindawi www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 VLSI Design Advances in OptoElectronics International Journal of Modelling & Aerospace International Journal of Simulation Navigation and in Engineering Engineering Observation Hindawi Hindawi Hindawi Hindawi Volume 2018 Volume 2018 Hindawi www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 www.hindawi.com www.hindawi.com www.hindawi.com Volume 2018 International Journal of Active and Passive International Journal of Antennas and Advances in Chemical Engineering Propagation Electronic Components Shock and Vibration Acoustics and Vibration Hindawi Hindawi Hindawi Hindawi Hindawi www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 www.hindawi.com Volume 2018 www.hindawi.com Volume 2018
Journal of Healthcare Engineering – Hindawi Publishing Corporation
Published: Jul 19, 2018
You can share this free article with as many people as you like with the url below! We hope you enjoy this feature!
Read and print from thousands of top scholarly journals.
Already have an account? Log in
Bookmark this article. You can see your Bookmarks on your DeepDyve Library.
To save an article, log in first, or sign up for a DeepDyve account if you don’t already have one.
Copy and paste the desired citation format or use the link below to download a file formatted for EndNote
Access the full text.
Sign up today, get DeepDyve free for 14 days.
All DeepDyve websites use cookies to improve your online experience. They were placed on your computer when you launched this website. You can change your cookie settings through your browser.