An Intelligent Gesture Classification Model for Domestic Wheelchair Navigation with Gesture Variance Compensation

Hindawi Applied Bionics and Biomechanics, Volume 2020, Article ID 9160528, 11 pages. https://doi.org/10.1155/2020/9160528

Research Article

H. M. Ravindu T. Bandara (1), K. S. Priyanayana (1), A. G. Buddhika P. Jayasekara (1), D. P. Chandima (1), and R. A. R. C. Gopura (2)

(1) Intelligent Service Robotic Group, Department of Electrical Engineering, University of Moratuwa, Moratuwa 10400, Sri Lanka
(2) Bionics Laboratory, Department of Mechanical Engineering, University of Moratuwa, Moratuwa 10400, Sri Lanka

Correspondence should be addressed to H. M. Ravindu T. Bandara; ravitharaka11@gmail.com

Received 28 August 2019; Revised 2 December 2019; Accepted 4 January 2020; Published 30 January 2020

Academic Editor: Andrea Cereatti

Copyright © 2020 H. M. Ravindu T. Bandara et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The elderly and disabled population is rapidly increasing, so it is important to uplift their living standards by improving their confidence in daily activities. Navigation is an important task with which most elderly and disabled people need assistance. Replacing human assistance with an intelligent system capable of assisting human navigation via a wheelchair system is an effective solution. Hand gestures are often used in navigation systems; however, those systems do not possess the capability to accurately identify gesture variances. Therefore, this paper proposes a method to create an intelligent gesture classification system, built on a gesture model derived from human studies, that covers every essential motion in domestic navigation and compensates for hand gesture variance. Experiments have been carried out to evaluate users' ability to remember and recall the gestures and their adaptability to the gesture model. A Dynamic Gesture Identification Module (DGIM), a Static Gesture Identification Module (SGIM), and a Gesture Clarifier (GC) are introduced in order to identify gesture commands. The proposed system was analyzed for accuracy and precision using the results of experiments conducted with human users. The accuracy of the intelligent system was determined with confusion matrices, and the results were further analyzed using Cohen's kappa analysis, in which the overall accuracy, misclassification rate, precision, and Cohen's kappa values were calculated.

1. Introduction

Assistive technology for elderly and disabled people is an expeditiously growing field [1, 2], and much research is focused on uplifting living standards. A common issue for most elderly and disabled persons is navigation. Since it is hard for them to move around, they need assistance from another person or a machine. However, even such assistance is not sufficient when there are communication problems [3]. It is hard to navigate in a domestic environment when it is difficult to communicate accurately. Disabled and elderly people are increasingly observed with speech disorders such as apraxia of speech, stuttering, and dysarthria; hence, vocal interaction becomes difficult for this community. Moreover, the inability to navigate in a domestic environment without help creates common issues like anxiety, anger, and depression, which lead to poor health conditions [4]. Therefore, assistive technologies should be upgraded in a more intelligent manner to make human life more comfortable and healthier [5].

Humans prefer to use multiple modalities such as voice and gestures to interact with each other in a domestic environment [6–8]. Gestures include hand gestures, facial expressions, and cues that are difficult to understand even for human beings. Furthermore, vocal and gestural expressions can be integrated to create navigation commands [9]. As an example, a person in a wheelchair might say "go there" while integrating a hand gesture into the sentence by pointing in the direction he wants to move [10]. A vocal command may contain uncertain distance- and direction-expressing terms like "near," "far," "middle," "left," and "right," or place-expressing terms like "here" and "there" [11]. Interpreting uncertain terms is a difficult task for a robot. Moreover, such interpretation depends on various factors such as user experience, eyesight, environment, and cognitive feedback from the environment [12–14]. Consideration of those factors makes vocal command interpretation extremely difficult. However, an intelligent system with the capability to understand such phrases is unlikely to be used by humans, because a small error or misinterpretation made by the system can do critical damage to a disabled or elderly human being.
Most elderly people show speech difficulties, which makes it hard to clarify their voice commands using speech recognition systems. Moreover, voice commands include various types of uncertainty, related to time, frequency, distance, and direction, which make them hard to understand. As an example, "go there" and "come here" have position uncertainty, and commands like "wait here" and "give me a minute" have time-related uncertainty. Gestures are widely used in navigation and also have variances [15]. However, compared with vocal commands, gestures give more detailed instructions, which lead to more accurate decisions by an intelligent system. If an intelligent system were capable of interpreting gestures into navigation commands, and a person could give navigation commands using hand gestures for all the essential navigation tasks, this would be a simpler and more efficient method for systems like intelligent wheelchairs [16]. Such a system should possess the capability to interpret hand gestures while understanding variances, acting as a gesture alphabet for navigation. Moreover, those gestures should be easily remembered and able to express every essential task that may be required by a disabled or elderly person in a wheelchair. Furthermore, misunderstanding unintended hand movements can create critical situations; therefore, safety precautions should also be considered [17].

A method has been proposed in [18] to recognize dynamic hand gestures of the human hand using an RGB depth camera. The system is capable of automatically recognizing hand gestures against a complicated background. However, it can identify only a limited set of navigation commands, and it is designed to control a mobile robot using hand gestures; therefore, the robot can perform only basic motions. A real-time hands-free immersive image navigation system that can respond to various gestures and voice commands has been proposed in [19]. The system can identify a wide range of hand and finger gestures and voice commands using Kinect and leap motion sensors. However, it is specifically designed for image navigation and does not possess any motion navigation understanding capability.

An intelligent wheelchair with a hand gesture recognition facility is developed in [20].
The wheelchair can be controlled through basic hand gestures such as FORWARD, BACKWARD, and RIGHT/LEFT. However, this wheelchair is not capable of recognizing more complex static and dynamic gestures, and recognition of tasks is not in real time. Another method has been proposed in [21] to recognize dynamic hand gestures using a leap motion sensor. This system can recognize simple dynamic gestures, such as swiping, tapping, and drawing a circle, to authenticate logins. However, it cannot recognize the complex dynamic gestures used in a domestic navigation task. A dynamic and static gesture recognition method has been proposed in [22] for an assistive robot. This system can recognize simple dynamic gestures such as waving and nodding, while simple pointing gestures can be identified for locating places. However, this system cannot recognize dynamic hand motion commands and static commands used for navigation. Its other weaknesses are the lack of flexibility in using separate fingers, even though most dynamic and static gestures use separate fingers, and the lack of real-time gesture recognition.

The hand recognition system proposed in [23] uses a Kinect sensor to obtain a depth map and a color map. The use of the depth map together with the color map increases the robustness of the gesture recognition, and the Finger-Earth Mover's Distance method is used to remove input variation and color distortion. As this method only considers the distance between fingers, movement of fingers against each other will not be detected. These types of finger tremors cause gesture variances that will not be recognized in this setup. The purpose of this article is to develop a simple yet unique gesture system to help navigation in domestic environments while compensating for the above-mentioned gesture variances.
The method proposed in [24] uses depth images to identify real-time dynamic hand gestures through a Hidden Markov Model (HMM). Dynamic hand gesture variances involving hand orientations, speed, and style are considered in this system. However, minuscule variances such as finger orientations, finger bone orientations, and finger speeds are not considered. Another method uses an HMM to model space-time hand movement patterns in 3D space [25]. It considers hand movement and palm orientation in 3D space to compensate for hand gesture variances or tremors; however, it fails to identify finger movements against the palm orientation, which are usually seen among the elderly. Many hand gesture recognition systems have been developed to recognize most static and dynamic hand gestures, but very few have tried to compensate for involuntary hand gesture variance. The systems introduced in [26, 27] have tried to define more features in order to minimize all static and dynamic variances or tremors. To avoid overfitting and redundancy, they use two-level classifier fusion to filter out unnecessary features. Even with about 44 features, individual classifiers, and two-level fusion, the system in [26] fails to compensate for finger tremors. Since [26, 27] do not consider finger angles against the palm orientation or bone angles, fusing those features into their methods becomes tediously difficult. The system developed in [28] introduces a gesture vocabulary to operate a mobile phone, as opposed to the system proposed in this article. However, that system considers both large-scale hand gestures and small-scale gestures in which minuscule gesture variances matter. A Bayesian linear classifier is used for small-scale gestures, while HMMs are used for large-scale gestures. However, finger movements, bone angles, and finger orientations are not among the considered features, and variances in both static and dynamic gestures will not be compensated by using individual classifiers.

In summary, none of the above existing systems was specifically designed as a gesture model for navigation. A few worked as sign language gesture models; however, those gesture vocabularies will not be effective for the purpose of this article. The gesture recognition methods and tools used in the above systems have focused on the accuracy of a gesture. Some systems have considered gesture variances caused by palm tremors, but none of them has considered finger tremors and finger bone angles as possibilities. In this article, we are not only developing a specifically designed gesture vocabulary but also considering all possible variations of the same gesture.
Therefore, this paper presents a novel method to recognize dynamic and static motion-related hand gestures, even with tremors, based on a gesture classification model for wheelchair users with speech disorders. A complete gesture model with essential navigation commands is defined, which can be used to navigate an intelligent wheelchair through a domestic environment. An elaborated feature set is extracted in order to compensate for user variances that occur in gestures; therefore, tremors in elderly people will not confuse the navigation system. The overall functionality of the proposed system is explained in Section 2. The proposed concept for creating a gesture model and the feature extraction process are explained in Section 3. Experimental results are presented and discussed in Section 4. Finally, the conclusion is presented in Section 5.

2. System Overview

The overall functionality of the proposed system is shown in Figure 1. The proposed system is capable of identifying static and dynamic gestures and interpreting those gestures into navigation commands. A Gesture Memory (GM) is built from the gestures identified in a human study, which are capable of expressing every essential navigation task in a domestic environment. Moreover, the user's capability to remember and recall gestures is also evaluated.

Figure 1: System overview. Gesture recognition feeds the Gesture Clarifier, which routes input to the Static and Dynamic Gesture Identification Modules; a Gesture Identifier with its Gesture Memory, the State Identification Module, and the State Controlling Module complete the system.

The purpose of this article is to develop a hand gesture model that helps a wheelchair user navigate in a domestic environment. Therefore, the gestures designed have to cover all possible navigation scenarios. These gestures vary as static, dynamic, palm, and finger gestures. The system should be able to recognize both static and dynamic gestures, compensate for the hand and finger tremors occurring among the elderly, and identify different variations of the same gesture from one user to another.

The gesture recognition module extracts hand skeleton information using a leap motion sensor, and the extracted data is sent to the Gesture Clarifier (GC), which clarifies gestures into static and dynamic gestures based on gesture features. The Static Gesture Identification Module (SGIM) and the Dynamic Gesture Identification Module (DGIM) understand and identify the navigation command related to the observed gesture. The State Identification Module (SIM) works together with the GC, SGIM, DGIM, and the State Controlling Module (SCM) in order to differentiate gestures from unintended hand movements. The SCM infers the user's intention to use the gesture identification system by controlling the most prioritized gesture commands, such as "Turn on" and "Turn off."
3. Gesture Model

3.1. Human Study I: Identification of Navigation Commands. Natural human communication consists of multiple modalities like voice and hand gestures. Therefore, the defined hand gestures should be able to replace all possible motion instructions. In order to identify the commands used by wheelchair users during basic navigation, a human study was conducted. Twenty wheelchair users aged 55 to 70 participated. Participants were asked to guide their wheelchair using hand gestures, voice, or multimodal interaction. Identifying natural navigation commands was the priority; hence, the interaction method was not limited to hand gestures. The location was changed in order to cover all possible navigation scenarios. Participants did not have any prior knowledge of the locations or of previous study results; hence, the accuracy of the results was ensured and repetition of results was avoided. All navigation commands were recorded, the most frequent commands were identified, and a graphical representation of the identified command frequencies is given in Figure 2. The most frequent commands were considered for the proposed gesture system.

Figure 2: Frequency of navigation commands. Go forward 40%, Turn right 15%, Turn left 15%, Turn slightly right 6%, Turn slightly left 6%, Turn around 5%, Stop 5%, Slow down 2%, Others 6%.

3.2. Human Study II: Hand Gesture Identification. A human study was conducted in order to understand the hand gesture features used by wheelchair users for the identified navigational commands. A group of 20 people randomly selected from the same age group (55 to 70) participated. Participants were asked to execute the basic navigation commands identified in human study I using only hand gestures. Data collected in this study were used to build the gesture system elaborated later. A leap motion sensor was used to track the hand gestures, and the raw data collected through it were processed to identify the gesture features. The most predominant hand feature associated with executing each command was recorded; the results are shown in Table 1. The hand features frequently used for each gesture were taken as the basis for feature extraction.

Two main types of hand gestures were identified: static gestures (pointers or poses) and dynamic gestures (hand movements). Static gestures were mainly used for subtle motion commands like Stop, Turn around, and Turn slightly left/right; for vigorous motion commands, participants used hand movements. Another important tendency was that participants liked to use static and dynamic gestures fairly evenly. The commands were also found to be of two types, finger-pointing gestures and palm-opening gestures. The number of fingers used in pointing gestures was unpredictable, with mainly one or two fingers used. Dynamic gestures were mainly used to express movements and directions that the wheelchair needs to execute.

Table 1: An analysis of hand feature frequencies associated with navigation commands.

Navigation command  | Palm orientation | Palm movement | Fingertip movement | Finger bones | Single finger | Multiple fingers
Go forward          | 92% | 32% | 28% |  6% | 44% | 56%
Turn left           | 84% | 75% | 42% | 18% |  8% | 48%
Turn right          | 82% | 75% | 44% | 16% |  7% | 52%
Stop                | 96% | 68% | 64% | 24% | 18% | 74%
Turn around         | 98% | 56% | 86% | 77% | 14% | 84%
Slow down           | 98% | 87% | 54% | 21% | 19% | 72%
Turn slightly left  | 90% | 34% | 97% | 42% | 47% | 52%
Turn slightly right | 89% | 35% | 98% | 44% | 46% | 53%

3.3. Hand Gesture System. The navigation commands of a wheelchair user should cover all possible navigation scenarios. If the user has vocal abilities, the commands will include information covering exact instructions. For an intelligent wheelchair to work through hand gestures alone, the gestures should be simple, clear, and accurate. The proposed gesture system is based on all basic navigation scenarios, and its hand gestures are simple and clear. Of the hand gestures defined, dynamic gestures were used to represent navigational commands. The defined hand gestures are given in Figure 3.

Figure 3: Navigation gestures: (a) Go forward, (b) Stop, (c) Go backward, (d) Hard left, (e) Hard right, (f) Slightly left, (g) Slightly right, (h) Turn around, (i) Slow down, (j) Go faster, (k) Turn off, and (l) Turn on.

The gesture system was built based on the following considerations:

(1) Defined hand gestures should be simple, clear, and accurate.

(2) Gestures should be defined in a way that a user can navigate through a path using a minimum number of gestures.

(3) A user should be able to remember and recall the defined hand gestures. To ensure the user's adjustability to the gesture system, a human study was conducted; details of this study are explained in Sections 4.1 and 4.2.

(4) There should be significant differences among the hand gestures, so that users will not confuse gestures.

(5) A hand gesture system should have both static and dynamic gestures in order to mitigate inaccuracies caused by the leap motion sensor.

3.4. Feature Extractions. Hand gestures accompanying vocal interaction tend to be both voluntary and involuntary. These gestures carry information such as direction and motion. For a wheelchair user with vocal disabilities, hand gestures could be considered the primary modality. Even though there are gesture systems such as American Sign Language, the execution of gestures differs from one elderly person to another. To compensate for this variation, bone angles, as explained below, were used.

The defined gesture system consists of two main forms of gestures: dynamic gestures and static gestures. Static gestures are nonmoving hand poses, which can be modeled through basic hand features. Dynamic gestures are modeled using dynamic hand features like finger movement and hand movement.

Figure 4: Features extracted from the hand skeleton: index fingertip velocity, mean fingertip velocity, index bone angles (B1, B2, B3), middle bone angles (B4, B5, B6), mean bone angles (M1, M2, M3), thumb bone angles (T1, T2), roll angle (P1), pitch angle (P2), yaw angle (P3), and palm velocity (V1).

(1) Palm orientation. Palm orientation was taken based on leap motion coordinates. The pitch, roll, and yaw angles of the palm depict its orientation: the pitch angle is the rotation around the +Y axis, the roll angle is the rotation around the +X axis, and the yaw angle is the rotation around the +Z axis. As illustrated in Figure 4, quaternion angle theory is used to take the yaw, pitch, and roll of the x, y, z vectors relative to a single vector. Euler angles are usually used, as they can express the vectors relative to each other, but they have a limitation that quaternion angles address: the difficulty of interpolating smoothly between two orientations of an object [29].
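The interpolation limitation mentioned above is the standard motivation for quaternions: spherical linear interpolation (slerp) moves between two orientations at constant angular speed, which raw pitch/roll/yaw angles cannot do without artifacts such as gimbal lock. The sketch below is illustrative only, not code from the paper, and assumes orientations are already expressed as unit quaternions:

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1
    for t in [0, 1]; the smooth orientation blending that is hard to get
    from Euler (pitch/roll/yaw) angles [29]."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                  # flip one quaternion to take the short arc
        q1, dot = -q1, -dot
    if dot > 0.9995:               # nearly identical: plain lerp is safe
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)         # angle between the two orientations
    return (np.sin((1.0 - t) * theta) * q0
            + np.sin(t * theta) * q1) / np.sin(theta)
```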
(2) Finger bone angles. The bone angles of the fingers with respect to the metacarpal bone of the hand are extracted; these angles are shown in Figure 5. Hence, even when fingers are in an improper position, gesture recognition is not affected. The angles of the distal (α), proximal (β), and intermediate (γ) bones with respect to the metacarpal bone were calculated using Equation (1). These angles were taken for the index finger and the middle finger; for the thumb, only the distal and proximal bone angles were taken, as the thumb does not have an intermediate bone. These three fingers were considered specifically, since most of the identified navigational gestures were associated with them. As the ring finger and pinky finger form a tightly associated couple, the averages of their distal, proximal, and intermediate angles were taken; the defined navigational gestures have no sole ring or pinky finger features, but it was important to get separate features for the other three fingers, as they were included separately in the hand gestures. Here, the directions of the metacarpal bone, proximal bone, intermediate bone, and distal bone are denoted by p, q, r, and u, respectively:

p = b − a,
q = c − b,
r = d − c,                                                    (1)
u = e − d,
α = cos⁻¹( (u · p) / (‖u‖ ‖p‖) ),

where a, b, c, d, and e are the successive joint positions of the finger from the metacarpal base to the fingertip. The other two angles were calculated using the same approach.

Figure 5: Hand features: the distal (α), proximal (β), and intermediate (γ) bone angles measured against the metacarpal bone in the leap motion coordinate frame (+X, +Y, +Z).
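Equation (1) translates directly into a few lines of vector arithmetic. The sketch below assumes the five joint positions a through e of one finger are available as 3D numpy arrays (the leap motion SDK exposes per-bone joint positions, but the function and variable names here are illustrative, not the authors' code):

```python
import numpy as np

def bone_angles(a, b, c, d, e):
    """Angles of the distal, proximal, and intermediate bones relative
    to the metacarpal bone, per Equation (1). a..e are the 3D joint
    positions from the metacarpal base to the fingertip (assumed names)."""
    p = b - a                       # metacarpal bone vector
    q = c - b                       # proximal bone vector
    r = d - c                       # intermediate bone vector
    u = e - d                       # distal bone vector

    def angle(v, w):
        cos = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
        return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards rounding

    alpha = angle(u, p)             # distal vs. metacarpal
    beta = angle(q, p)              # proximal vs. metacarpal
    gamma = angle(r, p)             # intermediate vs. metacarpal
    return alpha, beta, gamma
```

Because all three angles are measured against the metacarpal bone of the same hand, they remain unchanged when the whole hand trembles or rotates, which is the variance-compensation property the feature set is built around.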
(3) Fingertip velocity. To detect the defined dynamic gestures, the fingertip velocity of the index finger was considered, with two separate inputs for the magnitude and the direction of the velocity vector. The mean fingertip velocity of the other fingers was also considered to detect finger movements. All the properties considered are shown in Figure 4.

(4) Palm velocity. To detect the palm movement of the hand, the palm velocity magnitude and direction were considered as inputs. The palm orientation angles were also input features for detecting dynamic gestures.

Algorithm 1: Gesture Clarification Algorithm.

  F1 = fingertip velocity of Finger 1
  F2 = fingertip velocity of Finger 2
  F3 = mean fingertip velocity of Fingers 3 and 4
  K = captured frame from the leap motion sensor
  n = frames taken from the leap motion sensor at 0.25 s intervals

  Require: F1, F2, F3, V1
  Ensure: Activation Module
  for K in n do
    if F1 > 0 or F2 > 0 or F3 > 0 and V1 > 0 then
      Change state to waiting
    end if
  end for
  if F1 > 0 or F2 > 0 or F3 > 0 and V1 > 0 then
    Activate DGIM
    Activate SCM
  else
    Activate SGIM
  end if

Algorithm 2: DGIM Activation Algorithm.

  Require: activate
  Ensure: DGIM activation
  if activate = 0 then
    DGIM_activate = 1
    if gesture = "Turn off" then
      Activate waiting state
      DGIM_activate = 0
    else
      Deactivate waiting state
      DGIM_activate = 1
    end if
  end if
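Read together, Algorithms 1 and 2 amount to a small routing state machine: sustained fingertip and palm motion sends the gesture to the DGIM (and SCM), stillness sends it to the SGIM, and a recognized "Turn off" holds the system in a waiting state until "Turn on" arrives. A minimal Python sketch of that logic follows; the frame representation and the operator grouping in the motion test are assumptions, not details fixed by the paper:

```python
def clarify(frames):
    """Algorithm 1 sketch: choose the module(s) to activate from fingertip
    velocities (F1, F2, F3) and palm velocity (V1) over frames sampled at
    0.25 s intervals. The or/and grouping is an assumed reading."""
    moving = any(
        (f["F1"] > 0 or f["F2"] > 0 or f["F3"] > 0) and f["V1"] > 0
        for f in frames
    )
    return ("DGIM", "SCM") if moving else ("SGIM",)

def dgim_step(gesture, waiting):
    """Algorithm 2 sketch: a 'Turn off' gesture suppresses the DGIM output
    until 'Turn on' is seen. Returns (new_waiting_state, dgim_active)."""
    if waiting:
        if gesture == "Turn on":
            return False, 1        # resume normal operation
        return True, 0             # ignore everything else while waiting
    if gesture == "Turn off":
        return True, 0             # enter waiting state
    return False, 1                # normal operation
```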
3.5. Gesture Classification. Artificial Neural Networks (ANNs) have been developed to identify and clarify dynamic and static gestures. The Static Gesture Identification Module (SGIM) and the Dynamic Gesture Identification Module (DGIM) each consist of an ANN. The Gesture Clarifier (GC) uses Algorithms 1 and 2 to distinguish dynamic gestures from static gestures. The GC prioritizes dynamic gestures, since critical commands like "Turn off" and "Turn on" are defined in the DGIM, and it controls the system state based on prioritized commands. If the received navigation command is "Turn off," the GC isolates the Gesture Identifier (GI) from the DGIM and SGIM and waits for the next command to be "Turn on." Moreover, when a gesture confirmation is identified by the GC and SCM, the appropriate submodule is activated.

The SGIM consists of an ANN with 14 inputs (B1, B2, B3, B4, B5, B6, M1, M2, M3, T1, T2, P1, P2, and P3), two hidden layers, and an output layer with four outputs (N1, N2, N3, and N4); the output of the SGIM represents a static navigation command number from 1 to 12. The DGIM consists of an ANN that takes C1, C2, Q1, Q2, P1, P2, P3, and V1 as inputs and has four outputs (N5, N6, N7, and N8); its output represents a dynamic navigation command number from 1 to 12. The outputs of both the SGIM and the DGIM are binary numbers, and both ANNs use a sigmoidal activation function.
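For concreteness, the following sketch builds an SGIM-like network with scikit-learn: 14 inputs, two hidden layers, logistic (sigmoid) activations, and four binary outputs whose bit pattern encodes the command number. The hidden-layer sizes and the training data here are placeholders; the paper does not report them:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder training data: rows are 14-dimensional static feature
# vectors (B1-B6, M1-M3, T1-T2, P1-P3); targets are 4-bit output
# patterns (N1-N4) encoding command numbers. Real data would come
# from the leap motion feature extraction described above.
rng = np.random.default_rng(0)
X_train = rng.random((200, 14))
y_train = rng.integers(0, 2, size=(200, 4))

sgim = MLPClassifier(
    hidden_layer_sizes=(20, 20),   # two hidden layers; sizes assumed
    activation="logistic",         # sigmoidal activation, as in the paper
    max_iter=2000,
)
sgim.fit(X_train, y_train)         # multilabel fit: one output unit per bit

bits = sgim.predict(X_train[:1])[0]
command_no = int("".join(str(b) for b in bits), 2)  # 4-bit pattern -> number
```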
4. Result and Discussion

To evaluate the validity and accuracy of the proposed intelligent gesture system, experiments were carried out from two aspects: (a) accuracy of remembering and recalling the defined gestures and (b) accuracy and robustness of the intelligent hand gesture recognition system. The system was implemented on the intelligent wheelchair explained in [20]. To carry out the experiments, a group of 20 wheelchair users was randomly selected from three age groups: 20 to 30, 30 to 55, and over 55 years. Participants were generally healthy, with no cognitive impairments, apart from mobility impairment of the legs. The research platform during the experiment session is shown in Figures 6(a) and 6(b).

The implementation of the developed intelligent system requires a high-performing smart wheelchair with fast and reliable computing power. For this purpose, we used a wheelchair robot developed in our laboratory that has basic navigational capabilities. In this wheelchair, we installed an industrial-grade high-end computer with 32 GB of DDR4 SO-DIMM memory and a 6th-generation quad-core i7 processor (3.6 GHz); to increase the computational capacity, a 1 TB SSD is installed. Built for high performance and rugged operation, it can withstand temperatures from -20 °C to 60 °C. These are essential for the intelligent system to work properly, since training and execution take considerable computational power.

Figure 6: Research platform during an experiment session.

4.1. Experiment I. A detailed presentation of the navigation commands and relevant hand gestures was shown to each participant, and they were asked to memorize the commands and gestures for 15 minutes. Then, each participant was asked to recall the relevant hand gesture for a randomly given navigation command; the percentage accuracy of recalling the hand gesture for each navigation command was recorded as Exp. 1. In the next step, each participant was asked to recall the navigation command for a randomly given hand gesture; the percentage accuracy of recalling the navigation command for each hand gesture was recorded as Exp. 2. After that, participants were asked to recall all the navigation commands in one go; the percentage accuracy of recalling a navigation command in this step was recorded as Exp. 3. Finally, each participant was given a fixed navigation path, selected to cover all the identified navigation commands, and was asked to guide themselves with the hand gestures defined in the system; the percentage accuracy of remembering each gesture in a task situation was recorded as Exp. 4. The recorded data is presented in Table 2. The boxplots in Figures 7(a) and 7(b) show the remembering and recalling capability of each dynamic and static hand gesture.

Table 2: Result of experiment I.

No. | Navigation command | Exp. 1 | Exp. 2 | Exp. 3 | Exp. 4
a   | Go forward         |  98%   |  99%   |  99%   |  99%
b   | Stop               |  99%   |  99%   |  98%   | 100%
c   | Go backward        |  98%   |  98%   |  97%   |  98%
d   | Hard left          |  95%   |  97%   |  97%   |  94%
e   | Hard right         |  95%   |  97%   |  97%   |  95%
f   | Slightly left      |  94%   |  95%   |  98%   |  89%
g   | Slightly right     |  93%   |  95%   |  98%   |  91%
h   | Turn around        |  95%   |  95%   |  98%   |  98%
i   | Slow down          |  96%   |  97%   |  97%   |  99%
j   | Go faster          |  95%   |  96%   |  98%   |  99%
k   | Turn off           | 100%   | 100%   |  98%   | 100%
l   | Turn on            |  99%   | 100%   | 100%   | 100%

Figure 7: Boxplots representing the remembering and recalling capability of each dynamic and static hand gesture: (a) gestures a–e; (b) gestures f–l.

4.2. Experiment II. Participants were given a specific navigation task to complete using hand gestures. The navigation path of the task was planned so that all gestures were utilized; the navigation task and fixed path are given in Figure 8. Each participant had to guide themselves using the hand gestures, and the proposed system classified those gestures. This process was repeated for all the participants, and the system recognition accuracies were recorded for each hand gesture. The rates of success and failure in recognizing a particular hand gesture are given in the confusion matrices in Tables 3–5.

Figure 8: Experiment II setup. Participants were asked to give gesture instructions for the path A > B > C > D > E > F > E > D > C > G.

In experiment I, participants showed almost perfect memory of basic navigation commands such as "Go forward," "Stop," "Go backward," and "Turn on/off." The recall accuracy of most navigation commands was in the high 90s, except for the "Slightly right/left" commands. As Table 2 shows, Exp. 2 accuracies are higher than Exp. 1 accuracies; therefore, it can be deduced that recalling the navigation command for a given hand gesture is the easier direction. Exp. 4 values are slightly lower than the other accuracy values, as recalling hand gestures during a task is tougher than in any other situation. Overall, almost all accuracy values are higher than 90%, and the most critical gestures, such as Turn on/off, have almost perfect recall accuracy. Therefore, the proposed gesture system can be considered user friendly and easy to memorize.

In experiment II, three confusion matrices were created in order to validate the recognition accuracies. For the two hand gesture recognition subsystems, static and dynamic, recognition accuracies were recorded for each hand gesture.

Table 3: Confusion matrix for identification of static gestures. Correct recognition rates (diagonal): a 0.99, b 0.97, c 0.98, d 0.99, e 0.99; corresponding misclassification rates: 0.01, 0.03, 0.02, 0.01, 0.01. Observed kappa 0.9648, standard error 0.0145, .95 confidence interval 0.9363 to 0.9933.

For all the static gestures, accuracy is over 90%, as shown in Table 3. In the static gesture matrix, Cohen's kappa value was calculated with linear weighting, using equal weights.

Table 4: Confusion matrix for identification of dynamic gestures. Correct recognition rates (diagonal): f 0.95, g 0.99, h 0.96, i 0.97, j 0.97, k 1.00, l 1.00; corresponding misclassification rates: 0.05, 0.01, 0.04, 0.03, 0.03, 0.00, 0.00. Observed kappa 0.9879, standard error 0.003, .95 confidence interval 0.9819 to 0.9939.

For all the dynamic gestures, recognition accuracy is over 90%, and the overall accuracy is higher than that of the static gesture recognition subsystem; hence, the use of a larger number of dynamic gestures than static gestures in the system is validated. Cohen's kappa value was also calculated for this matrix with linear weighting, as shown in Table 4; critical dynamic gestures such as "Turn on/off" were weighted with two points and other gestures with one point. Since the kappa values for both recognition subsystems are over 0.81, it can be concluded that the systems are working properly.

For the gesture type selection system, a confusion matrix was created, and the overall accuracy, misclassification rate, precision, and Cohen's kappa values were calculated.

Table 5: Confusion matrix for identification of dynamic and static gestures.

        | Static | Dynamic | Total
Static  |   24   |    1    |  25
Dynamic |    2   |   23    |  25
Total   |   26   |   24    |

Accuracy: 0.94; Misclassification rate: 0.06; Precision: 0.95; Cohen's kappa: 0.88 (>0.81).

The overall accuracy is 0.94 (>0.90) and the kappa value is over 0.81; therefore, it can be concluded that the selection system is also working properly.
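The summary figures in Table 5 follow directly from its 2 × 2 confusion matrix, and Cohen's kappa can be checked in a few lines (observed agreement 0.94, chance agreement from the marginals 0.50, hence kappa 0.88):

```python
import numpy as np

# Table 5: rows = true class, columns = predicted class
# (static, dynamic), 50 trials in total.
cm = np.array([[24, 1],
               [2, 23]])

total = cm.sum()
accuracy = np.trace(cm) / total                  # (24 + 23) / 50 = 0.94
misclassification = 1 - accuracy                 # 0.06
# Chance agreement from the row and column marginals:
pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total**2   # 0.50
kappa = (accuracy - pe) / (1 - pe)               # (0.94 - 0.50) / 0.50 = 0.88
print(accuracy, misclassification, kappa)
```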
5. Conclusions

This paper proposed a novel method to identify hand gestures related to navigation, based on a gesture recognition model with compensation for user variances. An intelligent gesture identification system was introduced in order to clarify gestures with high precision. Bone angles with respect to the metacarpal bone were introduced as novel features in order to improve the identification of gesture variances. The system is capable of eliminating complications due to a user's inability to execute precise hand gestures. An intelligent clarification system has been implemented to separate static and dynamic hand gestures. Experimental results confirmed that wheelchair users with speech disabilities can remember and recall the proposed hand gesture system. Therefore, the proposed gesture model can be considered user friendly, and it is concluded that the proposed intelligent gesture recognition system can recognize user hand gestures with high accuracy.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Research Council Grant Number 17-069 and the Center for Advanced Robotics (CAR), University of Moratuwa.
References

[1] S. Yusif, J. Soar, and A. Hafeez-Baig, "Older people, assistive technologies, and the barriers to adoption: a systematic review," International Journal of Medical Informatics, vol. 94, pp. 112–116, 2016.
[2] R. Mostaghel, "Innovation and technology for the elderly: systematic literature review," Journal of Business Research, vol. 69, no. 11, pp. 4896–4900, 2016.
[3] C. Aouatef, B. Iman, and C. Allaoua, "Multi-agent system in ambient environment for assistance of elderly sick peoples," in Proceedings of the International Conference on Intelligent Information Processing, Security and Advanced Communication, pp. 1–5, ACM, 2015.
[4] M. Kavussanu, C. Ring, and J. Kavanagh, "Antisocial behavior, moral disengagement, empathy and negative emotion: A comparison between disabled and able-bodied athletes," Ethics & Behavior, vol. 25, no. 4, pp. 297–306, 2015.
[5] M. Mast, M. Burmester, B. Graf et al., "Design of the human-robot interaction for a semi-autonomous service robot to assist elderly people," in Ambient Assisted Living, pp. 15–29, Springer, 2015.
[6] M. A. V. Muthugala, P. H. D. A. Srimal, and A. G. B. Jayasekara, "Enhancing interpretation of ambiguous voice instructions based on the environment and the user's intention for improved human-friendly robot navigation," Applied Sciences, vol. 7, no. 8, p. 821, 2017.
[7] M. Foukarakis, M. Antona, and C. Stephanidis, "Applying a multimodal user interface development framework on a domestic service robot," in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, pp. 378–384, ACM, 2017.
[8] C. Georgoulas, A. Raza, J. Güttler, T. Linner, and T. Bock, "Home environment interaction via service robots and the leap motion controller," in Proceedings of the 31st International Symposium on Automation and Robotics in Construction (ISARC).
[9] J. Guerrero-García, C. González, and D. Pinto, "Studying user-defined body gestures for navigating interactive maps," in Proceedings of the XVIII International Conference on Human Computer Interaction, ACM.
[10] P. Viswanathan, E. P. Zambalde, G. Foley et al., "Intelligent wheelchair control strategies for older adults with cognitive impairment: user attitudes, needs, and preferences," Autonomous Robots, vol. 41, no. 3, pp. 539–554, 2017.
[11] M. A. V. J. Muthugala and A. G. B. P. Jayasekara, "Interpreting uncertain information related to relative references for improved navigational command understanding of service robots," in Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 6567–6574, 2017.
[12] H. R. T. Bandara, M. V. J. Muthugala, A. B. P. Jayasekara, and D. P. Chandima, "Grounding object attributes through interactive discussion for building cognitive maps in service robots," in Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), IEEE.
[13] H. M. R. T. Bandara, M. A. V. Muthugala, A. G. B. P. Jayasekara, and D. P. Chandima, "Cognitive spatial representative map for interactive conversational model of service robot," in Proceedings of RO-MAN 2018 - IEEE International Conference on Robot and Human Interactive Communication, IEEE.
[14] M. V. J. Muthugala and A. B. P. Jayasekara, "Enhancing human-robot interaction by interpreting uncertain information in navigational commands based on experience and environment," in Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 2915–2921, IEEE, 2016.
[15] H. M. R. T. Bandara, B. M. S. S. Basnayake, A. G. B. P. Jayasekara, and D. P. Chandima, "Cognitive navigational command identification for mobile robots based on hand gestures and vocal navigation commands," in Proceedings of the 2nd International Conference on Electrical Engineering (EECon), IEEE.
[16] E. Wästlund, K. Sponseller, O. Pettersson, and A. Bared, "Evaluating gaze-driven power wheelchair with navigation support for persons with disabilities," Journal of Rehabilitation Research and Development, vol. 52, no. 7, pp. 815–826.
[17] S. Pushp, B. Bhardwaj, and S. M. Hazarika, "Cognitive decision making for navigation assistance based on intent recognition," in International Conference on Mining Intelligence and Knowledge Exploration, pp. 81–89, Springer, 2017.
[18] D. Xu, X. Wu, Y. L. Chen, and Y. Xu, "Online dynamic gesture recognition for human robot interaction," Journal of Intelligent & Robotic Systems, vol. 77, no. 3-4, pp. 583–596, 2015.
[19] M. Sreejith, S. Rakesh, S. Gupta, S. Biswas, and P. P. Das, "Real-time hands-free immersive image navigation system using Microsoft Kinect 2.0 and Leap Motion Controller," in 2015 Fifth National Conference on Computer Vision, Pattern Recognition, Image Processing and Graphics (NCVPRIPG), pp. 1–4, IEEE, 2015.
[20] H. G. M. T. Yashoda, A. M. S. Piumal, P. G. S. P. Polgahapitiya, M. M. M. Mubeen, M. A. V. J. Muthugala, and A. G. B. P. Jayasekara, "Design and development of a smart wheelchair with multiple control interfaces," in 2018 Moratuwa Engineering Research Conference (MERCon), pp. 324–329, IEEE, 2018.
[21] A. Chan, T. Halevi, and N. Memon, "Leap motion controller for authentication via hand geometry and gestures," in International Conference on Human Aspects of Information Security, Privacy, and Trust, pp. 13–22, Springer, Cham, 2015.
[22] G. Canal, S. Escalera, and C. Angulo, "A real-time human-robot interaction system based on gestures for assistive scenarios," Computer Vision and Image Understanding, vol. 149, pp. 65–77, 2016.
[23] Z. Ren, J. Meng, J. Yuan, and Z. Zhang, "Robust hand gesture recognition with kinect sensor," in Proceedings of the 19th ACM International Conference on Multimedia, pp. 759-760, ACM, 2011.
[24] A. Kurakin, Z. Zhang, and Z. Liu, "A real time system for dynamic hand gesture recognition with a depth sensor," in Proceedings of the 20th European Signal Processing Conference (EUSIPCO), pp. 1975–1979, IEEE, 2012.
[25] Y. Nam and K. Wohn, "Recognition of space-time hand-gestures using hidden Markov model," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology, pp. 51–58, ACM, 1996.
[26] J. Singha and R. H. Laskar, "Recognition of global hand gestures using self co-articulation information and classifier fusion," Journal on Multimodal User Interfaces, vol. 10, no. 1, pp. 77–93, 2016.
[27] J. Singha and R. H. Laskar, "Hand gesture recognition using two-level speed normalization, feature selection and classifier fusion," Multimedia Systems, vol. 23, no. 4, pp. 499–514, 2017.
[28] Z. Lu, X. Chen, Q. Li, X. Zhang, and P. Zhou, "A hand gesture recognition framework and wearable gesture-based interaction prototype for mobile devices," IEEE Transactions on Human-Machine Systems, vol. 44, no. 2, pp. 293–299, 2014.
[29] N. H. Hughes, Quaternion to/from Euler angle of arbitrary rotation sequence & direction cosine matrix conversion using geometric methods, vol. 2, 2017.

An Intelligent Gesture Classification Model for Domestic Wheelchair Navigation with Gesture Variance Compensation

Loading next page...
 
/lp/hindawi-publishing-corporation/an-intelligent-gesture-classification-model-for-domestic-wheelchair-uqCxvIg0Dq

References (35)

Publisher
Hindawi Publishing Corporation
Copyright
Copyright © 2020 H. M. Ravindu T. Bandara et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
ISSN
1176-2322
eISSN
1754-2103
DOI
10.1155/2020/9160528
Publisher site
See Article on Publisher Site

Abstract

Hindawi Applied Bionics and Biomechanics Volume 2020, Article ID 9160528, 11 pages https://doi.org/10.1155/2020/9160528 Research Article An Intelligent Gesture Classification Model for Domestic Wheelchair Navigation with Gesture Variance Compensation 1 1 1 H. M. Ravindu T. Bandara , K. S. Priyanayana , A. G. Buddhika P. Jayasekara, 1 2 D. P. Chandima , and R. A. R. C. Gopura Intelligent Service Robotic Group, Department of Electrical Engineering, University of Moratuwa, Moratuwa 10400, Sri Lanka Bionics Laboratory, Department of Mechanical Engineering, University of Moratuwa, Moratuwa 10400, Sri Lanka Correspondence should be addressed to H. M. Ravindu T. Bandara; ravitharaka11@gmail.com Received 28 August 2019; Revised 2 December 2019; Accepted 4 January 2020; Published 30 January 2020 Academic Editor: Andrea Cereatti Copyright © 2020 H. M. Ravindu T. Bandara et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Elderly and disabled population is rapidly increasing. It is important to uplift their living standards by improving the confidence towards daily activities. Navigation is an important task, most elderly and disabled people need assistance with. Replacing human assistance with an intelligent system which is capable of assisting human navigation via wheelchair systems is an effective solution. Hand gestures are often used in navigation systems. However, those systems do not possess the capability to accurately identify gesture variances. Therefore, this paper proposes a method to create an intelligent gesture classification system with a gesture model which was built based on human studies for every essential motion in domestic navigation with hand gesture variance compensation capability. Experiments have been carried out to evaluate user remembering and recalling capability and adaptability towards the gesture model. Dynamic Gesture Identification Module (DGIM), Static Gesture Identification Module (SGIM), and Gesture Clarifier (GC) have been introduced in order to identify gesture commands. The proposed system was analyzed for system accuracy and precision using results of the experiments conducted with human users. Accuracy of the intelligent system was determined with the use of confusion matrix. Further, those results were analyzed using Cohen’s kappa analysis in which overall accuracy, misclassification rate, precision, and Cohen’s kappa values were calculated. 1. Introduction domestic environment without getting help creates common issues like anxiety, anger, and depression which leads to poor Assistive technology for elderly and disabled people is an health conditions [4]. Therefore, it is obvious that assistive technologies should upgrade in a more intelligent manner expeditiously growing field [1, 2]. Many researches are to make human life more comfortable and healthier [5]. focused on edifying living standards of human life. Common issue with most elderly and disabled persons is navigation. Human prefers to use multiple modalities such as voice and gestures to interact with each other in a domestic envi- Since it is hard for them to move around, they need some assistance from another person or a machine. However, ronment [6–8]. Gestures included hand gestures, facial expressions, or cues which are difficult to understand even assisting is not sufficient when there are communication for human beings. 
Furthermore, vocal and gestural expres- problems [3]. It is hard to navigate in a domestic environ- ment with difficulties to communicate accurately. Disabled sions can be integrated to create navigation commands [9]. As an example, a person in a wheelchair might say “go there” and elderly people are increasingly observed with speech dis- orders such as apraxia of speech, stuttering, and dysarthrias. and the person can integrate hand gesture to the sentence by showing which direction that he wants to move [10]. Vocal Hence, vocal interaction becomes difficult to those among command may contain uncertain terms such as distance this community. Moreover, incapability to navigate in a 2 Applied Bionics and Biomechanics Another method has been proposed in [21] to recognize and direction expressing terms like “near,”“far,”“middle,” “left,” and “right” or place expressing terms like “here” and dynamic hand gestures using a leap motion sensor. This sys- “there” [11]. Interpreting uncertain terms are a difficult task tem can recognize simple dynamic gestures such as swipe, tap, and drawing circle to authenticate logins. However, this for a robot. Moreover, such interpretation depends on vari- ous factors such as user experience, eyesight, environment, system cannot recognize complex dynamic gestures used in and cognitive feedbacks from the environment [12–14]. a domestic navigation task. A dynamic and static gesture rec- Consideration of those factors makes vocal command ognition method has been proposed in [22] for an assistive interpretation extremely difficult. However, using an intelli- robot. This system can recognize simple dynamic gestures such as waving and nodding while simple pointed gestures gent system with a capability to understand such phrases can be unlikely to be used by humans because small error can be identified in locating places. However, this system or misinterpretation made by the system can do critical dam- cannot recognize dynamic motion commands of hand and age to a disabled or elderly human being. static commands used for navigation. Another weakness of Most elderly people show speech difficulties which make this system is the lack of flexibility in using separate fingers and the lack of real-time gesture recognition. Most dynamic it difficult to clarify voice commands given by them using speech recognition systems. Moreover, voice commands and static gestures use separate fingers. include various types of uncertainties such as time related, The hand recognition system proposed in [23] is using frequency, distance, and direction which make it hard to a Kinect sensor to get the depth map and the color map. understand. As an example, “go there” and “come here” have The use of the depth map with a color map has increased the robustness of the gesture recognition, and the Finger- position uncertainty and commands like “wait here” and “give me a minute” have time-related uncertainty. Gestures Earth Mover’s Distance method has been used to remove are widely used in navigation and also have variances any input variation or the color distortions. As this [15]. However, comparing to vocal commands, gestures method only considers distance between fingers, move- show a more detailed instruction which lead to a more ment of fingers against each other will not be detected. accurate decision of an intelligent system. 
If an intelligent These types of finger tremors cause gesture variances system is capable of interpreting gestures into navigation which will not be recognized in this setup. The purpose commands and a person could give navigation commands of this article is to develop a simple yet unique gesture using hand gestures for all the essential navigation tasks, it system to help navigate in domestic environments com- will be a more simple and efficient method for systems pensating above-mentioned gesture variances. The method like intelligent wheelchairs [16]. Such systems should pos- proposed in [24] has used depth image to identify real- sess a capability to interpret hand gestures while under- time dynamic hand gestures through a Hidden Markov standing variances and would be like gesture alphabet for Model (HMM). Dynamic hand gesture variances consider- navigation. Moreover, those gestures should be easily ing hand orientations, speed, and styles have been consid- remembered and able to express every essential task that ered in this system. However, miniscule variances such as may be required by a disabled or elderly person on a finger orientations, finger bone orientations, and finger wheelchair. Furthermore, misunderstanding of unintended speeds have not been considered in this system. There is hand movements can create critical situations. Therefore, another method that has used the HMM to space-time safety precautions should be also considered [17]. hand movement pattern in a 3D space [25]. In this A method has been proposed in [18] to recognize method, they have considered hand movement, palm ori- dynamic hand gestures of human hand using a RGB depth entation in a 3D space to compensate for the hand gesture camera. The system is capable of automatically recognizing variances or tremors. However, it fails to identify the fin- hand gestures against a complicated background. However, ger movements against the palm orientation usually seen the system is capable of identifying limited navigation com- among elderly. There are many hand gesture recognition mands, and the system is designed to control a mobile robot systems that have been developed in order to recognize using hand gestures. Therefore, the robot can perform only most static and dynamic hand gestures. However, very the basic robot motions. A real-time hands-free immersive few have tried to compensate the involuntary hand gesture image navigation system that can respond to various gestures variance. Systems introduced in [26, 27] have tried to and voice commands has been proposed in [19]. The system define more features in order to minimize all static and has a capability to identify a wide range of hand and finger dynamic variances or tremors. To avoid overfitting and gestures and voice commands using Kinect and leap motion redundancy, they have used 2 level classifier fusion to filter sensors. However, the system is specifically designed for out the unnecessary features. Even with about 44 features, image navigation, and it does not possess any motion naviga- individual classifiers, and 2 level fusions, the system in tion understanding capability. [26] has failed to compensate the finger tremors. Since An intelligent wheelchair with hand gesture recogni- they [26, 27] have not considered finger angles against tion facility is developed in [20]. The wheelchair can be the palm orientation or bone angles, fusion of those fea- controlled through basic hand gestures such as FOR- tures into their methods become tediously difficult. 
The WARD, BACKWARD, and RIGHT/LEFT. However, this system developed in [28] has introduced a gesture vocab- wheelchair is not capable of recognizing more complex ulary to operate a mobile phone as opposed to the system static and dynamic gestures, and recognition of tasks is proposed in this article. However, this system has consid- not in real time. ered both large scale hand gestures and small scale Applied Bionics and Biomechanics 3 State identification module Gesture recognition State controlling module Static gesture identification module Gesture clarifier Dynamic gesture identification module Gesture identifier Gesture memory Figure 1: System overview. Therefore, tremors in the elderly people will not be a gestures in which miniscule gesture variances matter. Bayesian linear classifier has been used in small scale ges- cause of confusion for the navigation system. The overall functionality of the proposed system is explained in Section tures while HMMs have been used in large scale gestures. However, finger movements, bone angles, or finger orien- 2. The proposed concept to create a gesture model and fea- tations which were not considered in the features and var- ture extraction process is explained in Section 3. Experi- iances in both static and dynamic gestures will not be mental results are presented and discussed in Section 4. compensated by using individual classifiers. Finally, the conclusion is presented in Section 5. Therefore, this paper presents a novel method to rec- ognize dynamic and static motion-related hand gestures 2. System Overview even with tremors, based on a gesture classification model for wheelchair users with speech disorders. A complete Overall functionality of the proposed system is shown gesture model with essential navigation commands is in Figure 1. The proposed system is capable of identi- defined. It can be used to navigate an intelligent wheel- fying static and dynamic gestures and interpreting those chair through a domestic environment. Elaborated feature gestures into navigation commands. Gesture Memory set is extracted in order to compensate for user variances (GM) is built based on identified gestures from a human that occur in gestures. study which are capable of creating every essential navi- The purpose of this article is to develop a hand gesture gation task in a domestic environment. Moreover, user’s model to help a wheelchair user to navigate in a domestic capability to remember and recall gestures is also environment. Therefore, the gestures designed have to evaluated. cover all possible navigation scenarios. These gestures will Gesture recognition module extracts the information of vary as static, dynamic, palm, and finger gestures. A sys- hand skeleton using a leap motion sensor, and extracted tem should be able to recognize not only both static and data is sent to the Gesture Clarifier (GC) for clarification dynamic gestures, but it should be able to compensate of gestures into static and dynamic gestures based on ges- hand and finger tremors happening among elderly. A sys- ture features. Static Gesture Identification Module (SGIM) tem should be able to identify different variations of the and Dynamic Gesture Identification Module (DGIM) same gesture from one user to the other. In summary, understand and identify the navigation command related none of the above existing systems was not specifically to the observed gesture. State Identification Module designed as a gesture model for navigation. 
3. Gesture Model

3.1. Human Study I: Identification of Navigation Commands. Natural human communication consists of multiple modalities, such as voice and hand gestures. Therefore, the defined hand gestures should be able to replace all possible motion instructions. In order to identify the commands used by wheelchair users during basic navigation, a human study was conducted. Twenty wheelchair users aged 55 to 70 participated in the study. Participants were asked to guide their wheelchair using hand gestures, voice, or multimodal interaction; identification of natural navigation commands was the priority, so the interaction method was not limited to hand gestures. The location was changed in order to cover all possible navigation scenarios. Participants did not have any prior knowledge of the locations or of previous study results; hence, the accuracy of the results was ensured and repetition of results was avoided. All possible navigation commands were recorded, the most frequent commands were identified, and a graphical representation of the identified command frequencies is given in Figure 2.

Figure 2: Frequency of navigation commands (Go forward 40%, Turn left 15%, Turn right 15%, Turn slightly left 6%, Turn slightly right 6%, Turn around 5%, Stop 5%, Slow down 2%, Others 6%).

The most frequent commands identified above were considered for the proposed gesture system.

3.2. Human Study II: Hand Gesture Identification. A human study was conducted in order to understand the hand gesture features used by wheelchair users for the identified navigational commands. A group of 20 people randomly selected from the same age group (55 to 70) participated in the study. Participants were asked to execute the basic navigation commands identified in human study I using only hand gestures. Data collected in this study were used to build the gesture system that is elaborated later. A leap motion sensor was used to track the hand gestures, and the raw data collected through it were processed to identify the gesture features. The most predominant hand feature associated with executing each command was recorded; the results are shown in Table 1. The hand features frequently used for each gesture served as the basis for feature extraction.

Two main types of hand gestures were identified: static gestures (pointers or poses) and dynamic gestures (hand movements). Static gestures were mainly used for subtle motion commands like Stop, Turn around, and Turn slightly left/right. For vigorous motion commands, participants used hand movements. Another important tendency was that participants used static and dynamic gestures fairly evenly. These commands also fell into two further types: finger-pointing gestures and palm-opening gestures. The number of fingers used by the participants was unpredictable in pointing gestures; mainly one or two fingers were used. Dynamic gestures were mainly used to express the movements and directions that a wheelchair needs to execute.
Table 1: An analysis of hand feature frequencies associated with navigation commands.

Navigation command | Palm orientation | Palm movement | Fingertip movement | Finger bones | Single finger | Multiple fingers
Go forward | 92% | 32% | 28% | 6% | 44% | 56%
Turn left | 84% | 75% | 42% | 18% | 8% | 48%
Turn right | 82% | 75% | 44% | 16% | 7% | 52%
Stop | 96% | 68% | 64% | 24% | 18% | 74%
Turn around | 98% | 56% | 86% | 77% | 14% | 84%
Slow down | 98% | 87% | 54% | 21% | 19% | 72%
Turn slightly left | 90% | 34% | 97% | 42% | 47% | 52%
Turn slightly right | 89% | 35% | 98% | 44% | 46% | 53%

3.3. Hand Gesture System. The navigation commands of a wheelchair user should cover all possible navigation scenarios. If the user has vocal abilities, the commands will include information covering exact instructions. For an intelligent wheelchair to work through hand gestures alone, the gestures should be simple, clear, and accurate. The proposed gesture system is based on all basic navigation scenarios, and its hand gestures are simple and clear. Of the hand gestures defined, dynamic gestures were used to represent motion instructions. The defined hand gestures are given in Figure 3.

The gesture system was built based on the following considerations:

(1) Defined hand gestures should be simple, clear, and accurate.
(2) Gestures should be defined in a way that a user can navigate through a path using a minimum number of gestures.
(3) A user should be able to remember and recall the defined hand gestures. To ensure users' adaptability to the gesture system, a human study was conducted; details of this study are explained in Sections 4.1 and 4.2.
(4) Significant differences should exist among the hand gestures, so that users will not confuse one gesture with another.
(5) The hand gesture system should have both static and dynamic gestures in order to mitigate inaccuracies caused by the leap motion sensor.

Figure 3: Navigation gestures: (a) Go forward, (b) Stop, (c) Go backward, (d) Hard left, (e) Hard right, (f) Slightly left, (g) Slightly right, (h) Turn around, (i) Slow down, (j) Go faster, (k) Turn off, and (l) Turn on.
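Since later modules refer to these commands by number (the classifier outputs in Section 3.5 encode a command number from 1 to 12 in binary), the vocabulary can be captured in a small lookup structure. The sketch below is illustrative only; the ordering follows the lettering of Figure 3, but the paper does not specify the exact numbering.

```python
from enum import IntEnum

# Illustrative encoding of the 12-command vocabulary of Figure 3. The
# mapping of gesture letters (a)-(l) to command numbers 1-12 is an
# assumption based on the figure's order.

class NavigationCommand(IntEnum):
    GO_FORWARD = 1      # (a)
    STOP = 2            # (b)
    GO_BACKWARD = 3     # (c)
    HARD_LEFT = 4       # (d)
    HARD_RIGHT = 5      # (e)
    SLIGHTLY_LEFT = 6   # (f)
    SLIGHTLY_RIGHT = 7  # (g)
    TURN_AROUND = 8     # (h)
    SLOW_DOWN = 9       # (i)
    GO_FASTER = 10      # (j)
    TURN_OFF = 11       # (k)
    TURN_ON = 12        # (l)

# A 4-bit binary code, as produced by the classifier output layers:
print(format(int(NavigationCommand.SLOW_DOWN), "04b"))  # -> '1001'
```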
3.4. Feature Extraction. Hand gestures accompanying vocal interaction tend to be both voluntary and involuntary, and they carry information such as direction and motion. For a wheelchair user with vocal disabilities, these hand gestures can be considered the primary modality. Even though there are established gesture systems such as American Sign Language, the execution of such gestures differs from one elderly person to another. To compensate for this variation, bone angles, as explained below, were used. The defined gesture system consists of two main forms of gestures: dynamic gestures and static gestures. Static gestures are nonmoving hand poses which can be modeled through basic hand features. Dynamic gestures are modeled using dynamic hand features such as finger movement and hand movement.

Figure 4: Features extracted from the hand skeleton (index fingertip velocity F, mean fingertip velocity of the other fingers, index bone angles B1-B3, middle bone angles B4-B6, mean bone angles M1-M3, thumb bone angles T1-T2, roll angle P1, pitch angle P2, yaw angle P3, and palm velocity V1).

Figure 5: Hand features (the angles of the distal, proximal, and intermediate bones with respect to the metacarpal bone).

(1) Palm orientation. The palm orientation was taken based on leap motion coordinates. The pitch, roll, and yaw angles of the palm depict the orientation: the pitch angle is the rotation around the +Y axis, the roll angle is the rotation around the +X axis, and the yaw angle is the rotation around the +Z axis. As illustrated in Figure 4, quaternions are used to obtain the yaw, pitch, and roll of the x, y, and z vectors relative to a single reference vector. Euler angles are commonly used because they can express the vectors relative to each other, but they have a limitation that quaternions address: with Euler angles it is difficult to interpolate smoothly between two orientations of an object [29].

(2) Finger bone angles. The bone angles of the fingers with respect to the metacarpal bone of the hand are extracted; these angles are shown in Figure 5. Hence, even when the fingers take an imprecise position, gesture recognition is not affected. As shown in Figure 5, the angles of the distal ($\alpha$), proximal ($\beta$), and intermediate ($\gamma$) bones with respect to the metacarpal bone were calculated using Equation (1). These angles were taken for the index finger and the middle finger. For the thumb, only the distal and proximal bone angles were taken, as the thumb does not have an intermediate bone. These three fingers were considered specifically, since most of the identified navigational gestures were associated with them. As the ring finger and pinky finger form a tightly associated couple, the averages of their distal, proximal, and intermediate angles were considered; none of the defined navigational gestures relies on sole ring or pinky finger features. It was, however, important to extract separate features for the other three fingers, as they appear individually in the hand gestures. Here, the directions of the metacarpal, proximal, intermediate, and distal bones are denoted by $\vec{p}$, $\vec{q}$, $\vec{r}$, and $\vec{u}$, respectively, where $\vec{a}, \vec{b}, \vec{c}, \vec{d}, \vec{e}$ are the joint positions from the metacarpal base to the fingertip (a worked numerical sketch follows this list):

$$\vec{p} = \vec{b} - \vec{a}, \quad \vec{q} = \vec{c} - \vec{b}, \quad \vec{r} = \vec{d} - \vec{c}, \quad \vec{u} = \vec{e} - \vec{d}, \quad \alpha = \cos^{-1}\!\left(\frac{\vec{u} \cdot \vec{p}}{\|\vec{u}\|\,\|\vec{p}\|}\right). \tag{1}$$

The other two angles ($\beta$ from $\vec{q}$ and $\gamma$ from $\vec{r}$) were calculated using the same approach.

(3) Fingertip velocity. To detect the defined dynamic gestures, the fingertip velocity of the index finger was considered, with two separate inputs for the magnitude and direction of the velocity vector. The mean fingertip velocity of the other fingers was also considered in order to detect finger movements. All the properties considered are shown in Figure 4.

(4) Palm velocity. To detect the palm movement of the hand, the palm velocity magnitude and direction were considered as inputs. The palm orientation angles were also input features for detecting dynamic gestures.
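The following sketch illustrates Equation (1) and the assembly of the 14 static-gesture inputs used later by the SGIM. The joint layout per finger, the axis conventions for the palm angles, and the dictionary keys are assumptions about the leap motion data rather than the authors' code; the palm-angle formulas follow the common leap motion convention and are likewise an assumption about the authors' setup.

```python
import numpy as np

def bone_angles(joints):
    """Distal, proximal, and intermediate angles of one finger against its
    metacarpal direction, per Equation (1). `joints` holds the five joint
    positions a..e from the metacarpal base to the fingertip."""
    a, b, c, d, e = (np.asarray(j, dtype=float) for j in joints)
    p = b - a                                  # metacarpal bone direction
    angles = []
    for bone in (e - d, c - b, d - c):         # distal, proximal, intermediate
        cos_t = np.dot(bone, p) / (np.linalg.norm(bone) * np.linalg.norm(p))
        angles.append(np.arccos(np.clip(cos_t, -1.0, 1.0)))  # clip vs. rounding
    return angles

def palm_orientation(direction, normal):
    """Roll (P1), pitch (P2), yaw (P3) from the palm direction and normal."""
    d = np.asarray(direction, dtype=float)
    n = np.asarray(normal, dtype=float)
    roll = np.arctan2(n[0], -n[1])
    pitch = np.arctan2(d[1], -d[2])
    yaw = np.arctan2(d[0], -d[2])
    return [roll, pitch, yaw]

def static_features(hand):
    """14-element SGIM input: index (B1-B3), middle (B4-B6), ring/pinky
    means (M1-M3), thumb (T1, T2), and palm orientation (P1-P3)."""
    b = bone_angles(hand["index"]) + bone_angles(hand["middle"])
    m = np.mean([bone_angles(hand["ring"]), bone_angles(hand["pinky"])], axis=0)
    # Thumb: distal and proximal only (no intermediate bone); here the thumb
    # is assumed to be supplied with a nonzero metacarpal segment.
    t = bone_angles(hand["thumb"])[:2]
    p = palm_orientation(hand["direction"], hand["normal"])
    return np.concatenate([b, m, t, p])

# Made-up joint coordinates for a quick check of Equation (1):
joints = [(0, 0, 0), (4, 0, 0), (7, 1, 0), (9, 2, 0), (10, 3, 0)]
print(np.degrees(bone_angles(joints)))  # alpha, beta, gamma in degrees
```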
3.5. Gesture Classification. Artificial Neural Networks (ANNs) are used to identify and classify dynamic and static gestures; the Static Gesture Identification Module (SGIM) and the Dynamic Gesture Identification Module (DGIM) each consist of an ANN. The Gesture Clarifier (GC) uses Algorithms 1 and 2 to distinguish dynamic gestures from static gestures.

Algorithm 1: Gesture Clarification Algorithm.
  F1 = fingertip velocity of Finger 1
  F2 = fingertip velocity of Finger 2
  F̄ = mean fingertip velocity of Fingers 3 and 4
  V1 = palm velocity
  K = captured frame from the leap motion sensor
  n = frames taken from the leap motion sensor at 0.25 s intervals
  Require: F1, F2, F̄, V1
  Ensure: activation of a module
  for K in n do
    if (F1 > 0 or F2 > 0 or F̄ > 0) and V1 > 0 then
      Change state to waiting
    end if
  end for
  if (F1 > 0 or F2 > 0 or F̄ > 0) and V1 > 0 then
    Activate DGIM
    Activate SCM
  else
    Activate SGIM
  end if

Algorithm 2: DGIM Activation Algorithm.
  Require: activate
  Ensure: DGIM activation
  if activate = 0 then
    DGIM_activate = 1
    if gesture = 'Turn off' then
      Activate waiting state
      DGIM_activate = 0
    else
      Deactivate waiting state
      DGIM_activate = 1
    end if
  end if

The GC prioritizes dynamic gestures, since critical commands like "Turn off" and "Turn on" are defined in the DGIM, and it controls the system state based on these prioritized commands. If the received navigation command is "Turn off," the GC isolates the Gesture Identifier (GI) from the DGIM and SGIM and waits for the next command to be "Turn on." Moreover, once a gesture confirmation is identified by the GC and SCM, the appropriate submodule is activated.

The SGIM consists of an ANN with 14 inputs (B1, B2, B3, B4, B5, B6, M1, M2, M3, T1, T2, P1, P2, and P3), two hidden layers, and an output layer with four outputs (N1, N2, N3, and N4). The output of the SGIM represents a static navigation command number from 1 to 12. The DGIM consists of an ANN with eight inputs (C1, C2, Q1, Q2, P1, P2, P3, and V1) and four outputs (N5, N6, N7, and N8); its output represents a dynamic navigation command number from 1 to 12. The outputs of both the SGIM and the DGIM are binary numbers, and both ANNs use a sigmoidal activation function.
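As a concrete illustration, the following hedged sketch assembles an SGIM-like network with scikit-learn. The hidden-layer widths, the training configuration, and the use of MLPClassifier are assumptions; the paper fixes only the 14 inputs, the two hidden layers, the four binary outputs, and the sigmoidal activation.

```python
from sklearn.neural_network import MLPClassifier

# Hedged sketch of the SGIM network: 14 inputs (B1-B6, M1-M3, T1, T2,
# P1-P3) and a 4-bit binary output (N1-N4) encoding the command number.

sgim_net = MLPClassifier(
    hidden_layer_sizes=(20, 12),  # assumed widths; the paper states only
                                  # that there are two hidden layers
    activation="logistic",        # sigmoidal activation, as in the paper
    max_iter=2000,
)

# X: (n_samples, 14) feature rows from static_features() above.
# Y: (n_samples, 4) binary command codes; MLPClassifier supports such
# multilabel indicator targets, so each output corresponds to one bit.
# sgim_net.fit(X_train, Y_train)
# bits = sgim_net.predict(x.reshape(1, -1))   # e.g. [[1, 0, 0, 1]]
```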
4. Results and Discussion

To evaluate the validity and accuracy of the proposed intelligent gesture system, experiments were carried out from two aspects: (a) the accuracy of remembering and recalling the defined gestures, and (b) the accuracy and robustness of the intelligent hand gesture recognition system. The system was implemented on the intelligent wheelchair explained in [20]. To carry out the experiments, a group of 20 wheelchair users was randomly selected from three age groups: 20 to 30, 30 to 55, and over 55 years. Participants were generally healthy, with no cognitive impairments, apart from mobility impairment of the legs. The research platform during the experiment session is shown in Figures 6(a) and 6(b).

Figure 6: Research platform during the experiment session.

The implementation of the developed intelligent system requires a high-performing smart wheelchair with fast and reliable computing power. For this purpose, we used a wheelchair robot developed in our laboratory that has basic navigational capabilities. In this wheelchair, we have installed an industrial-grade high-end computer with 32 GB of DDR4 SO-DIMM memory and a 6th-generation quad-core i7 processor (3.6 GHz). To increase the storage capacity, a 1 TB SSD is installed. For rugged operation, the computer can withstand temperatures from -20°C to 60°C. These resources are essential for the intelligent system to work properly, since training and execution require substantial computational power.

4.1. Experiment I. A detailed presentation of the navigation commands and the relevant hand gestures was shown to each participant, who was asked to memorize the commands and gestures for 15 minutes. Then, each participant was asked to recall the relevant hand gesture for a randomly given navigation command; the percentage accuracy of recalling the hand gesture for each navigation command was recorded as Exp. 1. In the next step, each participant was asked to recall the navigation command for a randomly given hand gesture; the percentage accuracy of recalling the navigation command for each hand gesture was recorded as Exp. 2. After that, participants were asked to recall all the navigation commands in one go; the percentage accuracy of recalling a navigation command was recorded as Exp. 3. Finally, each participant was given a fixed navigation path and asked to guide themselves with the hand gestures defined in the system; the navigation path was selected considering all the identified navigation commands, and the percentage accuracy of remembering each gesture in a task situation was recorded as Exp. 4. The recorded data is presented in Table 2, and the boxplots in Figures 7(a) and 7(b) show the remembering and recalling capability for each dynamic and static hand gesture.

Table 2: Result of experiment I.

Command no. | Navigation command | Exp. 1 | Exp. 2 | Exp. 3 | Exp. 4
a | Go forward | 98% | 99% | 99% | 99%
b | Stop | 99% | 99% | 98% | 100%
c | Go backward | 98% | 98% | 97% | 98%
d | Hard left | 95% | 97% | 97% | 94%
e | Hard right | 95% | 97% | 97% | 95%
f | Slightly left | 94% | 95% | 98% | 89%
g | Slightly right | 93% | 95% | 98% | 91%
h | Turn around | 95% | 95% | 98% | 98%
i | Slow down | 96% | 97% | 97% | 99%
j | Go faster | 95% | 96% | 98% | 99%
k | Turn off | 100% | 100% | 98% | 100%
l | Turn on | 99% | 100% | 100% | 100%

Figure 7: Boxplots representing the remembering and recalling capability of each dynamic and static hand gesture.

4.2. Experiment II. Participants were given a specific navigation task to complete using hand gestures. The navigation path of the task was planned so that all gestures were utilized; the navigation task and fixed path are given in Figure 8. Each participant had to guide themselves using the hand gestures, and the proposed system classified the gestures. This process was repeated for all the participants, and the system recognition accuracies were recorded for each hand gesture. The rates of success and failure in recognizing a particular hand gesture are given in the confusion matrices in Tables 3-5.

Figure 8: Experiment II setup. Participants were asked to give gesture instructions for the path A > B > C > D > E > F > E > D > C > G.

Table 3: Confusion matrix for identification of static gestures (a-e; the diagonal entries give the correctly identified fraction, with the remainder misclassified among the other static gestures).

Gesture | Correctly identified
a (Go forward) | 0.99
b (Stop) | 0.97
c (Go backward) | 0.98
d (Hard left) | 0.99
e (Hard right) | 0.99

Observed kappa: 0.9648; standard error: 0.0145; .95 confidence interval: 0.9363 to 0.9933.

Table 4: Confusion matrix for identification of dynamic gestures (f-l; the diagonal entries give the correctly identified fraction).

Gesture | Correctly identified
f (Slightly left) | 0.95
g (Slightly right) | 0.99
h (Turn around) | 0.96
i (Slow down) | 0.97
j (Go faster) | 0.97
k (Turn off) | 1.00
l (Turn on) | 1.00

Observed kappa: 0.9879; standard error: 0.003; .95 confidence interval: 0.9819 to 0.9939.

Table 5: Confusion matrix for identification of dynamic and static gestures.

Actual \ Predicted | Static | Dynamic | Total
Static | 24 | 1 | 25
Dynamic | 2 | 23 | 25
Total | 26 | 24 | 50

Accuracy: 0.94; misclassification rate: 0.06; precision: 0.95; Cohen's kappa: 0.88 (>0.81).

In experiment I, participants showed almost perfect memory of basic navigation commands such as "Go forward," "Stop," "Go backward," and "Turn on/off." The recalling accuracy of most navigation commands was in the high 90s, except for the "Slightly right/left" commands. As shown in Table 2, the Exp. 2 accuracies are higher than those of Exp. 1; it can therefore be deduced that recalling the navigation command for a given hand gesture is the easier direction. The Exp. 4 values are slightly lower than the other accuracy values, since recalling hand gestures during a task is tougher than in any other situation. Overall, almost all accuracy values are higher than 90%, and the most critical gestures, such as Turn on/off, have almost perfect recalling accuracy. It can therefore be concluded that the proposed gesture system is user friendly and easy to memorize.

In experiment II, three confusion matrices were created in order to validate the recognition accuracies. For the two hand gesture recognition systems, static and dynamic, recognition accuracies are shown for each hand gesture. For all the static gestures, the accuracy is over 90%, as shown in the confusion matrix in Table 3. In the static gesture matrix, Cohen's kappa value was calculated with linear weighting, using equal weights. For all the dynamic gestures, the recognition accuracy is also over 90%, and the overall accuracy is higher than that of the static gesture recognition system; hence, the use of a larger number of dynamic gestures than static gestures in the system is validated. Cohen's kappa value was also calculated for this matrix with linear weighting, as shown in Table 4; critical dynamic gestures such as "Turn on/off" were weighted with two points and the other gestures with one point. Since the kappa values for both recognition systems are over 0.81, the systems can be regarded as working properly. For the gesture type selection system, a confusion matrix was created, and the overall accuracy, misclassification rate, precision, and Cohen's kappa value were calculated (Table 5). The overall accuracy is 0.94 (>0.90) and the kappa value is over 0.81; therefore, it can be concluded that the selection system is also working properly.
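As a sanity check on Table 5, the following small sketch recomputes the reported summary statistics directly from the 2x2 confusion matrix; it reproduces the accuracy, misclassification rate, and Cohen's kappa values stated above.

```python
import numpy as np

# Table 5 as a 2x2 confusion matrix (rows: actual static/dynamic,
# columns: predicted static/dynamic).
cm = np.array([[24, 1],    # static gestures: 24 correct, 1 taken as dynamic
               [2, 23]])   # dynamic gestures: 2 taken as static, 23 correct

total = cm.sum()
accuracy = np.trace(cm) / total              # (24 + 23) / 50 = 0.94
misclassification = 1 - accuracy             # 0.06
precision_static = cm[0, 0] / cm[:, 0].sum()    # 24 / 26 ~ 0.92
precision_dynamic = cm[1, 1] / cm[:, 1].sum()   # 23 / 24 ~ 0.96

# Cohen's kappa: observed agreement corrected for chance agreement.
p_o = accuracy
p_e = (cm[0].sum() * cm[:, 0].sum()
       + cm[1].sum() * cm[:, 1].sum()) / total**2   # = 0.5 here
kappa = (p_o - p_e) / (1 - p_e)                      # (0.94 - 0.5) / 0.5 = 0.88

print(accuracy, misclassification, kappa)
```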
5. Conclusions

This paper proposed a novel method to identify navigation-related hand gestures based on a gesture recognition model that compensates for user variances. An intelligent gesture identification system was introduced in order to classify gestures with high precision. Bone angles with respect to the metacarpal bone were introduced as novel features in order to improve the identification of gesture variances. The system is capable of eliminating complications due to a user's inability to execute precise hand gestures. An intelligent clarification system has been implemented to separate static and dynamic hand gestures. Experimental results confirmed that wheelchair users with speech disabilities can remember and recall the proposed hand gesture system. Therefore, the proposed gesture model can be considered user friendly, and it is concluded that the proposed intelligent gesture recognition system can recognize user hand gestures with high accuracy.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Research Council Grant Number 17-069 and the Center for Advanced Robotics (CAR), University of Moratuwa.

References

[1] S. Yusif, J. Soar, and A. Hafeez-Baig, "Older people, assistive technologies, and the barriers to adoption: a systematic review," International Journal of Medical Informatics, vol. 94, pp. 112-116, 2016.
[2] R. Mostaghel, "Innovation and technology for the elderly: systematic literature review," Journal of Business Research, vol. 69, no. 11, pp. 4896-4900, 2016.
[3] C. Aouatef, B. Iman, and C. Allaoua, "Multi-agent system in ambient environment for assistance of elderly sick peoples," in Proceedings of the International Conference on Intelligent Information Processing, Security and Advanced Communication, pp. 1-5, ACM, 2015.
[4] M. Kavussanu, C. Ring, and J. Kavanagh, "Antisocial behavior, moral disengagement, empathy and negative emotion: a comparison between disabled and able-bodied athletes," Ethics & Behavior, vol. 25, no. 4, pp. 297-306, 2015.
[5] M. Mast, M. Burmester, B. Graf et al., "Design of the human-robot interaction for a semi-autonomous service robot to assist elderly people," in Ambient Assisted Living, pp. 15-29, Springer, 2015.
[6] M. A. V. Muthugala, P. H. D. A. Srimal, and A. G. B. Jayasekara, "Enhancing interpretation of ambiguous voice instructions based on the environment and the user's intention for improved human-friendly robot navigation," Applied Sciences, vol. 7, no. 8, p. 821, 2017.
[7] M. Foukarakis, M. Antona, and C. Stephanidis, "Applying a multimodal user interface development framework on a domestic service robot," in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, pp. 378-384, ACM, 2017.
[8] C. Georgoulas, A. Raza, J. Güttler, T. Linner, and T. Bock, "Home environment interaction via service robots and the leap motion controller," in Proceedings of the 31st International Symposium on Automation and Robotics in Construction (ISARC), 2014.
[9] J. Guerrero-García, C. González, and D. Pinto, "Studying user-defined body gestures for navigating interactive maps," in Proceedings of the XVIII International Conference on Human Computer Interaction, ACM.
[10] P. Viswanathan, E. P. Zambalde, G. Foley et al., "Intelligent wheelchair control strategies for older adults with cognitive impairment: user attitudes, needs, and preferences," Autonomous Robots, vol. 41, no. 3, pp. 539-554, 2017.
[11] M. A. V. J. Muthugala and A. G. B. P. Jayasekara, "Interpreting uncertain information related to relative references for improved navigational command understanding of service robots," in Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 6567-6574, 2017.
[12] H. R. T. Bandara, M. V. J. Muthugala, A. B. P. Jayasekara, and D. P. Chandima, "Grounding object attributes through interactive discussion for building cognitive maps in service robots," in Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), IEEE, 2018.
[13] H. M. R. T. Bandara, M. A. V. Muthugala, A. G. B. P. Jayasekara, and D. P. Chandima, "Cognitive spatial representative map for interactive conversational model of service robot," in Proceedings of the RO-MAN 2018 IEEE International Conference on Robot and Human Interactive Communication, IEEE, 2018.
[14] M. V. J. Muthugala and A. B. P. Jayasekara, "Enhancing human-robot interaction by interpreting uncertain information in navigational commands based on experience and environment," in Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 2915-2921, IEEE, 2016.
[15] H. M. R. T. Bandara, B. M. S. S. Basnayake, A. G. B. P. Jayasekara, and D. P. Chandima, "Cognitive navigational command identification for mobile robots based on hand gestures and vocal navigation commands," in Proceedings of the 2nd International Conference on Electrical Engineering (EECon), IEEE.
[16] E. Wästlund, K. Sponseller, O. Pettersson, and A. Bared, "Evaluating gaze-driven power wheelchair with navigation support for persons with disabilities," Journal of Rehabilitation Research and Development, vol. 52, no. 7, pp. 815-826, 2015.
[17] S. Pushp, B. Bhardwaj, and S. M. Hazarika, "Cognitive decision making for navigation assistance based on intent recognition," in International Conference on Mining Intelligence and Knowledge Exploration, pp. 81-89, Springer, 2017.
[18] D. Xu, X. Wu, Y. L. Chen, and Y. Xu, "Online dynamic gesture recognition for human robot interaction," Journal of Intelligent & Robotic Systems, vol. 77, no. 3-4, pp. 583-596, 2015.
[19] M. Sreejith, S. Rakesh, S. Gupta, S. Biswas, and P. P. Das, "Real-time hands-free immersive image navigation system using Microsoft Kinect 2.0 and Leap Motion Controller," in 2015 Fifth National Conference on Computer Vision, Pattern Recognition, Image Processing and Graphics (NCVPRIPG), pp. 1-4, IEEE, 2015.
[20] H. G. M. T. Yashoda, A. M. S. Piumal, P. G. S. P. Polgahapitiya, M. M. M. Mubeen, M. A. V. J. Muthugala, and A. G. B. P. Jayasekara, "Design and development of a smart wheelchair with multiple control interfaces," in 2018 Moratuwa Engineering Research Conference (MERCon), pp. 324-329, IEEE, 2018.
[21] A. Chan, T. Halevi, and N. Memon, "Leap motion controller for authentication via hand geometry and gestures," in International Conference on Human Aspects of Information Security, Privacy, and Trust, pp. 13-22, Springer, Cham, 2015.
[22] G. Canal, S. Escalera, and C. Angulo, "A real-time human-robot interaction system based on gestures for assistive scenarios," Computer Vision and Image Understanding, vol. 149, pp. 65-77, 2016.
[23] Z. Ren, J. Meng, J. Yuan, and Z. Zhang, "Robust hand gesture recognition with Kinect sensor," in Proceedings of the 19th ACM International Conference on Multimedia, pp. 759-760, ACM, 2011.
[24] A. Kurakin, Z. Zhang, and Z. Liu, "A real time system for dynamic hand gesture recognition with a depth sensor," in Proceedings of the 20th European Signal Processing Conference (EUSIPCO), pp. 1975-1979, IEEE, 2012.
[25] Y. Nam and K. Wohn, "Recognition of space-time hand-gestures using hidden Markov model," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology, pp. 51-58, ACM, 1996.
[26] J. Singha and R. H. Laskar, "Recognition of global hand gestures using self co-articulation information and classifier fusion," Journal on Multimodal User Interfaces, vol. 10, no. 1, pp. 77-93, 2016.
[27] J. Singha and R. H. Laskar, "Hand gesture recognition using two-level speed normalization, feature selection and classifier fusion," Multimedia Systems, vol. 23, no. 4, pp. 499-514, 2017.
[28] Z. Lu, X. Chen, Q. Li, X. Zhang, and P. Zhou, "A hand gesture recognition framework and wearable gesture-based interaction prototype for mobile devices," IEEE Transactions on Human-Machine Systems, vol. 44, no. 2, pp. 293-299, 2014.
[29] N. H. Hughes, Quaternion to/from Euler Angle of Arbitrary Rotation Sequence & Direction Cosine Matrix Conversion Using Geometric Methods, vol. 2, 2017.
