Get 20M+ Full-Text Papers For Less Than $1.50/day. Start a 14-Day Trial for You or Your Team.

Learn More →

Cartesian Control of Sit-to-Stand Motion Using Head Position Feedback

Cartesian Control of Sit-to-Stand Motion Using Head Position Feedback Hindawi Applied Bionics and Biomechanics Volume 2020, Article ID 1979342, 13 pages https://doi.org/10.1155/2020/1979342 Research Article Cartesian Control of Sit-to-Stand Motion Using Head Position Feedback 1 1 2 1 Samina Rafique , M. Najam-ul-Islam, M. Shafique, and A. Mahmood Electrical Engineering Department, Bahria University, Islamabad 44230, Pakistan Biomedical Engineering Department, Riphah International University, Islamabad 44230, Pakistan Correspondence should be addressed to Samina Rafique; samina.rafique@bui.edu.pk Received 6 February 2020; Revised 17 July 2020; Accepted 12 August 2020; Published 21 August 2020 Academic Editor: Raimondo Penta Copyright © 2020 Samina Rafique et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Sit-to-stand (STS) motion is an indicator of an individual’s physical independence and well-being. Determination of various variables that contribute to the execution and control of STS motion is an active area of research. In this study, we evaluate the clinical hypothesis that besides numerous other factors, the central nervous system (CNS) controls STS motion by tracking a prelearned head position trajectory. Motivated by the evidence for a task-oriented encoding of motion by the CNS, we adopt a robotic approach for the synthesis of STS motion and propose this scheme as a solution to this hypothesis. We propose an analytical biomechanical human CNS modeling framework where the head position trajectory defines the high-level task control variable. The motion control is divided into low-level task generation and motor execution phases. We model CNS as STS controller and its Estimator subsystem plans joint trajectories to perform the low-level task. The motor execution is done through the Cartesian controller subsystem that generates torque commands to the joints. We do extensive motion and force capture experiments on human subjects to validate our analytical modeling scheme. We first scale our biomechanical model to match the anthropometry of the subjects. We do dynamic motion reconstruction through the control of simulated custom human CNS models to follow the captured head position trajectories in real time. We perform kinematic and kinetic analyses and comparison of experimental and simulated motions. For head position trajectories, root mean square (RMS) errors are 0.0118 m in horizontal and 0.0315 m in vertical directions. Errors in angle estimates are 0.55 rad, 0.93 rad, 0.59 rad, and 0.0442 rad for ankle, knee, hip, and head orientation, respectively. RMS error of ground reaction force (GRF) is 50.26 N, and the correlation between ground reaction torque and the support moment is 0.72. Low errors in our results validate (1) the reliability of motion/force capture methods and anthropometric technique for customization of human models and (2) high-level task control framework and human CNS modeling as a solution to the hypothesis. Accurate modeling and detailed understanding of human motion can have significant scope in the fields of rehabilitation, humanoid robotics, and virtual characters’ motion planning based on high-level task control schemes. 1. Introduction tion, Centre of Mass (CoM), Centre of Gravity (CoG), and Center of Pressure (CoP) and kinetic variables like ground reaction forces (GRF), joint torques, and ground reaction Sit-to-stand (STS) movement is a skill that helps determine the functional level of a person. The ability to rise from sitting torques play an important role as feedback elements in STS to standing is critical to a person’s quality of life, as it is linked motion control [2]. with the functional independence of an individual. Studies on Of all sensory inputs, head position and orientation too the hierarchy of disability indicate that problem in STS starts are the area of researchers’ interest. There is ample clinical evidence that head position feedback to CNS plays a role in at a later stage than problems in walking commence [1]. Bio- mechanical literature is replete with studies aimed at deter- the control of human motion and maintenance of balance. mining various variables that contribute to STS motion. The study [3] shows that human motion control and mainte- Kinematic variables like joint positions, velocities, accelera- nance of balance by CNS rely on inputs from vision, 2 Applied Bionics and Biomechanics and rehabilitation devices. Custom human biomechanical proprioception, tactile/somatosensory, and vestibular sys- tems. The multisensory integration, combined with motion models are based upon Body Segment Parameter (BSP) control, undergoes both quick and slow alterations which values. Reimer et al. [13] have given an overview of methods available for the estimation of BSP. Weighing coefficient- are termed as fast and slow dynamics in CNS, respectively. For any voluntary motion, CNS anticipates set patterns of based methods are convenient but the error in results can inputs from multisensory systems. Vestibular sense, in con- be up to 40%. Geometric approaches are good (error less than junction with neck proprioception, estimates body orienta- 5%), but tedious as the number of body part measurements tion. The vestibular system senses linear and angular head can go even higher than 240. Medical imaging is also accurate (error < 5%) but needs expensive equipment in addition to motion, and the CNS uses this information for posture and gaze control [4]. A study in [5] suggested that visual percep- dangerous exposure to radiation. Among all these methods, tion played a role in balance control during STS. The role of marker-based motion capture system was reported as the head position feedback to CNS in smooth execution of STS is accurate one, despite its limitations in terms of cost, the need also studied in [6], and the dependence of the STS movement for a controlled environment, high sensitivity to noise, line of sight capture, etc. on the Centre of Mass (CoM) and head positions during the task is analyzed. To validate the modeling technique, the simulated To evaluate clinical hypotheses, computer simulations act motion profiles are compared with experimental results. In as a powerful tool. Human motion can be synthesized and [12], the proposed 6-link human model was checked for its analyzed in a simulation environment. Like all other accuracy using references from experimental data. The rela- tion of two kinetic variables, GRF and reaction moments, motions, the behavioral richness exhibited in natural human STS transfer results from a complex interplay of biomechan- was recorded from subjects and compared with the same ical and neurological factors [7]. An adequate understanding forces resulting in simulations [14]. The regression plots of of these factors is a prerequisite to understanding the overall two variables endorsed similarity between them during the mechanism of human STS motion as well as providing a gait cycle. The validation of the modeling scheme through the experimental result is also done in [8, 15, 16]. In [17], means for its synthesis. In a broader sense, basic constituents of the human motor system include biomechanical plants the researcher collected data of STS motion using infrared and CNS. Based on some specified task, CNS performs motor cameras and force plates and applied the data to a multiseg- ment biomechanical model for the analysis of the kinematic planning which culminates low-level control issued as a motor command to biomechanical plant. Some knowledge contribution of major body segments. Synthesis of human-like motion finds its application of biomechanical plant is also assumed to be encoded in CNS. Typically, CNS is modeled to serve only a limited func- both in simulation and physical settings: in computer tion. One possible model which is the most commonly used graphics, this leads to autonomously generating realistic motion for virtual characters. The intent is to direct these is the joint space control. It is possible to divide motion con- trol into the task generation phase and a motor execution virtual characters using high-level task for which low-level motion control is automatically generated. Similarly, the phase. This abstraction is more relevant to the design of engi- neered systems that augment physiological systems. Synthe- robotics community seeks a high-level control framework sis of human motion involves accurate reconstruction of for robotic systems [7]. In this study, we evaluate the clinical hypothesis that movement sequences, modeling of musculoskeletal kinemat- ics, dynamics, and actuation of segment joints [8]. Task- besides numerous other factors, CNS controls STS motion by tracking a prelearned head position trajectory. Motivated based methods used in robotics may be leveraged to provide novel musculoskeletal modeling methods and physiologically by the evidence [7] for a task-oriented encoding of motion accurate performance predictions. Advantage of robotic- by the CNS, we adopt a robotic approach for the synthesis of STS motion and propose this scheme as a solution to this based effort models frequently utilizes quantities that are derivable purely from skeleton kinematics and that are not hypothesis. We propose an analytical biomechanical human specific to muscle actuation. Since the evaluation of a system CNS modeling framework where the head position trajectory can be only as good as the model itself, the accuracy of the defines the high-level task control variable. We do extensive results primarily depends upon the quality of the human bio- motion and force capture experiments on human subjects to validate our analytical modeling scheme. To the best of mechanical model. Human STS is performed almost entirely in the sagittal plane [9]. Typically, human biomechanical our knowledge, this is the first study of STS motion and force models comprise a multilevel inverted pendulum, whose capture in the sagittal plane (2D). We used marker-based motion is governed by Euler-Lagrange equations. For motion optical motion capture system and force plate (1) to collect analysis and development of a control scheme, usually, an kinematic and kinetic data during this voluntary motion and (2) to realize a custom human biomechanical model in analytical model based on general physical parameters is realized. Such models and control schemes are extensively the sagittal plane as close as possible to real human beings. available in the literature on motion analysis [10, 11] and We first scale our biomechanical model to match the BSP the design of robotic devices [12]. To analyze a motion mech- values of the subjects. We do dynamic motion reconstruction anism more accurately, the simulated motion must be com- through the control of the simulated custom human CNS models to follow the captured head position trajectories in pared with actual human motion. For this purpose, custom biomechanical models are developed that are more useful real time. This study is the part of an ongoing study that is in the design and tuning of customizable motion assistance aimed at determining different variables involved in human Applied Bionics and Biomechanics 3 reference for respective models. Simulated motions STS motion. The previous work [11, 18, 19] comprised the analytical approach, and this work is based on experimental are analyzed and compared with experimental analysis of STS motion. This paper is organized as follows: motion. The materials and methods section should first, we provide the details of the analytical modeling frame- contain sufficient detail so that all procedures can work for STS motion synthesis followed by the experimental be repeated. It may be divided into headed subsec- setup and data collection of STS motion on human subjects. tions if several methods are described Next, we discuss the human biomechanical model scaling for custom human models. We simulate each subject’s STS 3. Analytical Modeling Framework motion and compare them with experimental results. Finally, 3.1. The General Human Biomechanical Model. A general we discuss the validity of the proposed design methodology four-link rigid body human model (as shown in Figure 2) is for its physiological relevance to the STS maneuver. used to simulate STS motion. The physiological parameters of the model (as shown in Table 1) have been borrowed from 2. Materials and Methods literature including our previous work [11, 18, 19, 21]. The model has three degrees of freedom (DoF). Four We design a biomechanical human CNS model (as shown in links include the foot, shank, thigh, and the upper body, Figure 1) to synthesize and control STS motion by tracking which we termed as a single link called Head-Arm-Trunk only head trajectory X as a reference and head position X (HAT). A triangular base of support represents the foot fixed as the only measurement. Since the reliability of the motion on the ground. Since the key movements of joints and limbs control is primarily linked with the accuracy of the human during STS take place in the sagittal plane only, we limit our biomechanical model, we compare experimental and simu- model to planar two-dimensional (2D) motions (in the Car- lated forces and fine-tune the model to reduce the error to tesian plane). All joints are revolute (hinge-like), and the a minimum. Hence, force measurement does not play a role model is an open-chain mechanism with three actuators at in motion synthesis or control, it is meant only for modeling each joint. θ , θ , and θ represent ankle, knee, and hip joint 1 2 3 scheme validation. positions, respectively. We refer the shank, thigh, and HAT as links l , l , and l , respectively. (X, Y) is the head position, 2.1. The Analytical Modeling Framework for STS Motion 1 2 3 and (x, y) is the hip position in Cartesian coordinates. ∅ is Synthesis. We develop an analytical human CNS modeling the head orientation in the World frame {W}. framework to generate STS motion. Our modeling scheme comprises the following steps: 3.2. Analytical Reference Trajectory. The model tracks a generalized head position trajectory, generated analytically (i) A general four-segment human biomechanical using an unforced state-space system borrowed from [22], model in the sagittal plane based on BSP from and modified accordingly. the literature [2, 9–11, 19–21] is realized in SimMechanics 3.3. CNS Modeling: STS Controller Design. The CNS is modeled as STS controller comprising two subsystems: an (ii) We analytically generate head trajectory [22] to be Estimator and a Cartesian controller. used as the reference 3.3.1. Estimator. The estimation of joint angles is based on (iii) We design the STS controller to emulate human the inverse kinematics of the human biomechanical model. CNS, capable of (a) estimating joint angles using inverse kinematics based on head position measure- (1) Forward Kinematics (FK) Analysis. Forward kinematics ments X and (b) generating joint actuation torque maps joint space (θ ) into Cartesian space (x, y, ϕ) [23], commands τ by Cartesian control based on head where ϕ is the orientation of a point in the Cartesian plane position error δX with respect to the World reference {W}. To determine the head position (X, Y), the set of kinematic equations is 2.2. Experimental Validation of Modeling Scheme given as (i) The physical parameter data collected from 7 sub- jects are converted into BSP using the weighing coef- X = l c + l c + l c , ð1Þ 1 1 2 12 3 123 ficient method of anthropometry. BSP values are used to realize custom/subject-specific human bio- ð2Þ Y = l s + l s + l s , 1 1 2 12 3 123 mechanical models where c stands for cos ðθ Þ, c for cos ðθ + θ Þ, s for 1 1 12 1 2 1 (ii) We capture motion and force data of STS maneuver sin ðθ Þ, and so on. Also, from subjects using multiple infrared cameras and passive reflective markers. We extract custom head ϕ = θ + θ + θ , ð3Þ 1 2 3 trajectories from motion data and torques and ground reaction forces (GRFs) from force data where ϕ is the orientation of HAT (or head) with respect (iii) STS motion is reconstructed for each custom human to the x-axis. The generalized coordinate is a compact CNS model. Custom head trajectories are used as the notation p = ½X, Y, ϕ. 4 Applied Bionics and Biomechanics Demographic BSP data Anthropometric data conversion STS control Ref head position, Marker Inv Kin/ Human Kinematic Error velocity, orientation data Cartesian biomechanical variable 𝜒 − 𝜒 𝜏 controller model conversion d Measured head pos, velocity, orientation Measured Forceplate Experimental Kinetic data torque/GRF torque/GRF variable + − conversion Error Figure 1: Workflow of STS motion control scheme. 3.3.2. Cartesian Control. Cartesian control refers to the posi- tion control of the head, following a required trajectory in Cartesian space. XY (1) Equation of Motion. Dynamic equation of motion of the human biomechanical model in joint space is given by € _ τ = MðÞ θ θ + V θ, θ + GðÞ θ , ð9Þ {H} xy {T} € _ where θ, θ , and θ are n ×1 joint angular acceleration, veloc- I ity, and position vectors, respectively. MðθÞ is the n × n iner- {S} 𝜃 _ 1 tia matrix of the model, Vðθ, θÞ is n ×1 vector of centrifugal {F} x-axis {W} and Coriolis terms, GðθÞ is the n ×1 vector of gravity terms, and τ is the n ×1 torque vector. Modifying dynamic equation Figure 2: Three DoF biomechanical human model defined in the from joint space to Cartesian space [23], body frame. {S}, {T}, and {H} represent shank, thigh, and HAT frames for segments l , l , and l . 1 2 3 € _ F = M ðÞ θ X + V θ, θ + G ðÞ θ , ð10Þ x x x (2) Inverse Kinematics (IK) Analysis. To estimate joint angles, the IK problem is solved. First, p is used to find a unique hip where F is the appropriate force-torque vector, and X is the position (x, y) to reduce the problem at hand from four to position and orientation of the head in Cartesian space. M three links. To find hip position (x, y), hip joint angle con- ðθÞ is the mass-inertia matrix in Cartesian space and so on. straint, i.e., 0 ≤ θ ≤ π is imposed. The solution then sim- plifies A trajectory conversion process thus required x = X + l c π − ϕ , ð4Þ ðÞ θ = inv kin X , ð11Þ ðÞ d d y = Y − l s π − ϕ : ð5Þ ðÞ where X is the desired head position trajectory in Cartesian Using algebraic manipulation, the three joint angles space, and θ is the vector of corresponding joint angles. The inferred from head position are inv kin operator refers to the inverse kinematic procedure used for the inference of joint angles from the position of ð6Þ θ = atan2ðÞ s , c , 2 2 2 the end effector. θ = atan2 y, x − atan2 l s , l + l c , ð7Þ ðÞ ðÞ ðÞ 1 2 2 1 2 2 (2) Transpose Jacobian Control. In this scheme, measured position X is compared to desired position X to form an θ + θ + θ = atan2ðÞ sϕ, cϕ = ϕ, ð8Þ 1 2 3 error δX in Cartesian space. The error vector is then applied where atan2 is the MATLAB command for four-quadrant to control law to compute the Cartesian force vector, F, -1 tan with arguments in bracket representing vertical and which is that fictitious force if applied at the head will tend horizontal components of the position vector. to reduce Cartesian error. The Cartesian force vector is then y-axis Applied Bionics and Biomechanics 5 Table 1: BSP data for analytical biomechanical model [9]. Segment Mass (kg) Length/height (m) Center of gravity (m) Moment of inertia (kg m ) Foots 1.91 0.07 —— Shanks 6.14 0.43 0.25 0.11 General human model Thighs 13.20 0.43 0.25 0.26 HAT 44.75 0.83 0.31 7.53 mapped into joint torque vector τ using transpose Jacobian (5) Static Forces in the Human Model. Forces and moments conversion. propagate from segment to segment. Torques on joints must be applied to keep the system in static equilibrium. Jacobian (3) The Velocity of the Head. Description of angular velocity (J) in force domain maps force on the head into torques ω of link i +1 with respect to respective frame is given by on joints: iþ1 iþ1 i iþ1 _ ̂ ω = R ω + θ Z , ð12Þ i+1 i i i+1 i+1 T τ = J F, ð19Þ where i = 0,1,2 refers to link number, R is a rotation matrix, and Z is the axis of joint rotation. The linear velocity v is where F is the Cartesian force required to act on the given by head. iþ1 iþ1 i i i v = R v + ω × P , ð13Þ i+1 i i i i+1 (6) Cartesian Control Law Design. The control scheme is based upon the hypothesis that the feedback of head position where P is the head position vector. For the model shown in X to CNS, i.e., the STS controller, plays a role in carrying out Figure 3, the angular and linear velocity components of the STS motion. As shown in Figure 4, using the measured posi- head in three axes are given, respectively: tion of head X and comparing with desired/reference head 2 3 trajectories X , the CNS generates error signal δX. From 6 7 head position measurements, the Estimator part of CNS 3 2 6 7 − ω = ω = , ð14Þ 3 2 infers joint positions (θ ), required to reduce the error δX. 4 5 Similarly, the head position errors fed back to CNS generate _ _ θ + θ 1 2 torque command to the joints using the Cartesian control 2 3 law. Since Cartesian control is usually implemented in force l s θ 1 2 1 domain, the controller generates a force command F. Then, 6 7 6 7 the transpose Jacobian converts force command F into tor- _ _ _ v = : ð15Þ 6 l c θ + l θ + θ 7 1 2 1 2 1 2 4 5 que command τ, for joint actuation. 4. Validation Framework To find these velocities with respect to fixed foot-shank The human CNS modeling scheme to synthesize STS motion frame {F}, using the rotation matrix is designed in a purely analytical framework. To validate our 2 3 modeling framework and the hypothesis that CNS control of c −s 0 12 12 STS motion has a dependence on head position trajectory 6 7 0 0 1 2 6 7 R = R R R = , ð16Þ s c 0 feedback, we must check the model for its ability to replicate 3 3 2 3 12 12 4 5 experimental STS motion using custom/subject-specific 00 1 models. A comparison of simulations and experimental find- ings will be the basis of the validity of our control framework. 2 3 _ _ _ −l s θ − l s θ + θ The second phase of our study starts from scaling our analyt- 1 1 1 2 12 1 2 6 7 ical human model to custom models. 6 7 0 3 6 7 v = R v = : ð17Þ _ _ 3 3 3 6 l c θ + l c θ + θ 7 1 1 1 2 12 1 2 4 5 4.1. Subject’s Physical Parameters and Anthropometric Conversion. Experimental data of sit-to-stand transfer were collected at Biomechanics Lab of Riphah International (4) The Jacobian. Jacobian is a nonlinear time-varying matrix University. Seven healthy subjects (five males and two that relates joint angular velocities to linear head velocity: females, age: 22 ± 0:81 years, mass: 72:58 ± 11:61 kg, height: 1:70 ± 0:04 m) were selected for data collection of sit-to- "#"# stand motion. The subjects had no history of movement −l s − l s −l s 1 1 2 12 2 12 0 0 v = JðÞ θ θ = : ð18Þ disorder. They provided their informed consent under the l c + l c l c 1 1 2 12 2 12 2 Ethics Committee of Riphah International University. 6 Applied Bionics and Biomechanics (a) (b) (c) (d) Figure 3: Motion capture equipment by OptiTrack. (a) Calibration square, (b) calibration wand, (c) infrared camera, and (d) the Pasco force plate for force data capture. Reference Measured Torque head Force head Human Error command trajectories command trajectories Cartesian Transpose biomechanical 𝜒 controller Jacobian 𝜏 𝜒 𝜒 −𝛿 󰫭 model Joint angles Inverse kinematics Figure 4: STS control scheme to emulate CNS. Subjects’ physical parameter data (as shown in Table 2) able. Of these, marker-based motion capture is termed as are used to calculate BSP. An extensive literature is available one of the most accurate methods. To measure ground reac- tion forces, a force platform has been used. These methods on methods of anthropometric conversion. Among the vari- ous methods available in the literature [13], we have used the are extensively used in literature for modeling and analysis method of weighing coefficient [14], which is widely accepted of biomechanical motion mechanisms [2, 15, 16]. For a among the research community. For brevity, only one repre- detailed description of our experimental work, refer to [21] sentative data out of a total of 7 subjects is presented in 4.2.1. Experiment Protocol. Subjects completed the STS task Table 3. using an armless chair 49 cm from the force-plate (as shown 4.2. STS Motion and Force Capture. Reference [24] provides in Figure 5). To collect the data in the sagittal plane, three comprehensive coverage of motion capture methods avail- spherical reflective markers on the left side of each segment, Applied Bionics and Biomechanics 7 system. Before each trial, we checked the force plate for Table 2: Subjects’ physical parameter data. zero error. Subject ID Gender Age (year) Mass (kg) Height (m) 4.2.3. Data Collection and Analysis Tools. Each marker was 1 Male 21 76.55 1.69 manually numbered in the captured data file. Markers were 2 Male 22 79.81 1.70 then grouped into segments. Segment labels, too, were 3 Male 21 50.05 1.69 assigned manually in Motive Edit mode for each trial. Motive 4 Female 22 66.56 1.61 2.0.1 generates motion capture data in .tak and .c3d file 5 Female 22 84.91 1.67 formats. For data analysis, we have used a motion capture 6 Male 23 71.05 1.72 software MoCap, a freely available motion data analysis tool- 7 Male 23 79.10 1.78 box that works seamlessly with Matlab. Force-plate data is collected from four-beam setup, that provides vertical and horizontal forces generated under both feet during STS. Force data is recoded in .cap format and exported into excel Table 3: BSP data based on the subject’s physical parameters. .csv format for analysis. Center of Moment of Subj Mass Length/height 4.2.4. MoCap Data Analysis. Motion data in .c3d format was Segment gravity inertia ID (kg) (m) imported into the Matlab MoCap toolbox for analysis. (m) (kg m ) Marker positions were converted into joint positions. Then, Foots 2.22 0.066 —— angular positions of each joint in every frame were calcu- Shanks 7.11 0.419 0.237 0.114 lated. Similarly, the head position trajectory was constructed Thighs 15.31 0.417 0.236 0.278 using a marker on the head. Marker data and joint data were used to animate the STS transfer of the subjects. HAT 51.90 0.801 0.299 8.199 Data on subject # 5 were corrupted and hence were rejected. Figure 6(a) shows the ensemble average of head i.e., foot, shank, thigh, and trunk, were attached. Since position; Figure 6(b) shows the ensemble average of ankle, markers pose problems in the segment and joint position knee, and hip joint trajectories; and Figure 6(c) shows the assessment due to skin or loose garment artifacts, a set of ensemble average of ground reaction force of all six subjects. markers on each segment were applied using rigid rulers. Standard deviation curves in dashed lines show the magni- One marker was attached on top of the head using a hair- tude of intrasubject variation. band. Motion capture was done using four infrared Flex 3 4.3. STS Motion Control for Custom Models. We reconstruct cameras by OptiTrack. The data were recorded at 100 Hz STS motion using a custom human analytical STS controller using the OptiTrack Motive 2.0.1 software. Force data were framework. Subject-specific head position trajectories extracted recorded at the same time using a 2-axis 4-beam Pasco force from motion capture data are used as the reference. plate. The force data were captured at 100 Hz using the Capstone software. Each subject completed multiple STS trials. All trials were 5. Results done at once. Each trial began with the subject seated in the 5.1. Simulations. The ensemble averages of all motion and chair, arms crossed across the chest. The trial started with a force data obtained from the experiment and simulations verbal command of “stand,” and then, data were recorded are calculated and compared. The plots of the kinematic for approximately 4 sec. After this, the subject was again variable are shown in Figure 7 through Figure 8. asked to be seated and then trial repeated. 6. Discussion 4.2.2. Equipment and Calibration. To the best of our knowl- edge, this is the first study of STS motion capture in the sag- In this study, we propose a modeling and motion control ittal plane (2D); hence, there are no definite rules available in solution to evaluate the clinical hypothesis that besides the literature about the appropriate positions and number of numerous other factors, CNS controls the STS motion by markers placed on the body segments. Neither any research tracking a prelearned head position trajectory. CNS com- suggests an optimum number of cameras for reliable motion pares this anticipated head motion pattern with actual head capture. Literature, in general, is about 3D motion capture position measured by vestibular, proprioception, and vision [24, 25]. We have, therefore, opted for a multiple-camera sys- senses. Based on the head position error, CNS generates tor- tem, along with spherical markers to ensure better visibility que commands for joints actuation so that a smooth STS and reliable data reconstruction by the system. Cameras were motion may result. Motivated by the evidence for a task- arranged such that complete coverage of motion area could oriented encoding of motion by the CNS [7], we present a be ensured. Camera calibration using “calibration wand” human CNS modeling scheme to synthesize and control and determination of frame of reference for motion capture STS motion using an analytically generated head position area using “calibration square” were done before motion cap- trajectory in a high-level task control framework. ture started. We used the 2-axis Pasco force platform for First, we realize a 4-segment 3-DoF analytical human force data capture at 100 Hz in the Capstone data acquisition biomechanical model based on anatomical proportions [9] 8 Applied Bionics and Biomechanics (a) (b) Figure 5: STS data capture setup: (a) a subject with markers affixed on segments. The feet are placed on force-plate, (b) motion capture view and cameras in Motive 2.0.1 environment. Average head position trajectories from experiment Average joint trajectories from experiment 0 0.2 2 1.5 −0.1 1.5 −0.2 −0.2 −0.4 0.5 −0.6 −0.3 −0.8 0.5 −0.4 −1 −0.5 −1.2 −0.5 0 020 40 60 80 100 0 50 100 0 50 100 0 50 100 % STS cycle % STS cycle (a) (b) Average vertical GRF from experiment 0 10 20 30 40 50 60 70 80 90 100 % STS cycle (c) Figure 6: Ensemble average trajectories of (a) head position, (b) joint angles, and (c) GRF using motion capture. Curves in dashed lines represent ±1 standard deviation (SD). Head positions X(m), Y(m) Body weight (N) Ankle (rad) Knee (rad) Hip (rad) Applied Bionics and Biomechanics 9 Errors between experimental and simulated head position of the STS cycle of all trials, and both the motion and force 0.25 data were trimmed and normalized for % STS cycle. The marker data were then converted into six joints data (as 0.2 shown in Figure 9(b)) which closely resembles the analytical model depicted in Figure 2. Experimentally generated head 0.15 position trajectories in Figure 6(a) closely resemble the ana- lytically generated general head position trajectory in 0.1 Figure 10. The motion was then reconstructed in control and simulation framework by tracking the head marker 0.05 trajectories in real time. Figure 7 gives a comparison of experimental and simu- lated head position trajectories in horizontal (X) and vertical (Y) directions. The Cartesian control part of the STS control- −0.05 ler provides appropriate joint torques to minimize head posi- −0.1 tion error δX. The RMS error for X =0:0118 m and for 0 20 40 60 80 100 Y =0:0315 m. This shows very good tracking of reference % STS cycle input X by the STS controller. Experimental, estimated, and simulated joint angles are plotted in Figure 11. Estimated Horizontal position error and simulated joint angles are compared with experimental Vertical position error joint angles. RMS error for ankle = 0:55 rad (estimation), 0.54 rad (simulation), for knee = 0:93 rad (both), and for Figure 7: Comparison of ensemble average head position trajectories hip = 0:59 rad (both). The joint angle errors are relatively from motion capture experiments and simulations. RMS error for horizontal position X =0:0118 m and for Y =0:0315 m. high and attribute to the use of the same controller for a vari- ety of custom human models and head position trajectories that exhibit relatively large intrasubject variations. The joint in the sagittal plane. We realize the CNS model as an STS angle error can be reduced significantly if (1) the controller is tuned for each custom model and (2) simulation is run controller having two subsystems: an Estimator to automati- cally plan joint level motions and the Cartesian controller to with subject-specific initial conditions. Another reason for larger joint angle errors attributes to the fact that the STS generate appropriate joint torque commands to reduce head control strategy is based on head position tracking, and there position error. Our previous work [11, 18, 19] and some work from the are no joint position reference inputs and measurements being used. This is evident from small errors between exper- literature [2, 9, 10, 20, 26] were based on the same analytical imental and simulated head position trajectories; Figure 8 human model (realized in mathematical or simulation plots head orientation curves ∅, measured from experiments frameworks) using different combinations of measurements, and simulations. A small RMS error of 0.0442 rad for head feedbacks and controllers. We did the analytical design in the orientation shows good estimation and tracking of the head first phase to relate and compare our current study with the trajectory by the controller. Kinetic variables are plotted previous work. Using a well-defined human model and sim- and analyzed next. Figure 12 shows how the force F exerted ulation results from previous studies helped us design and by the bodyweight during STS changes. At the start of the fine-tune the STS controller that could produce comparable STS cycle, the initial force of 200 N shows the average weight results. As a standard procedure [8, 15, 16, 21], we validated of the two feet, shanks, and partially of thighs, while seated. our modeling and control scheme framework with laboratory With seat off, the weight on the force plate increases and so data as well. does the vertical component of the ground reaction force. Physical parameters data of the 7 subjects (as shown in The GRF measured from simulations is plotted as F . The Table 2) are converted into BSP values using the weighing coefficient method of anthropometry. BSP values in Table 3 two forces match closely (RMS error 50.26 N only) and settle are used to scale customhuman models to match the anthro- to the final value of the subject’s average weight. Support pometry of the subjects. moment M is the sum of ankle, knee, and hip joint torques. We capture experimental kinematic data of STS motion Ground reaction torque is a function of ankle joint torque in the sagittal plane using four OptiTrack Flex-3 cameras [26]. We have found that a relatively high correlation (0.72) and thirteen spherical reflective markers on four segments exists between ground reaction moment M and the support of each subject. Kinetic data were collected at the same time moment M as can be seen in Figure 13. The low RMS errors using the Pasco force-platform underneath both feet of the between experimental and simulated measurements validate subjects. The marker data were recorded in the OptiTrack our modeling framework. Figure 14(a) depicts snaps from Motive environment and then imported and analyzed using the animation of experimental STS. Figure 14(b) shows STS MoCap and MATLAB. The motion was reconstructed from motion phases from simulation, based on the customized marker data (as shown in Figure 9(a)). The animated motion human model in SimMechanics. The close resemblance helped check the data for missing markers and frames. between the animation of experimental data and simulation The missing data were reconstructed using interpolation. shows the good quality of STS motion control which attri- The animation also helped determine the start and end butes to (1) robust design of analytically developed STS Error (m) 10 Applied Bionics and Biomechanics Average head orientation from experiment and simulations 1.6 1.55 1.5 1.45 1.4 1.35 1.3 1.25 0 10 20 30 40 50 60 70 80 90 100 % STS cycle exp sim Figure 8: RMS error =0:0442 rad for head orientation ∅, obtained from the average of experimental data and measurements from simulations. (a) (b) Figure 9: STS transfer phases with motion trajectories from animation based on (a) marker data and (b) joint data. controller to model CNS, (2) reliability of experimental data General head position trajectories capture techniques employed, and (3) low error factor in BSP conversion from weighing coefficient method to obtain customized human biomechanical models. 1.5 6.1. Assumption and Limitations. The subjects’ physical parameters were converted into complete set of BSP using weighing coefficient method, which is a mathematical 0.5 method of anthropometery. Despite the risk of high error in estimation [13], this method is widely accepted in research community due to its convenience as compared to other methods that need special equipment for body segment mea- −0.5 surements. The estimation error, however, leads to modeling 020 40 60 80 100 error that becomes a source of mismatch in experimental and % STS cycle simulation results. Moreover, there is a lack of protocols for motion capture in 2D. We devised a set of protocols for this gen experiment which we kept modifying until a satisfactory level gen of reliable results was achieved. There were some limitations Figure 10: Analytically generated general head position trajectory. associated with experimental equipment as well: (1) we did General head position (m) Head orientation (rad) Applied Bionics and Biomechanics 11 Average joint angles from experiment, simulation and estimation 0.2 0.5 1.6 1.4 0 0 1.2 −0.2 −0.5 0.8 −0.4 −1 0.6 −0.6 −1.5 0.4 0.2 −0.8 −2 −1 −2.5 −0.2 −1.2 −0.4 −3 0 50 100 0 50 100 0 50 100 % STS cycle Experiment Simulation Estimation Figure 11: Comparison of average experimental joint trajectories with estimated and simulated trajectories. RMS error for ankle = 0:55 rad (estimation), 0.54 rad (simulation), for knee = 0:93 rad (both), and for hip = 0:59 rad (both). Average GRF from experiment and simulations Another assumption was made by using same motion con- troller for all subject specific human biomechanical models. 700 Further improvement in work could be made if controller were tuned separately for each scaled model. For this study, our modeling scheme was based on rigid body segments; such assumption leads to modeling error of the systems like human body that are not exactly rigid. 7. Conclusion A modeling framework to evaluate the role of head position trajectory in physiologically relevant STS motion control by the CNS is presented. A robotic approach for the synthesis 0 204060 80 100 of STS motion using task-level control is utilized. We % STS cycle mapped a scaled dynamic human model to the human sub- F jects’ anthropometric values and simulated STS motion by F tracking head position trajectories in real time. The study contributes to the knowledge base by proposing a system that Figure 12: Average ground reaction force curve F , measured by (1) synthesizes human motion using a high-level task control the force platform, showing the trajectory of body weight variation framework, for which low-level motion control is automati- during STS by the subjects. F shows the same variable measured cally generated and (2) validates a 2D biomechanical model- during subject-specific simulations. The RMS error between the ing scheme based on the weighing coefficient method for two curves = 50:26 N. inference of Body Segment Parameter (BSP). The modeling scheme is validated using kinematic and kinetic analyses of simulated and captured motion and force data of real sub- not have specialized skin tight garments for subjects. Since jects. The analytically designed STS controller is robust markers pose problems in the segment and joint position enough to simulate real subjects’ STS motion. Low errors assessment due to skin or loose garment artifacts, a set of between experimental and simulated motions not only prove markers on each segment were applied using rigid rulers. the validity of the modeling framework but support the (2) The motion capture equipment and force plate were not clinical hypothesis that there exists a role of head position synchronized electronically; the two variables were visually measurement feedback to CNS in controlling a smooth analyzed from captured data for time synchronization. STS motion. Body weight (N) Ankle (rad) Knee (rad) Hip (rad) 12 Applied Bionics and Biomechanics Average ground reaction torque and support moment −500 −1000 −1500 −2000 −2500 0 10 20 30 40 50 607080 90 100 % STS cycle Figure 13: Average ground reaction torque M and support moment M (sum of joint torques). The two variables’ correlation = 0:72. z s (a) (b) Figure 14: (a) Phases of STS from motion capture. Trajectories of joints are also shown. (b) Simulated STS motion in SimMechanics environment. Torque (Nm) Applied Bionics and Biomechanics 13 in Basic In the future, we want to modify the human biomechan- & Clinical Pharmacology & Toxicology. Vol. 124, Wiley, NJ USA, 2019. ical modeling scheme from rigid body kinematics to account for elastic body links to better match subject-specific anthro- [12] M. Geravand, P. Z. Korondi, C. Werner, K. Hauer, and A. Peer, “Human sit-to-stand transfer modeling towards intuitive and pometry. Our hypothesis and findings can be further gener- biologically-inspired robot assistance,” Autonomous Robots, alized to all kinds of human motion syntheses like walking vol. 41, no. 3, pp. 575–592, 2017. and stair climbing. [13] R. Riemer and E. T. Hsiao-Wecksler, “Improving net joint tor- que calculations through a two-step optimization method for Data Availability estimating body segment parameters,” Journal of biomechani- cal engineering, vol. 131, no. 1, article 011007, 2009. Readers can request the corresponding author for motion [14] D. A. Winter, Biomechanics and motor control of human and force capture datasets. movement, John Wiley & Sons, 4th edition, 2009. [15] E. J. Caruthers, J. A. Thompson, A. M. W. Chaudhari et al., Conflicts of Interest “Muscle forces and their contributions to vertical and horizon- tal acceleration of the center of mass during sit-to-stand trans- The authors declare that there is no conflict of interest fer in young, healthy adults,” Journal of Applied Biomechanics, regarding the publication of this paper. vol. 32, no. 5, pp. 487–503, 2016. [16] M. K. Cullen, Muscle-driven simulations of sit to stand transfer in persons with severe osteoarthritis, [M.S. thesis], The Ohio Acknowledgments State University, 2015. APC is may be covered by Bahria University. [17] I. Y. Campos Padilla, Biomechanical analysis of the sit-to-stand transition [Ph.D. thesis], School of Mechanical, Aerospace and Civil Engineering, 2016. References [18] S. Rafique, M. Najam-l-Islam, and A. Mahmood, “Synthesis of sit-to-stand movement using SimMechanics,” Proceedings of [1] R. C. van Lummel, Assessing sit-to-stand for clinical use, 2017, the 1st International Conference on Smart Innovation, Ergo- https://research.vu.nl/en/publications/assessing-sit-to-stand- nomics and Applied Human Factors (SEAHF). SEAHF 2019. for-clinical-use. Smart Innovation, Systems and Technologies, vol 150, C. Bena- [2] A. M. Mughal and K. Iqbal, “Optimization of biomechanical vente-Peces, S. Slama, and B. Zafar, Eds., Springer, Cham, STS movement with linear matrix inequalities,” International Journal of Mechatronics Systems and Control, vol. 47, no. 1, [19] S. Rafique, A. Mahmood, and M. Najam-ul-Islam, “Robust pp. 1–11, 2019. control of physiologically relevant sit-to-stand motion using [3] R. Chiba, K. Takakusaki, J. Ota, A. Yozu, and N. Haga, reduced order measurements,” Proceedings of the Future Tech- “Human upright posture control models based on multisen- nologies Conference (FTC) 2018. FTC 2018. Advances in Intel- sory inputs; in fast and slow dynamics,” Neuroscience research, ligent Systems and Computing, vol 881, K. Arai, R. Bhatia, vol. 104, pp. 96–104, 2016. and S. Kapoor, Eds., Springer, Cham, 2019. [4] P. A. Forbes, G. P. Siegmund, A. C. Schouten, and J.-S Ã.©b. [20] M. Mughal and K. Iqbal, “A fuzzy biomechanical model for Blouin, “Task, muscle and frequency-dependent vestibular H suboptimal control of sit-to-stand movement,” Interna- control of posture,” Frontiers in Integrative Neuroscience, ∞ tional IASTED Conference on Intelligent Systems and Control, vol. 8, p. 94, 2015. [5] A. Siriphorn, D. Chamonchant, and S. Boonyong, “The effects [21] S. Rafique, M. N. Islam, M. Shafique et al., “Position driven sit- of vision on sit-to-stand movement,” Journal of Physical Ther- to-stand simulation using human body motion and force apy Science, vol. 27, no. 1, pp. 83–86, 2015. capture,” in 2019 22nd International Multitopic Conference [6] J. P. Scholz, D. Reisman, and G. Schöner, “Effects of varying (INMIC), Islamabad, Pakistan, Pakistan, November 2019. task constraints on solutions to joint coordination in a sit-to- [22] A. M. Mughal and K. Iqbal, “Synthesis of angular profiles for stand task,” Experimental Brain Research, vol. 141, no. 4, bipedal sit-to-stand movement,” in 2008 40th Southeastern pp. 485–500, 2001. Symposium on System Theory (SSST), pp. 293–297, New [7] D. S. Vincent and R. Chen, “A task-level biomechanical frame- Orleans, LA, USA, 2008 March. work for motion analysis and control synthesis,” Human Mus- [23] J. J. Craig, Introduction to robotics: mechanics and control,vol. 3, culoskeletal Biomechanics, 2012. Pearson/Prentice Hall, Upper Saddle River, NJ, USA, 2005. [8] O. Khatib, E. Demircan, V. de Sapio, L. Sentis, T. Besier, and S. Delp, “Robotics-based synthesis of human motion,” Journal [24] E. van der Kruk and M. M. Reijne, “Accuracy of human of Physiology-Paris, vol. 103, no. 3-5, pp. 211–219, 2009. motion capture systems for sport applications; state-of-the- art review,” European Journal of Sport Science, vol. 18, no. 6, [9] K. Iqbal and Y. C. Pai, “Predicted region of stability for balance pp. 806–819, 2018. recovery: motion at the knee joint can improve termination of forward movement,” Journal of Biomechanics, vol. 33, no. 12, [25] A. Bilesan, M. Owlia, S. Behzadipour et al., “Marker-based pp. 1619–1627, 2000. motion tracking using Microsoft Kinect,” IFAC-PapersOnLine, [10] M. A. Mahmood and K. Iqbal, “Physiological LQR design for vol. 51, no. 22, pp. 399–404, 2018. postural control coordination of sit-to-stand movement,” [26] A. M. Mughal and K. Iqbal, “Experimental analysis of kinetic Cognitive Computation, vol. 4, no. 4, pp. 549–562, 2012. variables for biomechanical sit to stand movement,” in 34th [11] S. Rafique, M. Najam-ul-Islam, and A. Mahmood, “Sit-to- Annual Meeting of American Society of Biomechanics, Provi- stand motion control using head position feedback to CNS,” dence, RI USA, 2010. http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Applied Bionics and Biomechanics Hindawi Publishing Corporation

Cartesian Control of Sit-to-Stand Motion Using Head Position Feedback

Loading next page...
 
/lp/hindawi-publishing-corporation/cartesian-control-of-sit-to-stand-motion-using-head-position-feedback-JpDOWJYQW0

References

References for this paper are not available at this time. We will be adding them shortly, thank you for your patience.

Publisher
Hindawi Publishing Corporation
Copyright
Copyright © 2020 Samina Rafique et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
ISSN
1176-2322
eISSN
1754-2103
DOI
10.1155/2020/1979342
Publisher site
See Article on Publisher Site

Abstract

Hindawi Applied Bionics and Biomechanics Volume 2020, Article ID 1979342, 13 pages https://doi.org/10.1155/2020/1979342 Research Article Cartesian Control of Sit-to-Stand Motion Using Head Position Feedback 1 1 2 1 Samina Rafique , M. Najam-ul-Islam, M. Shafique, and A. Mahmood Electrical Engineering Department, Bahria University, Islamabad 44230, Pakistan Biomedical Engineering Department, Riphah International University, Islamabad 44230, Pakistan Correspondence should be addressed to Samina Rafique; samina.rafique@bui.edu.pk Received 6 February 2020; Revised 17 July 2020; Accepted 12 August 2020; Published 21 August 2020 Academic Editor: Raimondo Penta Copyright © 2020 Samina Rafique et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Sit-to-stand (STS) motion is an indicator of an individual’s physical independence and well-being. Determination of various variables that contribute to the execution and control of STS motion is an active area of research. In this study, we evaluate the clinical hypothesis that besides numerous other factors, the central nervous system (CNS) controls STS motion by tracking a prelearned head position trajectory. Motivated by the evidence for a task-oriented encoding of motion by the CNS, we adopt a robotic approach for the synthesis of STS motion and propose this scheme as a solution to this hypothesis. We propose an analytical biomechanical human CNS modeling framework where the head position trajectory defines the high-level task control variable. The motion control is divided into low-level task generation and motor execution phases. We model CNS as STS controller and its Estimator subsystem plans joint trajectories to perform the low-level task. The motor execution is done through the Cartesian controller subsystem that generates torque commands to the joints. We do extensive motion and force capture experiments on human subjects to validate our analytical modeling scheme. We first scale our biomechanical model to match the anthropometry of the subjects. We do dynamic motion reconstruction through the control of simulated custom human CNS models to follow the captured head position trajectories in real time. We perform kinematic and kinetic analyses and comparison of experimental and simulated motions. For head position trajectories, root mean square (RMS) errors are 0.0118 m in horizontal and 0.0315 m in vertical directions. Errors in angle estimates are 0.55 rad, 0.93 rad, 0.59 rad, and 0.0442 rad for ankle, knee, hip, and head orientation, respectively. RMS error of ground reaction force (GRF) is 50.26 N, and the correlation between ground reaction torque and the support moment is 0.72. Low errors in our results validate (1) the reliability of motion/force capture methods and anthropometric technique for customization of human models and (2) high-level task control framework and human CNS modeling as a solution to the hypothesis. Accurate modeling and detailed understanding of human motion can have significant scope in the fields of rehabilitation, humanoid robotics, and virtual characters’ motion planning based on high-level task control schemes. 1. Introduction tion, Centre of Mass (CoM), Centre of Gravity (CoG), and Center of Pressure (CoP) and kinetic variables like ground reaction forces (GRF), joint torques, and ground reaction Sit-to-stand (STS) movement is a skill that helps determine the functional level of a person. The ability to rise from sitting torques play an important role as feedback elements in STS to standing is critical to a person’s quality of life, as it is linked motion control [2]. with the functional independence of an individual. Studies on Of all sensory inputs, head position and orientation too the hierarchy of disability indicate that problem in STS starts are the area of researchers’ interest. There is ample clinical evidence that head position feedback to CNS plays a role in at a later stage than problems in walking commence [1]. Bio- mechanical literature is replete with studies aimed at deter- the control of human motion and maintenance of balance. mining various variables that contribute to STS motion. The study [3] shows that human motion control and mainte- Kinematic variables like joint positions, velocities, accelera- nance of balance by CNS rely on inputs from vision, 2 Applied Bionics and Biomechanics and rehabilitation devices. Custom human biomechanical proprioception, tactile/somatosensory, and vestibular sys- tems. The multisensory integration, combined with motion models are based upon Body Segment Parameter (BSP) control, undergoes both quick and slow alterations which values. Reimer et al. [13] have given an overview of methods available for the estimation of BSP. Weighing coefficient- are termed as fast and slow dynamics in CNS, respectively. For any voluntary motion, CNS anticipates set patterns of based methods are convenient but the error in results can inputs from multisensory systems. Vestibular sense, in con- be up to 40%. Geometric approaches are good (error less than junction with neck proprioception, estimates body orienta- 5%), but tedious as the number of body part measurements tion. The vestibular system senses linear and angular head can go even higher than 240. Medical imaging is also accurate (error < 5%) but needs expensive equipment in addition to motion, and the CNS uses this information for posture and gaze control [4]. A study in [5] suggested that visual percep- dangerous exposure to radiation. Among all these methods, tion played a role in balance control during STS. The role of marker-based motion capture system was reported as the head position feedback to CNS in smooth execution of STS is accurate one, despite its limitations in terms of cost, the need also studied in [6], and the dependence of the STS movement for a controlled environment, high sensitivity to noise, line of sight capture, etc. on the Centre of Mass (CoM) and head positions during the task is analyzed. To validate the modeling technique, the simulated To evaluate clinical hypotheses, computer simulations act motion profiles are compared with experimental results. In as a powerful tool. Human motion can be synthesized and [12], the proposed 6-link human model was checked for its analyzed in a simulation environment. Like all other accuracy using references from experimental data. The rela- tion of two kinetic variables, GRF and reaction moments, motions, the behavioral richness exhibited in natural human STS transfer results from a complex interplay of biomechan- was recorded from subjects and compared with the same ical and neurological factors [7]. An adequate understanding forces resulting in simulations [14]. The regression plots of of these factors is a prerequisite to understanding the overall two variables endorsed similarity between them during the mechanism of human STS motion as well as providing a gait cycle. The validation of the modeling scheme through the experimental result is also done in [8, 15, 16]. In [17], means for its synthesis. In a broader sense, basic constituents of the human motor system include biomechanical plants the researcher collected data of STS motion using infrared and CNS. Based on some specified task, CNS performs motor cameras and force plates and applied the data to a multiseg- ment biomechanical model for the analysis of the kinematic planning which culminates low-level control issued as a motor command to biomechanical plant. Some knowledge contribution of major body segments. Synthesis of human-like motion finds its application of biomechanical plant is also assumed to be encoded in CNS. Typically, CNS is modeled to serve only a limited func- both in simulation and physical settings: in computer tion. One possible model which is the most commonly used graphics, this leads to autonomously generating realistic motion for virtual characters. The intent is to direct these is the joint space control. It is possible to divide motion con- trol into the task generation phase and a motor execution virtual characters using high-level task for which low-level motion control is automatically generated. Similarly, the phase. This abstraction is more relevant to the design of engi- neered systems that augment physiological systems. Synthe- robotics community seeks a high-level control framework sis of human motion involves accurate reconstruction of for robotic systems [7]. In this study, we evaluate the clinical hypothesis that movement sequences, modeling of musculoskeletal kinemat- ics, dynamics, and actuation of segment joints [8]. Task- besides numerous other factors, CNS controls STS motion by tracking a prelearned head position trajectory. Motivated based methods used in robotics may be leveraged to provide novel musculoskeletal modeling methods and physiologically by the evidence [7] for a task-oriented encoding of motion accurate performance predictions. Advantage of robotic- by the CNS, we adopt a robotic approach for the synthesis of STS motion and propose this scheme as a solution to this based effort models frequently utilizes quantities that are derivable purely from skeleton kinematics and that are not hypothesis. We propose an analytical biomechanical human specific to muscle actuation. Since the evaluation of a system CNS modeling framework where the head position trajectory can be only as good as the model itself, the accuracy of the defines the high-level task control variable. We do extensive results primarily depends upon the quality of the human bio- motion and force capture experiments on human subjects to validate our analytical modeling scheme. To the best of mechanical model. Human STS is performed almost entirely in the sagittal plane [9]. Typically, human biomechanical our knowledge, this is the first study of STS motion and force models comprise a multilevel inverted pendulum, whose capture in the sagittal plane (2D). We used marker-based motion is governed by Euler-Lagrange equations. For motion optical motion capture system and force plate (1) to collect analysis and development of a control scheme, usually, an kinematic and kinetic data during this voluntary motion and (2) to realize a custom human biomechanical model in analytical model based on general physical parameters is realized. Such models and control schemes are extensively the sagittal plane as close as possible to real human beings. available in the literature on motion analysis [10, 11] and We first scale our biomechanical model to match the BSP the design of robotic devices [12]. To analyze a motion mech- values of the subjects. We do dynamic motion reconstruction anism more accurately, the simulated motion must be com- through the control of the simulated custom human CNS models to follow the captured head position trajectories in pared with actual human motion. For this purpose, custom biomechanical models are developed that are more useful real time. This study is the part of an ongoing study that is in the design and tuning of customizable motion assistance aimed at determining different variables involved in human Applied Bionics and Biomechanics 3 reference for respective models. Simulated motions STS motion. The previous work [11, 18, 19] comprised the analytical approach, and this work is based on experimental are analyzed and compared with experimental analysis of STS motion. This paper is organized as follows: motion. The materials and methods section should first, we provide the details of the analytical modeling frame- contain sufficient detail so that all procedures can work for STS motion synthesis followed by the experimental be repeated. It may be divided into headed subsec- setup and data collection of STS motion on human subjects. tions if several methods are described Next, we discuss the human biomechanical model scaling for custom human models. We simulate each subject’s STS 3. Analytical Modeling Framework motion and compare them with experimental results. Finally, 3.1. The General Human Biomechanical Model. A general we discuss the validity of the proposed design methodology four-link rigid body human model (as shown in Figure 2) is for its physiological relevance to the STS maneuver. used to simulate STS motion. The physiological parameters of the model (as shown in Table 1) have been borrowed from 2. Materials and Methods literature including our previous work [11, 18, 19, 21]. The model has three degrees of freedom (DoF). Four We design a biomechanical human CNS model (as shown in links include the foot, shank, thigh, and the upper body, Figure 1) to synthesize and control STS motion by tracking which we termed as a single link called Head-Arm-Trunk only head trajectory X as a reference and head position X (HAT). A triangular base of support represents the foot fixed as the only measurement. Since the reliability of the motion on the ground. Since the key movements of joints and limbs control is primarily linked with the accuracy of the human during STS take place in the sagittal plane only, we limit our biomechanical model, we compare experimental and simu- model to planar two-dimensional (2D) motions (in the Car- lated forces and fine-tune the model to reduce the error to tesian plane). All joints are revolute (hinge-like), and the a minimum. Hence, force measurement does not play a role model is an open-chain mechanism with three actuators at in motion synthesis or control, it is meant only for modeling each joint. θ , θ , and θ represent ankle, knee, and hip joint 1 2 3 scheme validation. positions, respectively. We refer the shank, thigh, and HAT as links l , l , and l , respectively. (X, Y) is the head position, 2.1. The Analytical Modeling Framework for STS Motion 1 2 3 and (x, y) is the hip position in Cartesian coordinates. ∅ is Synthesis. We develop an analytical human CNS modeling the head orientation in the World frame {W}. framework to generate STS motion. Our modeling scheme comprises the following steps: 3.2. Analytical Reference Trajectory. The model tracks a generalized head position trajectory, generated analytically (i) A general four-segment human biomechanical using an unforced state-space system borrowed from [22], model in the sagittal plane based on BSP from and modified accordingly. the literature [2, 9–11, 19–21] is realized in SimMechanics 3.3. CNS Modeling: STS Controller Design. The CNS is modeled as STS controller comprising two subsystems: an (ii) We analytically generate head trajectory [22] to be Estimator and a Cartesian controller. used as the reference 3.3.1. Estimator. The estimation of joint angles is based on (iii) We design the STS controller to emulate human the inverse kinematics of the human biomechanical model. CNS, capable of (a) estimating joint angles using inverse kinematics based on head position measure- (1) Forward Kinematics (FK) Analysis. Forward kinematics ments X and (b) generating joint actuation torque maps joint space (θ ) into Cartesian space (x, y, ϕ) [23], commands τ by Cartesian control based on head where ϕ is the orientation of a point in the Cartesian plane position error δX with respect to the World reference {W}. To determine the head position (X, Y), the set of kinematic equations is 2.2. Experimental Validation of Modeling Scheme given as (i) The physical parameter data collected from 7 sub- jects are converted into BSP using the weighing coef- X = l c + l c + l c , ð1Þ 1 1 2 12 3 123 ficient method of anthropometry. BSP values are used to realize custom/subject-specific human bio- ð2Þ Y = l s + l s + l s , 1 1 2 12 3 123 mechanical models where c stands for cos ðθ Þ, c for cos ðθ + θ Þ, s for 1 1 12 1 2 1 (ii) We capture motion and force data of STS maneuver sin ðθ Þ, and so on. Also, from subjects using multiple infrared cameras and passive reflective markers. We extract custom head ϕ = θ + θ + θ , ð3Þ 1 2 3 trajectories from motion data and torques and ground reaction forces (GRFs) from force data where ϕ is the orientation of HAT (or head) with respect (iii) STS motion is reconstructed for each custom human to the x-axis. The generalized coordinate is a compact CNS model. Custom head trajectories are used as the notation p = ½X, Y, ϕ. 4 Applied Bionics and Biomechanics Demographic BSP data Anthropometric data conversion STS control Ref head position, Marker Inv Kin/ Human Kinematic Error velocity, orientation data Cartesian biomechanical variable 𝜒 − 𝜒 𝜏 controller model conversion d Measured head pos, velocity, orientation Measured Forceplate Experimental Kinetic data torque/GRF torque/GRF variable + − conversion Error Figure 1: Workflow of STS motion control scheme. 3.3.2. Cartesian Control. Cartesian control refers to the posi- tion control of the head, following a required trajectory in Cartesian space. XY (1) Equation of Motion. Dynamic equation of motion of the human biomechanical model in joint space is given by € _ τ = MðÞ θ θ + V θ, θ + GðÞ θ , ð9Þ {H} xy {T} € _ where θ, θ , and θ are n ×1 joint angular acceleration, veloc- I ity, and position vectors, respectively. MðθÞ is the n × n iner- {S} 𝜃 _ 1 tia matrix of the model, Vðθ, θÞ is n ×1 vector of centrifugal {F} x-axis {W} and Coriolis terms, GðθÞ is the n ×1 vector of gravity terms, and τ is the n ×1 torque vector. Modifying dynamic equation Figure 2: Three DoF biomechanical human model defined in the from joint space to Cartesian space [23], body frame. {S}, {T}, and {H} represent shank, thigh, and HAT frames for segments l , l , and l . 1 2 3 € _ F = M ðÞ θ X + V θ, θ + G ðÞ θ , ð10Þ x x x (2) Inverse Kinematics (IK) Analysis. To estimate joint angles, the IK problem is solved. First, p is used to find a unique hip where F is the appropriate force-torque vector, and X is the position (x, y) to reduce the problem at hand from four to position and orientation of the head in Cartesian space. M three links. To find hip position (x, y), hip joint angle con- ðθÞ is the mass-inertia matrix in Cartesian space and so on. straint, i.e., 0 ≤ θ ≤ π is imposed. The solution then sim- plifies A trajectory conversion process thus required x = X + l c π − ϕ , ð4Þ ðÞ θ = inv kin X , ð11Þ ðÞ d d y = Y − l s π − ϕ : ð5Þ ðÞ where X is the desired head position trajectory in Cartesian Using algebraic manipulation, the three joint angles space, and θ is the vector of corresponding joint angles. The inferred from head position are inv kin operator refers to the inverse kinematic procedure used for the inference of joint angles from the position of ð6Þ θ = atan2ðÞ s , c , 2 2 2 the end effector. θ = atan2 y, x − atan2 l s , l + l c , ð7Þ ðÞ ðÞ ðÞ 1 2 2 1 2 2 (2) Transpose Jacobian Control. In this scheme, measured position X is compared to desired position X to form an θ + θ + θ = atan2ðÞ sϕ, cϕ = ϕ, ð8Þ 1 2 3 error δX in Cartesian space. The error vector is then applied where atan2 is the MATLAB command for four-quadrant to control law to compute the Cartesian force vector, F, -1 tan with arguments in bracket representing vertical and which is that fictitious force if applied at the head will tend horizontal components of the position vector. to reduce Cartesian error. The Cartesian force vector is then y-axis Applied Bionics and Biomechanics 5 Table 1: BSP data for analytical biomechanical model [9]. Segment Mass (kg) Length/height (m) Center of gravity (m) Moment of inertia (kg m ) Foots 1.91 0.07 —— Shanks 6.14 0.43 0.25 0.11 General human model Thighs 13.20 0.43 0.25 0.26 HAT 44.75 0.83 0.31 7.53 mapped into joint torque vector τ using transpose Jacobian (5) Static Forces in the Human Model. Forces and moments conversion. propagate from segment to segment. Torques on joints must be applied to keep the system in static equilibrium. Jacobian (3) The Velocity of the Head. Description of angular velocity (J) in force domain maps force on the head into torques ω of link i +1 with respect to respective frame is given by on joints: iþ1 iþ1 i iþ1 _ ̂ ω = R ω + θ Z , ð12Þ i+1 i i i+1 i+1 T τ = J F, ð19Þ where i = 0,1,2 refers to link number, R is a rotation matrix, and Z is the axis of joint rotation. The linear velocity v is where F is the Cartesian force required to act on the given by head. iþ1 iþ1 i i i v = R v + ω × P , ð13Þ i+1 i i i i+1 (6) Cartesian Control Law Design. The control scheme is based upon the hypothesis that the feedback of head position where P is the head position vector. For the model shown in X to CNS, i.e., the STS controller, plays a role in carrying out Figure 3, the angular and linear velocity components of the STS motion. As shown in Figure 4, using the measured posi- head in three axes are given, respectively: tion of head X and comparing with desired/reference head 2 3 trajectories X , the CNS generates error signal δX. From 6 7 head position measurements, the Estimator part of CNS 3 2 6 7 − ω = ω = , ð14Þ 3 2 infers joint positions (θ ), required to reduce the error δX. 4 5 Similarly, the head position errors fed back to CNS generate _ _ θ + θ 1 2 torque command to the joints using the Cartesian control 2 3 law. Since Cartesian control is usually implemented in force l s θ 1 2 1 domain, the controller generates a force command F. Then, 6 7 6 7 the transpose Jacobian converts force command F into tor- _ _ _ v = : ð15Þ 6 l c θ + l θ + θ 7 1 2 1 2 1 2 4 5 que command τ, for joint actuation. 4. Validation Framework To find these velocities with respect to fixed foot-shank The human CNS modeling scheme to synthesize STS motion frame {F}, using the rotation matrix is designed in a purely analytical framework. To validate our 2 3 modeling framework and the hypothesis that CNS control of c −s 0 12 12 STS motion has a dependence on head position trajectory 6 7 0 0 1 2 6 7 R = R R R = , ð16Þ s c 0 feedback, we must check the model for its ability to replicate 3 3 2 3 12 12 4 5 experimental STS motion using custom/subject-specific 00 1 models. A comparison of simulations and experimental find- ings will be the basis of the validity of our control framework. 2 3 _ _ _ −l s θ − l s θ + θ The second phase of our study starts from scaling our analyt- 1 1 1 2 12 1 2 6 7 ical human model to custom models. 6 7 0 3 6 7 v = R v = : ð17Þ _ _ 3 3 3 6 l c θ + l c θ + θ 7 1 1 1 2 12 1 2 4 5 4.1. Subject’s Physical Parameters and Anthropometric Conversion. Experimental data of sit-to-stand transfer were collected at Biomechanics Lab of Riphah International (4) The Jacobian. Jacobian is a nonlinear time-varying matrix University. Seven healthy subjects (five males and two that relates joint angular velocities to linear head velocity: females, age: 22 ± 0:81 years, mass: 72:58 ± 11:61 kg, height: 1:70 ± 0:04 m) were selected for data collection of sit-to- "#"# stand motion. The subjects had no history of movement −l s − l s −l s 1 1 2 12 2 12 0 0 v = JðÞ θ θ = : ð18Þ disorder. They provided their informed consent under the l c + l c l c 1 1 2 12 2 12 2 Ethics Committee of Riphah International University. 6 Applied Bionics and Biomechanics (a) (b) (c) (d) Figure 3: Motion capture equipment by OptiTrack. (a) Calibration square, (b) calibration wand, (c) infrared camera, and (d) the Pasco force plate for force data capture. Reference Measured Torque head Force head Human Error command trajectories command trajectories Cartesian Transpose biomechanical 𝜒 controller Jacobian 𝜏 𝜒 𝜒 −𝛿 󰫭 model Joint angles Inverse kinematics Figure 4: STS control scheme to emulate CNS. Subjects’ physical parameter data (as shown in Table 2) able. Of these, marker-based motion capture is termed as are used to calculate BSP. An extensive literature is available one of the most accurate methods. To measure ground reac- tion forces, a force platform has been used. These methods on methods of anthropometric conversion. Among the vari- ous methods available in the literature [13], we have used the are extensively used in literature for modeling and analysis method of weighing coefficient [14], which is widely accepted of biomechanical motion mechanisms [2, 15, 16]. For a among the research community. For brevity, only one repre- detailed description of our experimental work, refer to [21] sentative data out of a total of 7 subjects is presented in 4.2.1. Experiment Protocol. Subjects completed the STS task Table 3. using an armless chair 49 cm from the force-plate (as shown 4.2. STS Motion and Force Capture. Reference [24] provides in Figure 5). To collect the data in the sagittal plane, three comprehensive coverage of motion capture methods avail- spherical reflective markers on the left side of each segment, Applied Bionics and Biomechanics 7 system. Before each trial, we checked the force plate for Table 2: Subjects’ physical parameter data. zero error. Subject ID Gender Age (year) Mass (kg) Height (m) 4.2.3. Data Collection and Analysis Tools. Each marker was 1 Male 21 76.55 1.69 manually numbered in the captured data file. Markers were 2 Male 22 79.81 1.70 then grouped into segments. Segment labels, too, were 3 Male 21 50.05 1.69 assigned manually in Motive Edit mode for each trial. Motive 4 Female 22 66.56 1.61 2.0.1 generates motion capture data in .tak and .c3d file 5 Female 22 84.91 1.67 formats. For data analysis, we have used a motion capture 6 Male 23 71.05 1.72 software MoCap, a freely available motion data analysis tool- 7 Male 23 79.10 1.78 box that works seamlessly with Matlab. Force-plate data is collected from four-beam setup, that provides vertical and horizontal forces generated under both feet during STS. Force data is recoded in .cap format and exported into excel Table 3: BSP data based on the subject’s physical parameters. .csv format for analysis. Center of Moment of Subj Mass Length/height 4.2.4. MoCap Data Analysis. Motion data in .c3d format was Segment gravity inertia ID (kg) (m) imported into the Matlab MoCap toolbox for analysis. (m) (kg m ) Marker positions were converted into joint positions. Then, Foots 2.22 0.066 —— angular positions of each joint in every frame were calcu- Shanks 7.11 0.419 0.237 0.114 lated. Similarly, the head position trajectory was constructed Thighs 15.31 0.417 0.236 0.278 using a marker on the head. Marker data and joint data were used to animate the STS transfer of the subjects. HAT 51.90 0.801 0.299 8.199 Data on subject # 5 were corrupted and hence were rejected. Figure 6(a) shows the ensemble average of head i.e., foot, shank, thigh, and trunk, were attached. Since position; Figure 6(b) shows the ensemble average of ankle, markers pose problems in the segment and joint position knee, and hip joint trajectories; and Figure 6(c) shows the assessment due to skin or loose garment artifacts, a set of ensemble average of ground reaction force of all six subjects. markers on each segment were applied using rigid rulers. Standard deviation curves in dashed lines show the magni- One marker was attached on top of the head using a hair- tude of intrasubject variation. band. Motion capture was done using four infrared Flex 3 4.3. STS Motion Control for Custom Models. We reconstruct cameras by OptiTrack. The data were recorded at 100 Hz STS motion using a custom human analytical STS controller using the OptiTrack Motive 2.0.1 software. Force data were framework. Subject-specific head position trajectories extracted recorded at the same time using a 2-axis 4-beam Pasco force from motion capture data are used as the reference. plate. The force data were captured at 100 Hz using the Capstone software. Each subject completed multiple STS trials. All trials were 5. Results done at once. Each trial began with the subject seated in the 5.1. Simulations. The ensemble averages of all motion and chair, arms crossed across the chest. The trial started with a force data obtained from the experiment and simulations verbal command of “stand,” and then, data were recorded are calculated and compared. The plots of the kinematic for approximately 4 sec. After this, the subject was again variable are shown in Figure 7 through Figure 8. asked to be seated and then trial repeated. 6. Discussion 4.2.2. Equipment and Calibration. To the best of our knowl- edge, this is the first study of STS motion capture in the sag- In this study, we propose a modeling and motion control ittal plane (2D); hence, there are no definite rules available in solution to evaluate the clinical hypothesis that besides the literature about the appropriate positions and number of numerous other factors, CNS controls the STS motion by markers placed on the body segments. Neither any research tracking a prelearned head position trajectory. CNS com- suggests an optimum number of cameras for reliable motion pares this anticipated head motion pattern with actual head capture. Literature, in general, is about 3D motion capture position measured by vestibular, proprioception, and vision [24, 25]. We have, therefore, opted for a multiple-camera sys- senses. Based on the head position error, CNS generates tor- tem, along with spherical markers to ensure better visibility que commands for joints actuation so that a smooth STS and reliable data reconstruction by the system. Cameras were motion may result. Motivated by the evidence for a task- arranged such that complete coverage of motion area could oriented encoding of motion by the CNS [7], we present a be ensured. Camera calibration using “calibration wand” human CNS modeling scheme to synthesize and control and determination of frame of reference for motion capture STS motion using an analytically generated head position area using “calibration square” were done before motion cap- trajectory in a high-level task control framework. ture started. We used the 2-axis Pasco force platform for First, we realize a 4-segment 3-DoF analytical human force data capture at 100 Hz in the Capstone data acquisition biomechanical model based on anatomical proportions [9] 8 Applied Bionics and Biomechanics (a) (b) Figure 5: STS data capture setup: (a) a subject with markers affixed on segments. The feet are placed on force-plate, (b) motion capture view and cameras in Motive 2.0.1 environment. Average head position trajectories from experiment Average joint trajectories from experiment 0 0.2 2 1.5 −0.1 1.5 −0.2 −0.2 −0.4 0.5 −0.6 −0.3 −0.8 0.5 −0.4 −1 −0.5 −1.2 −0.5 0 020 40 60 80 100 0 50 100 0 50 100 0 50 100 % STS cycle % STS cycle (a) (b) Average vertical GRF from experiment 0 10 20 30 40 50 60 70 80 90 100 % STS cycle (c) Figure 6: Ensemble average trajectories of (a) head position, (b) joint angles, and (c) GRF using motion capture. Curves in dashed lines represent ±1 standard deviation (SD). Head positions X(m), Y(m) Body weight (N) Ankle (rad) Knee (rad) Hip (rad) Applied Bionics and Biomechanics 9 Errors between experimental and simulated head position of the STS cycle of all trials, and both the motion and force 0.25 data were trimmed and normalized for % STS cycle. The marker data were then converted into six joints data (as 0.2 shown in Figure 9(b)) which closely resembles the analytical model depicted in Figure 2. Experimentally generated head 0.15 position trajectories in Figure 6(a) closely resemble the ana- lytically generated general head position trajectory in 0.1 Figure 10. The motion was then reconstructed in control and simulation framework by tracking the head marker 0.05 trajectories in real time. Figure 7 gives a comparison of experimental and simu- lated head position trajectories in horizontal (X) and vertical (Y) directions. The Cartesian control part of the STS control- −0.05 ler provides appropriate joint torques to minimize head posi- −0.1 tion error δX. The RMS error for X =0:0118 m and for 0 20 40 60 80 100 Y =0:0315 m. This shows very good tracking of reference % STS cycle input X by the STS controller. Experimental, estimated, and simulated joint angles are plotted in Figure 11. Estimated Horizontal position error and simulated joint angles are compared with experimental Vertical position error joint angles. RMS error for ankle = 0:55 rad (estimation), 0.54 rad (simulation), for knee = 0:93 rad (both), and for Figure 7: Comparison of ensemble average head position trajectories hip = 0:59 rad (both). The joint angle errors are relatively from motion capture experiments and simulations. RMS error for horizontal position X =0:0118 m and for Y =0:0315 m. high and attribute to the use of the same controller for a vari- ety of custom human models and head position trajectories that exhibit relatively large intrasubject variations. The joint in the sagittal plane. We realize the CNS model as an STS angle error can be reduced significantly if (1) the controller is tuned for each custom model and (2) simulation is run controller having two subsystems: an Estimator to automati- cally plan joint level motions and the Cartesian controller to with subject-specific initial conditions. Another reason for larger joint angle errors attributes to the fact that the STS generate appropriate joint torque commands to reduce head control strategy is based on head position tracking, and there position error. Our previous work [11, 18, 19] and some work from the are no joint position reference inputs and measurements being used. This is evident from small errors between exper- literature [2, 9, 10, 20, 26] were based on the same analytical imental and simulated head position trajectories; Figure 8 human model (realized in mathematical or simulation plots head orientation curves ∅, measured from experiments frameworks) using different combinations of measurements, and simulations. A small RMS error of 0.0442 rad for head feedbacks and controllers. We did the analytical design in the orientation shows good estimation and tracking of the head first phase to relate and compare our current study with the trajectory by the controller. Kinetic variables are plotted previous work. Using a well-defined human model and sim- and analyzed next. Figure 12 shows how the force F exerted ulation results from previous studies helped us design and by the bodyweight during STS changes. At the start of the fine-tune the STS controller that could produce comparable STS cycle, the initial force of 200 N shows the average weight results. As a standard procedure [8, 15, 16, 21], we validated of the two feet, shanks, and partially of thighs, while seated. our modeling and control scheme framework with laboratory With seat off, the weight on the force plate increases and so data as well. does the vertical component of the ground reaction force. Physical parameters data of the 7 subjects (as shown in The GRF measured from simulations is plotted as F . The Table 2) are converted into BSP values using the weighing coefficient method of anthropometry. BSP values in Table 3 two forces match closely (RMS error 50.26 N only) and settle are used to scale customhuman models to match the anthro- to the final value of the subject’s average weight. Support pometry of the subjects. moment M is the sum of ankle, knee, and hip joint torques. We capture experimental kinematic data of STS motion Ground reaction torque is a function of ankle joint torque in the sagittal plane using four OptiTrack Flex-3 cameras [26]. We have found that a relatively high correlation (0.72) and thirteen spherical reflective markers on four segments exists between ground reaction moment M and the support of each subject. Kinetic data were collected at the same time moment M as can be seen in Figure 13. The low RMS errors using the Pasco force-platform underneath both feet of the between experimental and simulated measurements validate subjects. The marker data were recorded in the OptiTrack our modeling framework. Figure 14(a) depicts snaps from Motive environment and then imported and analyzed using the animation of experimental STS. Figure 14(b) shows STS MoCap and MATLAB. The motion was reconstructed from motion phases from simulation, based on the customized marker data (as shown in Figure 9(a)). The animated motion human model in SimMechanics. The close resemblance helped check the data for missing markers and frames. between the animation of experimental data and simulation The missing data were reconstructed using interpolation. shows the good quality of STS motion control which attri- The animation also helped determine the start and end butes to (1) robust design of analytically developed STS Error (m) 10 Applied Bionics and Biomechanics Average head orientation from experiment and simulations 1.6 1.55 1.5 1.45 1.4 1.35 1.3 1.25 0 10 20 30 40 50 60 70 80 90 100 % STS cycle exp sim Figure 8: RMS error =0:0442 rad for head orientation ∅, obtained from the average of experimental data and measurements from simulations. (a) (b) Figure 9: STS transfer phases with motion trajectories from animation based on (a) marker data and (b) joint data. controller to model CNS, (2) reliability of experimental data General head position trajectories capture techniques employed, and (3) low error factor in BSP conversion from weighing coefficient method to obtain customized human biomechanical models. 1.5 6.1. Assumption and Limitations. The subjects’ physical parameters were converted into complete set of BSP using weighing coefficient method, which is a mathematical 0.5 method of anthropometery. Despite the risk of high error in estimation [13], this method is widely accepted in research community due to its convenience as compared to other methods that need special equipment for body segment mea- −0.5 surements. The estimation error, however, leads to modeling 020 40 60 80 100 error that becomes a source of mismatch in experimental and % STS cycle simulation results. Moreover, there is a lack of protocols for motion capture in 2D. We devised a set of protocols for this gen experiment which we kept modifying until a satisfactory level gen of reliable results was achieved. There were some limitations Figure 10: Analytically generated general head position trajectory. associated with experimental equipment as well: (1) we did General head position (m) Head orientation (rad) Applied Bionics and Biomechanics 11 Average joint angles from experiment, simulation and estimation 0.2 0.5 1.6 1.4 0 0 1.2 −0.2 −0.5 0.8 −0.4 −1 0.6 −0.6 −1.5 0.4 0.2 −0.8 −2 −1 −2.5 −0.2 −1.2 −0.4 −3 0 50 100 0 50 100 0 50 100 % STS cycle Experiment Simulation Estimation Figure 11: Comparison of average experimental joint trajectories with estimated and simulated trajectories. RMS error for ankle = 0:55 rad (estimation), 0.54 rad (simulation), for knee = 0:93 rad (both), and for hip = 0:59 rad (both). Average GRF from experiment and simulations Another assumption was made by using same motion con- troller for all subject specific human biomechanical models. 700 Further improvement in work could be made if controller were tuned separately for each scaled model. For this study, our modeling scheme was based on rigid body segments; such assumption leads to modeling error of the systems like human body that are not exactly rigid. 7. Conclusion A modeling framework to evaluate the role of head position trajectory in physiologically relevant STS motion control by the CNS is presented. A robotic approach for the synthesis 0 204060 80 100 of STS motion using task-level control is utilized. We % STS cycle mapped a scaled dynamic human model to the human sub- F jects’ anthropometric values and simulated STS motion by F tracking head position trajectories in real time. The study contributes to the knowledge base by proposing a system that Figure 12: Average ground reaction force curve F , measured by (1) synthesizes human motion using a high-level task control the force platform, showing the trajectory of body weight variation framework, for which low-level motion control is automati- during STS by the subjects. F shows the same variable measured cally generated and (2) validates a 2D biomechanical model- during subject-specific simulations. The RMS error between the ing scheme based on the weighing coefficient method for two curves = 50:26 N. inference of Body Segment Parameter (BSP). The modeling scheme is validated using kinematic and kinetic analyses of simulated and captured motion and force data of real sub- not have specialized skin tight garments for subjects. Since jects. The analytically designed STS controller is robust markers pose problems in the segment and joint position enough to simulate real subjects’ STS motion. Low errors assessment due to skin or loose garment artifacts, a set of between experimental and simulated motions not only prove markers on each segment were applied using rigid rulers. the validity of the modeling framework but support the (2) The motion capture equipment and force plate were not clinical hypothesis that there exists a role of head position synchronized electronically; the two variables were visually measurement feedback to CNS in controlling a smooth analyzed from captured data for time synchronization. STS motion. Body weight (N) Ankle (rad) Knee (rad) Hip (rad) 12 Applied Bionics and Biomechanics Average ground reaction torque and support moment −500 −1000 −1500 −2000 −2500 0 10 20 30 40 50 607080 90 100 % STS cycle Figure 13: Average ground reaction torque M and support moment M (sum of joint torques). The two variables’ correlation = 0:72. z s (a) (b) Figure 14: (a) Phases of STS from motion capture. Trajectories of joints are also shown. (b) Simulated STS motion in SimMechanics environment. Torque (Nm) Applied Bionics and Biomechanics 13 in Basic In the future, we want to modify the human biomechan- & Clinical Pharmacology & Toxicology. Vol. 124, Wiley, NJ USA, 2019. ical modeling scheme from rigid body kinematics to account for elastic body links to better match subject-specific anthro- [12] M. Geravand, P. Z. Korondi, C. Werner, K. Hauer, and A. Peer, “Human sit-to-stand transfer modeling towards intuitive and pometry. Our hypothesis and findings can be further gener- biologically-inspired robot assistance,” Autonomous Robots, alized to all kinds of human motion syntheses like walking vol. 41, no. 3, pp. 575–592, 2017. and stair climbing. [13] R. Riemer and E. T. Hsiao-Wecksler, “Improving net joint tor- que calculations through a two-step optimization method for Data Availability estimating body segment parameters,” Journal of biomechani- cal engineering, vol. 131, no. 1, article 011007, 2009. Readers can request the corresponding author for motion [14] D. A. Winter, Biomechanics and motor control of human and force capture datasets. movement, John Wiley & Sons, 4th edition, 2009. [15] E. J. Caruthers, J. A. Thompson, A. M. W. Chaudhari et al., Conflicts of Interest “Muscle forces and their contributions to vertical and horizon- tal acceleration of the center of mass during sit-to-stand trans- The authors declare that there is no conflict of interest fer in young, healthy adults,” Journal of Applied Biomechanics, regarding the publication of this paper. vol. 32, no. 5, pp. 487–503, 2016. [16] M. K. Cullen, Muscle-driven simulations of sit to stand transfer in persons with severe osteoarthritis, [M.S. thesis], The Ohio Acknowledgments State University, 2015. APC is may be covered by Bahria University. [17] I. Y. Campos Padilla, Biomechanical analysis of the sit-to-stand transition [Ph.D. thesis], School of Mechanical, Aerospace and Civil Engineering, 2016. References [18] S. Rafique, M. Najam-l-Islam, and A. Mahmood, “Synthesis of sit-to-stand movement using SimMechanics,” Proceedings of [1] R. C. van Lummel, Assessing sit-to-stand for clinical use, 2017, the 1st International Conference on Smart Innovation, Ergo- https://research.vu.nl/en/publications/assessing-sit-to-stand- nomics and Applied Human Factors (SEAHF). SEAHF 2019. for-clinical-use. Smart Innovation, Systems and Technologies, vol 150, C. Bena- [2] A. M. Mughal and K. Iqbal, “Optimization of biomechanical vente-Peces, S. Slama, and B. Zafar, Eds., Springer, Cham, STS movement with linear matrix inequalities,” International Journal of Mechatronics Systems and Control, vol. 47, no. 1, [19] S. Rafique, A. Mahmood, and M. Najam-ul-Islam, “Robust pp. 1–11, 2019. control of physiologically relevant sit-to-stand motion using [3] R. Chiba, K. Takakusaki, J. Ota, A. Yozu, and N. Haga, reduced order measurements,” Proceedings of the Future Tech- “Human upright posture control models based on multisen- nologies Conference (FTC) 2018. FTC 2018. Advances in Intel- sory inputs; in fast and slow dynamics,” Neuroscience research, ligent Systems and Computing, vol 881, K. Arai, R. Bhatia, vol. 104, pp. 96–104, 2016. and S. Kapoor, Eds., Springer, Cham, 2019. [4] P. A. Forbes, G. P. Siegmund, A. C. Schouten, and J.-S Ã.©b. [20] M. Mughal and K. Iqbal, “A fuzzy biomechanical model for Blouin, “Task, muscle and frequency-dependent vestibular H suboptimal control of sit-to-stand movement,” Interna- control of posture,” Frontiers in Integrative Neuroscience, ∞ tional IASTED Conference on Intelligent Systems and Control, vol. 8, p. 94, 2015. [5] A. Siriphorn, D. Chamonchant, and S. Boonyong, “The effects [21] S. Rafique, M. N. Islam, M. Shafique et al., “Position driven sit- of vision on sit-to-stand movement,” Journal of Physical Ther- to-stand simulation using human body motion and force apy Science, vol. 27, no. 1, pp. 83–86, 2015. capture,” in 2019 22nd International Multitopic Conference [6] J. P. Scholz, D. Reisman, and G. Schöner, “Effects of varying (INMIC), Islamabad, Pakistan, Pakistan, November 2019. task constraints on solutions to joint coordination in a sit-to- [22] A. M. Mughal and K. Iqbal, “Synthesis of angular profiles for stand task,” Experimental Brain Research, vol. 141, no. 4, bipedal sit-to-stand movement,” in 2008 40th Southeastern pp. 485–500, 2001. Symposium on System Theory (SSST), pp. 293–297, New [7] D. S. Vincent and R. Chen, “A task-level biomechanical frame- Orleans, LA, USA, 2008 March. work for motion analysis and control synthesis,” Human Mus- [23] J. J. Craig, Introduction to robotics: mechanics and control,vol. 3, culoskeletal Biomechanics, 2012. Pearson/Prentice Hall, Upper Saddle River, NJ, USA, 2005. [8] O. Khatib, E. Demircan, V. de Sapio, L. Sentis, T. Besier, and S. Delp, “Robotics-based synthesis of human motion,” Journal [24] E. van der Kruk and M. M. Reijne, “Accuracy of human of Physiology-Paris, vol. 103, no. 3-5, pp. 211–219, 2009. motion capture systems for sport applications; state-of-the- art review,” European Journal of Sport Science, vol. 18, no. 6, [9] K. Iqbal and Y. C. Pai, “Predicted region of stability for balance pp. 806–819, 2018. recovery: motion at the knee joint can improve termination of forward movement,” Journal of Biomechanics, vol. 33, no. 12, [25] A. Bilesan, M. Owlia, S. Behzadipour et al., “Marker-based pp. 1619–1627, 2000. motion tracking using Microsoft Kinect,” IFAC-PapersOnLine, [10] M. A. Mahmood and K. Iqbal, “Physiological LQR design for vol. 51, no. 22, pp. 399–404, 2018. postural control coordination of sit-to-stand movement,” [26] A. M. Mughal and K. Iqbal, “Experimental analysis of kinetic Cognitive Computation, vol. 4, no. 4, pp. 549–562, 2012. variables for biomechanical sit to stand movement,” in 34th [11] S. Rafique, M. Najam-ul-Islam, and A. Mahmood, “Sit-to- Annual Meeting of American Society of Biomechanics, Provi- stand motion control using head position feedback to CNS,” dence, RI USA, 2010.

Journal

Applied Bionics and BiomechanicsHindawi Publishing Corporation

Published: Aug 21, 2020

References