Purpose – Capturing basic emotions from the user experience of web applications and browsing is important in many ways. Online user experience is often studied via tangible measures such as task completion time, surveys and comprehensive tests, from which data attributes are generated. Prediction of users' emotion and behaviour in some of these cases depends mostly on task completion time and the number of clicks per given time interval. However, such approaches are generally subjective and rely heavily on distributional assumptions, making the results prone to recording errors. This paper aims to propose a novel method – a window-based dynamic control system – that addresses the foregoing issues.

Design/methodology/approach – Primary data were obtained from laboratory experiments during which 44 volunteers had synchronized physiological readings – skin conductance response, skin temperature, eye movement behaviour and user-activity attributes – taken by biosensors. The window-based dynamic control system (PHYCOB I) is integrated with the biosensors; it collects secondary data attributes from these synchronized physiological readings and uses them for two purposes: the detection of optimal emotional responses and of users' stress levels. The method's novelty derives from its ability to integrate physiological readings and eye movement records to identify hidden correlates on a webpage.

Findings – The results from the analyses show that the control system detects basic emotions and outperforms other conventional models in terms of both accuracy and reliability when subjected to model comparison – that is, the average recoverable natural structures for the three models with respect to accuracy and reliability are more consistent within the window-based control system environment than with the conventional methods.

© Fatima M. Isiaka, Awwal Adamu and Zainab Adamu. Published in International Journal of Crowd Science. Published by Emerald Publishing Limited.
This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

International Journal of Crowd Science, Vol. 5 No. 3, pp. 257-270. Emerald Publishing Limited, ISSN 2398-7294. DOI 10.1108/IJCS-06-2021-0018

The authors would like to thank the Nasarawa State University for sponsoring this paper, and the Federal University of Technology, Minna and Amodu Bello University for their contributions.

Research limitations/implications – Graphical simulation and an example scenario are only provided for the control system's design.

Originality/value – The novelty of the proposed model is its strong resistance to overfitting and its ability to automatically assess user emotion while dealing with specific web contents. The procedure can be used to predict which contents of webpages cause stress-induced emotions in users.

Keywords: Dynamic control system, Eye tracker sensors, Human physiological functions, Online behaviour, Skin conductance response, Skin temperature

Paper type: Research paper

1. Introduction
Emotion recognition based on the user experience of web applications and browsing is very useful in a lot of ways. Physiological readings can be captured as a form of distributional data. Such data are used by web designers and developers to enhance the navigational features of web pages. Also, rehabilitation therapists, mental-health specialists and other biomedical personnel often use computer simulations to monitor and control the behaviour of patients (Chen et al., 2000; Skadberg and Kimmel, 2004).
Marketing and law enforcement agencies are perhaps two of the most common beneficiaries of such data, with the success of online marketing increasingly requiring a good understanding of customers' online behaviour. Law enforcement agents have also long used human physiological measures to determine the likelihood of falsehood in interrogations (Isiaka, 2017; Nielsen, 1994; Zhai and Barreto, 2006; Ugur, 2013; Filipovic and Andreassi, 2001; Smith et al., 1999). There are presently different types of biosensors that can measure the emotional state of arousal to static and dynamic webpages; this paper is limited to the basic ones, such as skin conductance response (SCR), pupil dilation (PD) and skin temperature (ST). The SCR is a function f(s) of the electrical changes s of the skin as a result of sweat. These sensors measure electrodermal activity, which grows higher during states such as interest, attention or nervousness, and lower during states such as relaxation or boredom [equation (1)], depending on the task the user is involved in:

f(s) = { Stressed (high)
       { Relaxed (low)       ∀ s ∈ j    (1)

where j can be any emotional arousal state. This expression is expanded further in equation (2). The emotional state f(s) can be a stressed, neutral or relaxed mood, which is equivalent to 0, 1 and s if j (positive or negative affect) is substituted into the equation; s is a neutral mood that is neither 1 nor 0. User experience can be reflected and determined through the user's attitude or behaviour (Ugur, 2013). A negative attitude towards a complex application shows a poor experience, whereas a positive attitude towards a complex application shows a good experience (Saint-Aime et al., 2009).
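A minimal sketch of the two-state thresholding in equation (1), assuming a simple fixed cutoff on the SCR signal; the paper itself derives its threshold from onset and half-recovery times, so the cutoff value here is only illustrative:

```python
import numpy as np

def classify_scr(scr, threshold):
    """Label each SCR sample 'stressed' (high) or 'relaxed' (low),
    following the two-state mapping of equation (1)."""
    scr = np.asarray(scr, dtype=float)
    return np.where(scr > threshold, "stressed", "relaxed")

# Illustrative microsiemens readings (synthetic, not from the study)
readings = [0.2, 0.3, 1.4, 1.6, 0.4]
labels = classify_scr(readings, threshold=1.0)
print(labels.tolist())  # ['relaxed', 'relaxed', 'stressed', 'stressed', 'relaxed']
```

A real deployment would recompute the threshold per participant, since baselines differ between users, as the paper notes later.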
The three possible emotional states discussed here that can be experienced by a user during an interactive session – stress, relaxation and a neutral mood – can be demonstrated in the following concept:

f(s) = { (1 − j)(0 − j)s : if relaxed (positive mood)
       { (1 − j)s + (j − 1) : if stressed (negative mood)    (2)
       { s : if neutral (neutral mood)

where

j = { 0 : if negative affect
    { 1 : if positive affect

Some websites have the potential to induce mixed emotions such as anger and frustration, which mostly make the user uncomfortable and dissatisfied at that moment; this kind of reaction induces an emotion normally termed stress, especially when users are using the application for the first time. If a user is relaxed during an interactive session with an interface, then we say the user finds the application less complex and easy to deal with (Nielsen and Molich, 1990; Nielsen, 1994; Healey and Picard, 2005; Isiaka, 2017). Distinguishing and finding the similarities between stressed, relaxed and neutral affect can be achieved by conducting an experiment that uses physiological measuring sensors to monitor the users' reactions and collect the data. The stimuli eliciting psychological stress – stimulus conditions in the form of webpages with static and dynamic contents – were considered; these are laboratory-based stimulus situations achieved by deactivating some contents on the webpages and taking note of how this affects the users.
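The piecewise mapping in equation (2) can be written out as a small function; note that the subtraction operators inside the parentheses are reconstructed assumptions, since the printed equation is ambiguous:

```python
def emotion_state(s, j):
    """Piecewise mapping of equation (2).  j = 1 encodes positive
    affect (relaxed branch), j = 0 negative affect (stressed branch),
    any other j falls through to the neutral mood.  The subtractions
    are assumed reconstructions of the garbled printed formula."""
    if j == 1:                            # positive affect -> relaxed
        return (1 - j) * (0 - j) * s
    elif j == 0:                          # negative affect -> stressed
        return (1 - j) * s + (j - 1)
    else:                                 # neither -> neutral mood
        return s
```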
1.1 Objectives
The main objectives of this paper include:
- eliciting user interaction and physiological response data from sampled users using biosensors in an ergonomic laboratory;
- developing a window-based dynamic control system for detecting user emotions on webpages;
- integrating the control system with the biosensors for generalised data extraction and analysis; and
- making an analytical comparison with neural network and logistic regression models.

Most of the time, users do not feel comfortable talking about their experiences in a usability study, such as letting people know what they really think or feel about a particular interface. This might be because they feel it is socially inappropriate, or because they feel that they, rather than the interface, are the problem; this has been noted in older participants (Gross et al., 1997). Objective measures do not rely on a user's experience or assessment; rather, they record and measure time and task completion (Bergstrom and Schall, 2014) as user attributes. One novel approach we applied was adding physiological attributes. Physiological response measurements allow for the further collection of objective measures of performance (Nielsen, 1994; Nielsen and Molich, 1990; Vermeeren et al., 2010; Sauer and Sonderegger, 2009), rather than asking participants whether they found a task difficult, or whether they were surprised or their attention was divided when visual stimuli such as dynamic content suddenly appeared on screen. The SCR (Andreassi, 2000) can be used to measure their reaction. Objective skin conductance data, when combined with eye-tracking data, can give a different view of the UX, such as a user experiencing emotional arousal at the sudden appearance of dynamic content (Bergstrom and Schall, 2014).
Sometimes users may subjectively rate what they feel as non-excited, not amused, not interested or not stressed, but their physiological response readings may reveal that emotional responses occurred at that point in time, indicated by an increase in amplitude that signifies excitement, amusement, interest or stress (Davis, 1990; Kolakowska et al., 2013; Vasalou et al., 2004). The physiological measures used for the purpose of this study are briefly discussed below; most of the user attributes used in the development of the proposed model are also mentioned.

1.2 Pupil dilation and eye movement
PD does not only reveal changes in light intensity; it is also a measure of underlying cognitive processes (Figure 1) as the user interacts with visual contents. It provides indices of attention, interest or emotion, which are correlated with mental workload and arousal. The variations in pupil change and the average pupil change for a given time interval are considered important when measuring eye movement and the behaviour of users in reaction to visual stimuli (Iqbal et al., 2004).

Figure 1. Components of the pupil

Eye movement is the behaviour of the eye during interaction; the eye gaze pattern is a measure of behaviour. The movement of a user's eyes is based on fixations (the locations of a user's eye gaze), saccades (rapid movements of the eye from one fixation to another) and fixation duration (the length of time a user fixates on a particular area) (Figure 2) (Bergstrom and Schall, 2014). These parameters are essential when modelling the synchronisation of user interaction and physiological response, because they are important attributes in terms of emotion detection. The eye movement data obtained include the PD and fixations captured by the eye tracker. The derived variable is the saccade size D, which gives the Euclidean distance between two fixation points (x_n, y_n) and (x_m, y_m):

D = [(x_n − x_m)² + (y_n − y_m)²]^0.5    (3)

where (x_n, y_n) are fixation points on the vertical plane of a webpage and (x_m, y_m) are the fixations on the horizontal plane.

Figure 2. Fixations and saccades caused by eye movement

For the SCR, a threshold is used to distinguish one consecutive peak from another and defines the tonic phase (baseline). The optimal response X on the SCR can be detected based on a given threshold that corresponds to a participant's response at the onset and half-recovery time of the SCR:

X = (1/n) Σ_{i=1}^{n} k_i    (4)

where the k_i are the data points of the physiological measures and X_{k+1} = A_0 + A_1 x + A_2 x² + … + A_n xⁿ, where the A_i are coefficients (scalars) and the x_i are scalar variables. This is applied to the physiological signal P_k such that:

X_k = (d^{2n+1} P_k / dt^{2n+1}) · T_k    (5)

where X_k is the set of data points that results from resampling the raw SCR signal P_k, taking a window size or polynomial order of 2n + 1 in P_k, for each time interval T_k. Each physiological measure undergoes this process, depending on how noisy the data are. ST changes according to blood circulation at the surface of the skin through body tissue (Kamon et al., 1974; Mindfield, 2014). The resultant ST is obtained using equation (4). Bridging the gap between the rate of emotional response in average users and that in the most experienced users requires synchronised events in user interaction and the simulation of these physiological response processes (Brandt, 1999; Cooley et al., 1999; Widyantoro et al., 1999; Castaneda et al., 2007; Schneider-Hufschmidt et al., 1993).

1.3 Integrating the wireless biosensor to the control system
The eye tracker and Q-sensor have wireless capabilities that enable physiological readings and eye movement data to be synchronised and visualised in a system containing the control system application, which operates as a standalone.
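Under the assumption that the windowed resampling of equations (4) and (5) behaves like a simple moving-average smoother, the saccade-size, smoothing and peak-detection steps can be sketched in Python; the window size 2n + 1 comes from the text, while the threshold and local-maximum peak rule are illustrative stand-ins for the paper's onset/half-recovery criterion:

```python
import numpy as np

def saccade_size(p, q):
    """Equation (3): Euclidean distance between two fixation points."""
    (xn, yn), (xm, ym) = p, q
    return float(np.hypot(xn - xm, yn - ym))

def smooth(signal, n=2):
    """Denoise a raw signal with a moving window of size 2n + 1,
    a simple stand-in for the windowed resampling behind equation (5)."""
    kernel = np.ones(2 * n + 1) / (2 * n + 1)
    return np.convolve(signal, kernel, mode="same")

def optimal_responses(signal, threshold):
    """Indices of local maxima above a threshold - a simple stand-in
    for the paper's optimal-response (peak) detection on the SCR."""
    s = np.asarray(signal, dtype=float)
    return [i for i in range(1, len(s) - 1)
            if s[i] > threshold and s[i] >= s[i - 1] and s[i] > s[i + 1]]
```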
This can be done in real time, with the webpages from the eye tracker visualised on the control system's browser-inclined interface. Optimal response readings from the Q-sensor are mostly seen to correlate with visible spikes and web contents as visualised from the control system. Each user's reading can be generated using this process in real time (Figure 3). Section 2 describes the experimental setup and data collection.

Figure 3. Model framework for the integrated systems

2. Method
The proposed model involves two modules: the first module generates the user attributes from the wireless sensors, which include the SCR sensor (Q-sensor), which also measures ST, and the eye tracker (Tobii eye tracker) (Kim et al., 2015; Bixler et al., 2015) (Figure 3), which measures eye movement behaviour; the second module makes predictions on the users' emotional responses from the captured user attributes, based on the control system for modelling the physiological processes of users in reaction to web stimuli. Methods such as the neural network and logistic regression, both predictive models, were used for comparison. Before data collection commenced, the experiment was assigned reference number CS77, which was approved by the University of Manchester Senate Committee on the ethics of research on human beings. A total of 44 participants (12 female, 32 male) were recruited from the workers and regular users of the web group, aged between 18 and 48 and above. They were recruited through advertisements and recommendations from the University of Manchester. The study took less than 10 min. The tasks in the study were straightforward and designed in such a way that we could easily detect optimal responses, and in a manner that users were accustomed to; this included typing words into text-box content and clicking on icons.
The participants interacted with static and dynamic web contents by completing six straightforward tasks (Table 1), each of which was designed to encourage interaction with an element. Some of these elements were represented as static information, while others were dynamic. The "search" tasks encouraged the users to interact with static contents, whereas the "suggest" tasks encouraged interaction with dynamic contents such as automated suggestion lists (ASL).

Table 1. Tasks allocated to each webpage

Stimulus                          Task
Google-Search                     "Locate Manchester University"
Google-Suggest                    "Locate Manchester University"
National-Rail-Enquiries-Search    "Look for a train-route from London to Manchester"
National-Rail-Enquiries-Suggest   "Look for a train-route from London to Manchester"
iGoogle                           "On the CNN.com box, locate news stories"; "Read the displayed text contents"
Yahoo Portal                      "Locate the entertainment, sports, news or stories"; "Read the displayed text contents"

The participants were seated, each facing a TOBII 1750 eye tracker. The webpage data and the users' eye movements, including fixations, saccades and PD, were recorded. The participants placed their two middle fingers on the wireless SCR Q-sensor [Figure 4(b)], leaving the right hand free to perform tasks such as keystrokes and mouse manipulation. Analysis software embedded with the eye tracker was used to record the eye movements and fixations. The physiological readings can also be visualised with the control system integrated with the eye tracker; the system wirelessly receives data from the Q-sensor (Figure 3) in sync with the users' eye movements. The users commenced with the index page, with links to the task-allocated webpages, for a total time of less than 10 min; interaction with each page was less than 120 s. Data were collected objectively, without interruption, and exported to the control system.
2.1 Window-based dynamic control system
For user attributes in the data saved from the sensors, the entire system is represented by the expression in equation (6); this is the control system's model fit to the data, with a default prediction focus of 4 min from the original emotional stress levels of the user (class labels):

dx(t)/dt = a·x(t) + b·u(t)
y(t) = c·x(t) + d·u(t) + 4k    (6)

where y(t) is the response variable (stress levels) that determines the coefficients of physiological reactions, with computed variables u(t) representing the data matrix Z_{m,p}; each input variable has a p-value less than the default critical point (0.05). The control system is designed in such a way that data from the sensors are imported from the index page (Figure 5). The system consists of two modules: the first module computes the user attributes, while the second module makes predictions from the user attributes. The steps for the algorithm are stated in Isiaka (2017). From the procedure, to compute the user attributes, X is set as a placeholder for the physiological data from the SCR/ST sensor as well as the PD/eye movement data from the eye tracker, while Y is the placeholder for the eye movement data from the eye tracker sensor. The main aim is to generate a data set Z_{m,p} with m instances and p attributes. The first step is to set the initial conditions; each participant i = 1:44 interacts with each webpage j = 1:6, from which the user attributes are computed and used to update the matrix Z_{m,p} that serves as the secondary data (Figure 6). The correlates of optimal response are used to classify the status of participants for each user attribute. "Correlates" here represents events from the sensor and eye tracker that occurred at the time of optimal response (peaks) of the SCR.

Figure 4. Wireless SCR Q-sensor
Figure 5. Index page showing detected peaks and events corresponding to spikes in physiological responses
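A minimal sketch of the state-space model in equation (6), discretised with a forward-Euler step; the coefficients a, b, c, d, k and the step size are illustrative placeholders, not the values fitted by PHYCOB I:

```python
import numpy as np

def simulate_stress(u, a=-0.5, b=1.0, c=1.0, d=0.1, k=0.0, dt=0.1):
    """Euler discretisation of the state-space model in equation (6):
        dx/dt = a*x(t) + b*u(t),   y(t) = c*x(t) + d*u(t) + 4k.
    The inputs u play the role of the computed-attribute matrix Z_{m,p};
    all coefficients here are placeholders, not the paper's fit."""
    x, y = 0.0, []
    for ut in u:
        x = x + dt * (a * x + b * ut)      # state update
        y.append(c * x + d * ut + 4 * k)   # predicted stress level
    return np.array(y)
```

In the paper the coefficients are fitted so that each input variable is significant at the 0.05 level; this sketch only shows the forward dynamics.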
To execute the steps in the algorithm, each participant's generated data were considered based on differences in the baseline (b_i). For each person the baseline is different, so the increase in amplitude (a_i) is computed based on a set threshold. As the baseline (b_i) for individual users differs, the latency of the response time is also distinct. The average latency is determined by calculating each delay in a user's SCR amplitude, which involves taking the time readings at points corresponding to the minimum index of high tonic phases of the response signal. This determines the delay for each increase in the amplitude of the SCR. The learning curve (Figure 7) in the model for cross-validation is used to visualise the error in the learning process of the model.

3. Results
The system integrates physiological readings and eye movement behaviour to produce a single interface where the stress points on the webpages can be seen. For example, a participant felt stressed while looking at the ASL on AOI(1) and looking off screen from the Google-Suggest page, which appeared as blue transparent dots. The spikes in the physiological readings generate an integrated interface with the users' affect states located on the webpages. The status of the user is derived from the computed secondary data. These computed parameters were obtained from physiological readings that correspond to eye movement and fixations on a webpage. The increases and decreases in the amplitude of the SCR correspond to user activity. The average peaks, latency and amplitude were computed for the SCR, as were the mean PD and the mean skin temperature (MST) of users' responses to the different webpages.

Figure 6. Generated secondary data for each participant's interaction with six webpages
Figure 7. Cross-validation error curve for logistic regression, PHYCOB I and neural network from the split with best performance
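The per-user baseline, amplitude-increase and latency computations described above can be sketched as follows; the median baseline and standard-deviation threshold are assumptions for illustration, since the paper sets its threshold from the tonic phase of each user's signal:

```python
import numpy as np

def amplitude_latency(scr, times, threshold_sd=1.0):
    """Per-user amplitude increases and latencies, a sketch of the
    baseline-relative procedure in the text: the baseline b_i is taken
    as the signal median (an assumption), the threshold as b_i plus a
    multiple of the standard deviation, and the latency as the delay
    from recording onset to each above-threshold rise."""
    scr = np.asarray(scr, dtype=float)
    times = np.asarray(times, dtype=float)
    baseline = np.median(scr)                     # b_i differs per person
    threshold = baseline + threshold_sd * scr.std()
    idx = np.where(scr > threshold)[0]            # above-threshold samples
    amplitudes = scr[idx] - baseline              # increases in amplitude a_i
    latencies = times[idx] - times[0]             # delay of each rise
    return baseline, amplitudes, latencies
```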
The emotional responses, denoted by stressed, neutral and relaxed moods, are indicated by transparent blue, purple and red spots on the webpages; e.g. a participant experienced a stress emotion while looking at the ASL on the Google page, the National Rail Enquiries page and picture content on the Yahoo page; a neutral mood is seen on two pages, while a relaxed mood is seen on a Google page. To compare the control system's model with the neural network and logistic regression, each of the models was trained on a sequence of sub-samples and tested on the remaining part of the data with different test splits, using the emotional responses as labels for all iterations. Table 2 shows the best accuracy of all training sets. An optimal logistic regression model was selected from runs involving the forward, backward and stepwise models, and likewise for the neural network model on different schematic structures. The choices were based on the cross-validation error plot in Figure 7. For each split, the training/testing set was used as it indicates the best performance for all splits. The performance results show that the control system has high accuracy on all sets of simulated data, except on simulated sets M3 and M4 (Table 2), where it shows the worst performance of all the training sets. For the test sets with high-performance models, the average number of predicted emotional stress levels with the true and false positive classes for the three models is shown in Table 3. The predicted classes were obtained by projecting each model's training set onto the test set (new data), i.e. 30% of the original data. The optimal models for the logistic regression and neural network with the least errors are then compared with the control system's best output. The performance of the system using 70% training gives a significant performance of 0.90. This implies that the model learns more with a higher number of training sets than test sets compared with the other models.
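The split-wise training/testing protocol can be sketched as below; a nearest-centroid classifier and synthetic two-class data stand in for the three models and the study's attributes, which are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_centroid_accuracy(X, y, test_frac):
    """Evaluate one train/test split, mirroring the paper's protocol of
    training on a sub-sample and testing on a held-out fraction.  The
    nearest-centroid rule is only a stand-in for PHYCOB I, logistic
    regression and the neural network."""
    idx = rng.permutation(len(X))
    cut = int(len(X) * (1 - test_frac))
    tr, te = idx[:cut], idx[cut:]
    centroids = {c: X[tr][y[tr] == c].mean(axis=0) for c in np.unique(y[tr])}
    preds = np.array([min(centroids,
                          key=lambda c: np.linalg.norm(x - centroids[c]))
                      for x in X[te]])
    return float(np.mean(preds == y[te]))

# Synthetic two-class data (illustrative only, not the study's attributes)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(3, 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
for frac in (0.10, 0.30, 0.50, 0.75):
    print(frac, nearest_centroid_accuracy(X, y, frac))
```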
Even with high test sets, there is still an indication of highly significant performance, and the model seems to resist overfitting owing to regularisation by taking a smooth function of the variables. The cross-validation error of the training/test set for logistic regression gives the least error, between 0.08 and 0.13, at the 54th iteration, while the training/testing set for the neural network model gives the least error of 0.14 at the 59th iteration. The learning curve shows that the model error decreases as the number of training sets increases. The variables that were optimal for each of the forward, backward and stepwise methods for logistic regression indicate the mean peak (MPeak), MST and MappedFX (eye movement) of the webpages as the best parameters for the optimal model, with p-values less than the critical value of 0.05, while for the neural network the best features include MPeak, MST and SaccadeSize (saccade size).

Table 2. Performance of models on multiple simulations

                 PHYCOB I                 Logistic R               Neural network
Sim data   10%   30%   50%   75%    10%   30%   50%   75%    10%   30%   50%   75%
M1         0.86  0.90  0.84  0.84   0.80  0.85  0.85  0.81   0.79  0.86  0.67  0.61
M2         0.70  0.87  0.79  0.84   0.85  0.84  0.81  0.81   0.85  0.78  0.80  0.85
M3         0.33  0.63  0.43  0.47   0.66  0.61  0.54  0.58   0.74  0.58  0.53  0.60
M4         0.40  0.61  0.58  0.53   0.16  0.49  0.51  0.51   0.20  0.49  0.51  0.52
M5         0.60  0.59  0.65  0.66   0.56  0.60  0.59  0.50   0.65  0.63  0.61  0.53

Table 3. Matrix table indicating the average number of true positive and false positive predicted classes for the three models

                                Predicted class
Model            Actual         Neutral  Relaxed  Stressed
PHYCOB I         Neutral          24       0        1
                 Relaxed           1      19        0
                 Stressed          0       0       78
Logistic R       Neutral          24       0        1
                 Relaxed           2      18        0
                 Stressed          0       0       78
Neural network   Neutral          23       1        1
                 Relaxed           1      18        1
                 Stressed          1       0       77

4. Conclusion
This paper focuses on the integration of biosensors with a window-based dynamic control system for detecting user emotional responses to the web contents that were used as stimuli. The system can also serve as a secondary indicator of user stress level based on individual interaction with each webpage. The novel approach we adopted was to determine the physiological correlates of user interaction with webpages by developing an algorithm that also serves as a tertiary indicator of stress level in users. To implement the control system, we first conducted an experiment in both real time and with a delay. In real time, it involved participants who are familiar with surfing the web. Data were exported from sensors (SCR and eye tracking) that measure the physiological responses of users, including the SCR, ST and eye movement (fixations, saccades, PD), while they interacted with six webpages. The system then computes the physiological attributes. In delay mode, it reads in each individual's data and computes the physiological parameters mentioned from the readings, which helps to group the emotional stress status of users for each webpage. It also identifies which physiological attributes have the most effect on user interaction with dynamic and static contents; these happen to be the average peak in the SCR of users (MPeak), MST and the mapped fixation on the x-coordinate of the webpages (MappedFX). These attributes were also confirmed by other standard techniques. To test the model's reliability and significance, we compared the model with other standard techniques such as the neural network and logistic regression. This paper opens the way to possible benefits in terms of predicting human behaviour with respect to visual experience and internet security, by using the tool as an alarm trigger for sending alerts on unauthorised access or abnormal activities online. A control link can also be generated to provide instant access to data visualisation at the time of physiological response generation.
References

Andreassi, J.L. (2000), Psychophysiology: Human Behavior and Physiological Response, Psychology Press, p. 12.

Bergstrom, J.R. and Schall, A. (2014), Eye Tracking in User Experience Design, Elsevier, pp. 10-38.

Bixler, R., Blanchard, N., Garrison, L. and D'Mello, S. (2015), "Automatic detection of mind wandering during reading using gaze and physiology", Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, pp. 299-306.

Brandt, M.L., Brown, K.E., Dykes, P.J., Lindberg, E.D., Olson, D.E., Selden, J.E., Snyder, D.D. and Walts, J.O. (1999), "Computer apparatus and method for providing a common user interface for software applications accessed via the world-wide web", US Patent, Vol. 5, pp. 892-905.

Castaneda, J.A., Munoz-Leiva, F. and Luque, T. (2007), "Web acceptance model (WAM): moderating effects of user experience", Information and Management, Vol. 44 No. 4, pp. 384-396.

Chen, H., Wigand, R.T. and Nilan, M. (2000), "Exploring web users' optimal flow experiences", Information Technology and People, Vol. 13 No. 4, pp. 263-281.

Cooley, R., Mobasher, B. and Srivastava, J. (1999), "Data preparation for mining world wide web browsing patterns", Knowledge and Information Systems, Vol. 1 No. 1, pp. 5-32.

Davis, E.V. (1990), "Software testing for evolutionary iterative rapid prototyping", PhD thesis, Naval Postgraduate School, Monterey, CA, p. 19.

Filipovic, S.R. and Andreassi, J.L. (2001), "Psychophysiology: human behavior and physiological response", Journal of Psychophysiology, Vol. 15 No. 3, pp. 210-212.

Gross, J.J., Carstensen, L.L., Pasupathi, M., Tsai, J., Gotestam Skorpen, C. and Hsu, A.Y. (1997), "Emotion and aging: experience, expression, and control", Psychology and Aging, Vol. 12 No. 4, p. 590.

Healey, J.A. and Picard, R.W. (2005), "Detecting stress during real-world driving tasks using physiological sensors", IEEE Transactions on Intelligent Transportation Systems, Vol. 6 No. 2, pp. 156-166.

Iqbal, S.T., Zheng, X.S. and Bailey, B.P. (2004), "Task-evoked pupillary response to mental workload in human-computer interaction", CHI'04 Extended Abstracts on Human Factors in Computing Systems, ACM, Vol. 14, pp. 1477-1480.

Isiaka, F. (2017), "Modelling stress levels based on physiological responses to web contents", Sheffield Hallam University.

Kamon, E., Pandolf, K. and Cafarelli, E. (1974), "The relationship between perceptual information and physiological responses to exercise in the heat", Journal of Human Ergology, Vol. 3 No. 1, pp. 45-54.

Kim, E.S., Kyoung-Woon, O. and Byoung-Tak, Z. (2015), "Behavioural pattern modelling of human-human interaction for teaching restaurant service robots", AAAI 2015 Fall Symposium on AI for Human-Robot Interaction, pp. 45-89.

Kolakowska, A., Landowska, A., Szwoch, M., Szwoch, W. and Wrobel, M.R. (2013), "Emotion recognition and its application in software engineering", Human System Interaction (HSI), 2013 The 6th International Conference, IEEE, pp. 532-539.

Mindfield (2014), "eSense temperature, Mindfield biofeedback systems", eSense Skin Temperature Handbook, pp. 1-12.

Nielsen, J. (1994), "Usability inspection methods", Conference Companion on Human Factors in Computing Systems, ACM, pp. 413-414.

Nielsen, J. and Molich, R. (1990), "Heuristic evaluation of user interfaces", Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, pp. 249-256.

Saint-Aime, S., Le Pevedic, B. and Duhaut, D. (2009), "iGrace – emotional computational model for EMI companion robot", Advances in Human-Robot Interaction, Vol. 4, p. 26.

Sauer, J. and Sonderegger, A. (2009), "The influence of prototype fidelity and aesthetics of design in usability tests: effects on user behaviour, subjective evaluation and emotion", Applied Ergonomics, Vol. 40 No. 4, pp. 670-677.

Schneider-Hufschmidt, M., Malinowski, U. and Kuhme, T. (1993), Adaptive User Interfaces: Principles and Practice, Elsevier Science Inc, p. 22.

Skadberg, Y.X. and Kimmel, J.R. (2004), "Visitors' flow experience while browsing a web site: its measurement, contributing factors and consequences", Computers in Human Behavior, Vol. 20 No. 3, pp. 403-422.

Smith, M.J., Conway, F.T. and Karsh, B.T. (1999), "Occupational stress in human computer interaction", Industrial Health, Vol. 37 No. 2, pp. 157-173.

Ugur, S. (2013), Wearing Embodied Emotions: A Practice Based Design Research on Wearable Technology, Springer, p. 4.

Vasalou, A., Ng, B., Wiemer-Hastings, P. and Oshlyansky, L. (2004), "Human-moderated remote user testing: protocols and applications", 8th ERCIM Workshop, User Interfaces for All, Wien, p. 19.

Vermeeren, A.P., Law, E.L.C., Roto, V., Obrist, M., Hoonhout, J. and Vaananen-Vainio-Mattila, K. (2010), "User experience evaluation methods: current state and development needs", Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, ACM, pp. 521-530.

Widyantoro, D.H., Ioerger, T.R. and Yen, J. (1999), "An adaptive algorithm for learning changes in user interests", Proceedings of the Eighth International Conference on Information and Knowledge Management, ACM, pp. 405-412.

Zhai, J. and Barreto, A. (2006), "Stress detection in computer users based on digital signal processing of noninvasive physiological variables", Engineering in Medicine and Biology Society, 28th Annual International Conference of the IEEE, IEEE, pp. 1355-1358.

Further reading

Hackett, S., Parmanto, B. and Zeng, X. (2005), "A retrospective look at website accessibility over time", Behaviour and Information Technology, Vol. 24 No. 6, pp. 407-417.

Harbrecht, H., Peters, M. and Schneider, R. (2012), "On the low-rank approximation by the pivoted Cholesky decomposition", Applied Numerical Mathematics, Vol. 62 No. 4, pp. 428-440.

Mandryk, R.L., Inkpen, K.M. and Calvert, T.W. (2006), "Using psychophysiological techniques to measure user experience with entertainment technologies", Behaviour and Information Technology, Vol. 25 No. 2, pp. 141-158.

Verschuere, B., Ben-Shakhar, G. and Meijer, E. (2011), Memory Detection: Theory and Application of the Concealed Information Test, Cambridge University Press, p. 22.

Corresponding author
Fatima M. Isiaka can be contacted at: email@example.com
International Journal of Crowd Science – Emerald Publishing
Published: Nov 22, 2021