Open Science Research Excellence

Open Science Index

Commenced in January 2007 · Frequency: Monthly · Edition: International · Publications Count: 30578


Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement rely on periodic human observation, which is limited by inter-rater reliability. Our solution instead uses real-time multimodal multisensor data, labeled by objective performance outcomes, to infer the engagement of students. The study involved four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to build a model of student engagement, with labels derived objectively from the continuous performance test outcomes. To achieve this, a new type of continuous performance test, the Seek-X type, is introduced. Nine features were extracted, including high-level handpicked compound features. A series of machine learning approaches was evaluated using leave-one-out cross-validation. Overall, the random forest classifier achieved the best results: 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement. We compared these results against AdaBoost, decision tree, k-nearest neighbor, naïve Bayes, neural network, and support vector machine models. We showed that the multisensor approach achieved higher accuracy than features from any reduced set of sensors, and that high-level handpicked features improved classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. Eye gaze was shown to be the single most important sensor feature for classifying engagement and distraction.
We have shown that the level of engagement of students with learning disabilities can be predicted accurately in real time, without dependence on human observation, inter-rater agreement, or any single sensor modality. This will help teachers design interventions for heterogeneous groups of students whose individual needs they cannot possibly attend to one by one. Our approach can be used to identify those with the greatest learning challenges, so that all students are supported to reach their full potential.
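The evaluation protocol described above — several classifiers compared under leave-one-out cross-validation, with random forest among them — can be sketched as follows. This is a minimal, hypothetical illustration using scikit-learn: the feature matrix, labels, and classifier settings are placeholders, not the study's actual data or code.

```python
# Hypothetical sketch of the evaluation protocol from the abstract:
# compare classifiers under leave-one-out cross-validation (LOOCV).
# Data here is synthetic: 59 sessions x 9 features stand in for the
# gaze/EEG/pose/interaction features; labels stand in for the
# engagement outcomes derived from the continuous performance test.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(59, 9))            # placeholder feature matrix
y = rng.integers(0, 2, size=59)         # 1 = engaged, 0 = disengaged

loo = LeaveOneOut()  # one session held out per fold, 59 folds in total
classifiers = {
    "random forest": RandomForestClassifier(random_state=0),
    "adaboost": AdaBoostClassifier(random_state=0),
    "k-nearest neighbor": KNeighborsClassifier(n_neighbors=3),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=loo)
    print(f"{name}: mean LOOCV accuracy = {scores.mean():.3f}")
```

With only 59 sessions, LOOCV is a natural choice: each fold trains on 58 sessions and tests on the held-out one, making maximal use of a small dataset at the cost of 59 fits per classifier.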
Digital Object Identifier (DOI):
