Open Science Research Excellence

Open Science Index

Commenced in January 2007 · Frequency: Monthly · Edition: International · Publications Count: 31181


Facial Emotion Recognition with Convolutional Neural Network Based Architecture
Abstract:
Neural networks are appealing for many applications because they can learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increases, more complex relationships can be represented through automatically extracted features. Nowadays, Deep Neural Networks (DNNs) are widely used in computer vision problems such as classification, object detection, segmentation, and image editing. In this work, the facial emotion recognition task is performed by a proposed Convolutional Neural Network (CNN)-based DNN architecture using the FER2013 dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size, and network size) are investigated, and ablation study results for the pooling layer, dropout, and batch normalization are presented.
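The building blocks the abstract names (convolution with a chosen kernel size, an activation function, and a pooling layer) can be illustrated with a minimal NumPy sketch. This is not the paper's architecture — the random 48×48 input merely stands in for a FER2013 grayscale face, and the 3×3 kernel, ReLU activation, and 2×2 max pooling are illustrative choices:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation, one common choice of activation function."""
    return np.maximum(x, 0.0)

def max_pool2d(x, size=2):
    """Non-overlapping max pooling; trims edges that do not fit a full window."""
    h, w = (x.shape[0] // size) * size, (x.shape[1] // size) * size
    x = x[:h, :w]
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Stand-in for one FER2013-style 48x48 grayscale face (random values here).
rng = np.random.default_rng(0)
face = rng.standard_normal((48, 48))
kernel = rng.standard_normal((3, 3))  # one illustrative 3x3 filter

feat = max_pool2d(relu(conv2d(face, kernel)))
print(feat.shape)  # conv: 48-3+1 = 46 -> pool: 46//2 = 23, i.e. (23, 23)
```

Stacking several such convolution–activation–pooling stages, then flattening into dense layers with a 7-way softmax (FER2013's seven emotion classes), yields the general CNN pattern the abstract's hyperparameter and ablation studies vary.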
Digital Object Identifier (DOI):
