Open Science Index

A Recognition Method of Ancient Yi Script Based on Deep Learning
Abstract:
The Yi are an ethnic group living mainly in mainland China, with spoken and written language systems of their own that developed over thousands of years. Ancient Yi is one of the six ancient scripts of the world; it records the history of the Yi people and offers documents valuable for research into human civilization. Recognizing ancient Yi characters helps transform these documents into electronic form, making them easier to store and disseminate. Owing to historical and regional limitations, research on the recognition of ancient characters remains inadequate, so deep learning was applied to the task. Five models were developed on the basis of a four-layer convolutional neural network (CNN). The Alpha-Beta divergence was taken as a penalty term to re-encode the output neurons of the five models, and two fully connected layers compressed the features. Finally, at the softmax layer, the orthographic features of ancient Yi characters were re-evaluated, their probability distributions were obtained, and the character whose features had the highest probability was recognized. Tests show that the method achieves higher precision than the traditional CNN model on handwritten ancient Yi recognition.
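The pipeline described above ends in a softmax layer, with the Alpha-Beta divergence used as a penalty term. A minimal NumPy sketch of those two ingredients, assuming the standard AB-divergence of Cichocki and Amari (defined for alpha, beta, alpha+beta all nonzero); the values of alpha and beta and the example vectors below are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift logits by their max before exponentiating
    e = np.exp(z - z.max())
    return e / e.sum()

def ab_divergence(p, q, alpha=1.0, beta=0.5):
    """Alpha-Beta divergence between non-negative vectors p and q.

    Uses the standard form D_AB(p||q) =
      -1/(alpha*beta) * sum( p^a q^b - a/(a+b) p^(a+b) - b/(a+b) q^(a+b) ),
    valid for alpha, beta, alpha+beta != 0. It is non-negative and
    vanishes exactly when p == q.
    """
    s = alpha + beta
    inner = (p ** alpha * q ** beta
             - (alpha / s) * p ** s
             - (beta / s) * q ** s)
    return -inner.sum() / (alpha * beta)

# Illustrative use: penalise the gap between a model's softmax output
# and a target distribution (hypothetical numbers, not from the paper)
logits = np.array([2.0, 0.5, 0.1])
q = softmax(logits)                  # predicted class probabilities
p = np.array([0.9, 0.05, 0.05])     # target distribution
penalty = ab_divergence(p, q)        # scalar penalty, >= 0
```

The divergence reduces to familiar cases for particular (alpha, beta) choices, which is why it is attractive as a tunable penalty; the paper's exact parameter settings are not given in the abstract.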