An IM-COH Algorithm Neural Network Optimization with Cuckoo Search Algorithm for Time Series Samples
Abstract:
The back-propagation algorithm (BP) is a widely used training technique for artificial neural networks and has long served as a tool for solving time series problems. However, it suffers from well-known drawbacks, such as long training times, a tendency to become trapped in local minima, and sensitivity to the initial weights and biases. This paper proposes an improvement of the BP technique called the improved control output hidden layer algorithm (IM-COH). Combining IM-COH with the cuckoo search algorithm (CS) yields the cuckoo search improved control output hidden layer algorithm (CS-IM-COH). This new algorithm is less sensitive to the initial weights and biases than the original BP algorithm. In this research, CS-IM-COH is compared with the original BP, IM-COH, and the original BP combined with CS (CS-BP) on four selected benchmark time series samples. The results show that CS-IM-COH gives the best forecasting performance on these samples.