Open Science Index
Bayesian Deep Learning Algorithms for Classifying COVID-19 Images
Authors:
Abstract:
The study investigates the accuracy and loss of deep learning algorithms on a coronavirus (COVID-19) image dataset by comparing a Bayesian convolutional neural network with a traditional convolutional neural network on a low-dimensional dataset. The dataset consisted of 50 sets of X-ray images, of which 25 were COVID-19 and the remaining 25 were normal; for each class, twenty images were used for training and five for validation to ascertain the accuracy of the model. The study found that the Bayesian convolutional neural network outperformed the conventional convolutional neural network on the low-dimensional dataset, where the latter could have exhibited underfitting. The study therefore recommends the Bayesian convolutional neural network (BCNN) for computer-vision image detection, for example in Android applications.
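The abstract does not specify how the Bayesian network's weight uncertainty was realized. One common approximation of a Bayesian CNN is Monte Carlo dropout, where dropout is kept active at inference time and repeated stochastic forward passes yield a predictive mean and an uncertainty estimate. The sketch below is illustrative only, with toy weights standing in for a trained model; the network shape, dropout rate, and sample count are assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed weights for a one-hidden-layer, two-class classifier
# (a stand-in for a trained COVID-19 vs. normal X-ray model).
W1 = rng.normal(size=(16, 8))
W2 = rng.normal(size=(8, 2))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(x, drop_p=0.5):
    # Dropout stays ACTIVE at inference: each call samples a different
    # sub-network, approximating a draw from the weight posterior.
    h = np.maximum(x @ W1, 0.0)
    mask = rng.random(h.shape) >= drop_p
    h = h * mask / (1.0 - drop_p)  # inverted-dropout scaling
    return softmax(h @ W2)

def mc_predict(x, T=100):
    # Average T stochastic passes: mean = prediction, std = uncertainty.
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = rng.normal(size=(1, 16))  # stand-in for a flattened X-ray feature vector
mean_prob, uncertainty = mc_predict(x)
```

On small datasets such as the 50-image set described above, this averaging over sampled sub-networks acts as a regularizer, which is one intuition for why a Bayesian treatment can resist the underfitting and overfitting problems of a single deterministic network.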
Digital Object Identifier (DOI):
