Bayesian statistics has strongly influenced neural networks and deep learning for artificial intelligence (AI). Bayesian inference and learning are based on the prior, the likelihood, and the posterior. The prior represents the current belief about the data domain, and the posterior is the updated belief after learning from observed data. By repeatedly updating the prior into the posterior, Bayesian statistics provides advanced data learning for AI. In this paper, we compare previous Bayesian inference and learning methods for AI and propose a model based on Bayesian inference and learning for neural networks and deep learning.
Bayesian and frequentist approaches are the two main schools of statistics [1,2]. The difference between them is the use of prior information about the domain to which the data belong. Bayesian statistics uses the prior distribution as the probability distribution of the model parameters. The prior distribution is combined with the likelihood function representing the observed data to produce the posterior distribution. When new data arrive, the current posterior distribution is used as the prior distribution and is updated into a new posterior distribution by combining it with the likelihood function based on the new data. This updating procedure enables the model to learn from new data and improves the intelligence of the model for the data domain, so Bayesian learning provides an efficient model for artificial intelligence (AI). In previous studies, Bayesian AI depended on Bayesian networks. A Bayesian network is a graphical model representing the relations between random variables under uncertainty. The model consists of nodes (random variables) and arcs, which represent variables and the connections between them, respectively. The connections show causal relations between the nodes, and the strength of a connection is a probability representing the belief in that connection. This belief is also updated by newly observed data, so the model is a popular approach to AI. In this paper, we study a method for AI using Bayesian inference and learning.
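The updating procedure above, in which each posterior becomes the prior for the next batch of data, can be sketched with a Beta-Binomial conjugate pair. This is an illustrative choice of prior and likelihood family, not the specific model used in the paper.

```python
# Sequential Bayesian updating with a Beta prior and Binomial likelihood.
# Because the Beta prior is conjugate to the Binomial likelihood, the
# posterior is again a Beta distribution with updated parameters.

def update(alpha, beta, successes, failures):
    """Combine a Beta(alpha, beta) prior with Binomial count data."""
    return alpha + successes, beta + failures

# Start from a uniform prior, Beta(1, 1).
alpha, beta = 1.0, 1.0

# Each batch of new data turns the current posterior into the next prior.
for successes, failures in [(7, 3), (4, 6), (9, 1)]:
    alpha, beta = update(alpha, beta, successes, failures)

posterior_mean = alpha / (alpha + beta)
print(alpha, beta, posterior_mean)  # 21.0 11.0 0.65625
```

The final Beta(21, 11) posterior summarizes all three batches; processing the batches in any order, or all at once, yields the same result.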
We build the posterior distribution for our Bayesian learning model by combining the likelihood function with a probability distribution carrying the prior information. To illustrate the validity of the proposed approach, we present experimental results using simulated and existing data.
In this paper, we showed the possibility of Bayesian inference and learning in neural networks and deep learning. Our experimental results confirmed that Bayesian neural networks can provide better performance than deep learning. In classification, deep learning has dominated other machine learning algorithms, but in the regression problems considered in this paper, the Bayesian neural network model provides results competitive with deep learning. The Bayesian neural network model is constructed by learning the network weights using the parameters of the posterior distribution, which are updated from the prior distribution and the likelihood function. However, the computing time of Bayesian neural networks is greater than that of existing neural network models.
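The idea of learning weights from the parameters of a posterior distribution can be sketched with Bayesian linear regression under a conjugate Gaussian prior, a simplified stand-in for the Bayesian neural network weight update described above. The function name, prior precision, and noise precision are illustrative assumptions.

```python
# Closed-form weight posterior for Bayesian linear regression:
#   prior:       w ~ N(0, prior_precision^-1 * I)
#   likelihood:  y ~ N(X w, noise_precision^-1 * I)
import numpy as np

def posterior_weights(X, y, prior_precision=1.0, noise_precision=25.0):
    """Return the posterior mean and covariance of the weights."""
    d = X.shape[1]
    S_inv = prior_precision * np.eye(d) + noise_precision * X.T @ X
    S = np.linalg.inv(S_inv)          # posterior covariance
    m = noise_precision * S @ X.T @ y  # posterior mean
    return m, S

# Synthetic data (illustrative): two weights, Gaussian observation noise.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
X = rng.normal(size=(200, 2))
y = X @ true_w + rng.normal(scale=0.2, size=200)

m, S = posterior_weights(X, y)  # m should lie close to true_w
```

The posterior mean plays the role of the learned weights, while the posterior covariance quantifies the remaining uncertainty; in a full Bayesian neural network this posterior is intractable and must be approximated, e.g. by sampling.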
In Bayesian neural networks, however, we have to carry out a Markov chain simulation to construct and sample from the target posterior distribution. This process requires a large amount of computation time, which grows as the data size increases. So, in future work, we will study new Bayesian learning methods that reduce the computing time of Bayesian neural networks. We will add a new distribution to Bayesian learning, in addition to the prior distribution, likelihood function, and posterior distribution, for a faster Bayesian update process. In addition, we will study a thinking machine resembling human reasoning using Bayesian learning for neural networks.
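The Markov chain simulation step mentioned above can be sketched with a minimal random-walk Metropolis sampler on a one-dimensional posterior. The standard normal target density and the proposal scale are illustrative assumptions, not the paper's actual model; they serve only to show why the simulation is costly, since many dependent samples are needed for accurate posterior summaries.

```python
# Random-walk Metropolis sampling from an unnormalized posterior density.
import math
import random

def log_posterior(theta):
    # Unnormalized log density of a standard normal target (assumption).
    return -0.5 * theta * theta

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    theta = 0.0  # start at the mode, so no burn-in is needed here
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(theta)).
        if math.log(rng.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(20000)
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
```

Even for this one-parameter toy problem, tens of thousands of density evaluations are needed; for a neural network posterior, every evaluation involves a full pass over the data, which is the source of the computing-time problem discussed above.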
[1] S. Theodoridis, Machine Learning: A Bayesian and Optimization Perspective. London, UK: Elsevier, 2015.
[2] S. Jun, “Frequentist and Bayesian Learning Approaches to Artificial Intelligence,” International Journal of Fuzzy Logic and Intelligent Systems, vol. 16, no. 2, pp. 111–118, 2016.
[3] A. Gelman, J. B. Carlin, H. S. Stern, D. B. Dunson, A. Vehtari, and D. B. Rubin, Bayesian Data Analysis, Third Edition. Boca Raton, FL: Chapman & Hall/CRC Press, 2013.
[4] K. B. Korb and A. E. Nicholson, Bayesian Artificial Intelligence, Second Edition. London, UK: CRC Press, 2011.
[5] T. M. Donovan and R. M. Mickey, Bayesian Statistics for Beginners. Oxford, UK: Oxford University Press, 2019.
[6] R. M. Neal, Bayesian Learning for Neural Networks. New York: Springer, 1996.
[7] H. M. Koduvely, Learning Bayesian Models with R. Birmingham, UK: Packt, 2015.
[8] The Boston house-price data, http://lib.stat.cmu.edu/datasets/boston, 2019.