Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model

Bayesian networks in their Factor Graph Reduced Normal Form (FGrn) are a powerful paradigm for implementing inference graphs. Unfortunately, the computational and memory costs of these networks may be considerable even for relatively small networks, and this is one of the main reasons why these structures have often been underused in practice. In this work, through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed. An online version of the classic batch learning algorithm is also analyzed, showing results very similar to the batch version in an unsupervised context; an online scheme is essential when multilevel structures are to be built. The proposed solutions, together with the online learning algorithm, are included in a C++ library that is quite efficient, especially compared to the direct use of the well-known sum-product and Maximum Likelihood (ML) algorithms. The results are discussed with particular reference to a Latent Variable Model (LVM) structure.
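To make the setting concrete, the sketch below (not the paper's C++ library; all names, sizes, and probability values are hypothetical) illustrates sum-product inference in a single-layer LVM in reduced normal form: a hidden source S feeds visible variables X1 and X2 through SISO blocks carrying the conditional probability matrices P(Xi | S), backward messages from the evidence are pulled through those matrices, and the diverter fuses them with the prior by a normalized element-wise product.

```cpp
// Minimal sketch (assumed structure, not the paper's library API):
// sum-product inference for a single-layer Latent Variable Model in
// Factor Graph Reduced Normal Form.
#include <cstddef>
#include <iostream>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // Mat[s][x] = P(X = x | S = s)

// Backward message through a SISO block: bS(s) = sum_x P(x|s) * bX(x).
Vec backward(const Mat& P, const Vec& bX) {
    Vec bS(P.size(), 0.0);
    for (std::size_t s = 0; s < P.size(); ++s)
        for (std::size_t x = 0; x < bX.size(); ++x)
            bS[s] += P[s][x] * bX[x];
    return bS;
}

// Product rule at the diverter: normalized element-wise product of messages.
Vec normalizedProduct(const Vec& a, const Vec& b) {
    Vec out(a.size());
    double z = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) { out[i] = a[i] * b[i]; z += out[i]; }
    for (double& v : out) v /= z;
    return out;
}

int main() {
    // Hypothetical model: binary latent variable S, two binary visible variables.
    Mat P1 = {{0.9, 0.1}, {0.2, 0.8}};   // P(X1 | S)
    Mat P2 = {{0.7, 0.3}, {0.1, 0.9}};   // P(X2 | S)
    Vec prior = {0.5, 0.5};              // forward message from the source block

    Vec evidence1 = {1.0, 0.0};          // X1 observed = 0 (delta message)
    Vec evidence2 = {0.0, 1.0};          // X2 observed = 1 (delta message)

    // Fuse the prior with the backward messages to get the posterior over S.
    Vec posterior = prior;
    posterior = normalizedProduct(posterior, backward(P1, evidence1));
    posterior = normalizedProduct(posterior, backward(P2, evidence2));

    for (std::size_t s = 0; s < posterior.size(); ++s)
        std::cout << "P(S=" << s << " | X1=0, X2=1) = " << posterior[s] << "\n";
    return 0;
}
```

In the full FGrn paradigm the same message types also drive ML learning of the SISO matrices, either in batch or, as discussed in the abstract, incrementally online; the sketch only covers the inference step.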
