Entropic priors for short-term stochastic process classification

Lack of knowledge of the prior probabilities in Bayesian process classification from short sequences may make early inferences unstable or difficult to interpret. In some time-critical applications the use of uniform priors may be too strong an assumption, or simply unjustified. A promising approach to “objective” prior determination is the application of the principle of maximum entropy to the model. The resulting so-called entropic priors [5] are applied here to Bayesian process classification, with inferences based only on likelihood knowledge. We address the posterior consistency problem and derive a condition for ergodicity. The result is applied to the classification of Gaussian processes, and typical simulations of the classification of AR processes are included.
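The scheme described above can be sketched numerically: the likelihood of a short observed sequence under each candidate Gaussian AR model is combined with a prior proportional to the exponential of the model's entropy, one common form of entropic prior. The AR(1) parameter values and the exact prior form below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two candidate AR(1) models x[n] = a*x[n-1] + w[n], w[n] ~ N(0, s2).
# (a, s2) pairs are illustrative, not from the paper.
models = [(0.3, 1.0), (0.9, 2.0)]

def ar1_loglik(x, a, s2):
    """Exact Gaussian log-likelihood of an AR(1) sample path."""
    v0 = s2 / (1.0 - a**2)                       # stationary variance of x[0]
    ll = -0.5 * (np.log(2*np.pi*v0) + x[0]**2 / v0)
    e = x[1:] - a * x[:-1]                       # one-step innovations
    ll -= 0.5 * np.sum(np.log(2*np.pi*s2) + e**2 / s2)
    return ll

def entropy_rate(s2):
    """Differential entropy rate of a Gaussian AR process,
    0.5*log(2*pi*e*s2), with s2 the innovation variance."""
    return 0.5 * np.log(2*np.pi*np.e*s2)

# Simulate a short sequence from the second model.
N = 30
a_true, s2_true = models[1]
x = np.empty(N)
x[0] = rng.normal(0.0, np.sqrt(s2_true / (1 - a_true**2)))
for n in range(1, N):
    x[n] = a_true * x[n-1] + rng.normal(0.0, np.sqrt(s2_true))

# Assumed entropic-prior form: log P(model i) = N * entropy rate + const.
# Adding it to the log-likelihood gives the (unnormalized) log-posterior.
logpost = np.array([ar1_loglik(x, a, s2) + N * entropy_rate(s2)
                    for a, s2 in models])
post = np.exp(logpost - logpost.max())   # normalize in a stable way
post /= post.sum()
print(post)                              # posterior over the two models
```

With equal innovation variances the entropic prior reduces to the uniform prior, so the sketch uses unequal variances to make the prior term visible; for short N the prior noticeably shifts the posterior, which is the regime the abstract targets.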

[1] David G. Stork, et al. Pattern Classification, 1973.

[2] J. Bernardo. Reference Posterior Distributions for Bayesian Inference, 1979.

[3] H. Raiffa, et al. Applied Statistical Decision Theory, 1961.

[4] Philippe Smets, et al. Decision making in the TBM: the necessity of the pignistic transformation, 2005, Int. J. Approx. Reason.

[5] Ariel Caticha, et al. Entropic Inference, 2010, arXiv:1011.0723.

[6] H. Jeffreys. An invariant form for the prior probability in estimation problems, 1946, Proceedings of the Royal Society of London, Series A.

[8] Philippe Smets, et al. Belief functions: The disjunctive rule of combination and the generalized Bayesian theorem, 1993, Int. J. Approx. Reason.

[9] Francesco Palmieri, et al. Objective priors from maximum entropy in data classification, 2013, Inf. Fusion.

[10] P. Smets, et al. Target classification approach based on the belief function theory, 2005, IEEE Transactions on Aerospace and Electronic Systems.

[11] E. T. Jaynes. Probability Theory: The Logic of Science, 2003.

[12] Arnold Zellner. Models, prior information, and Bayesian analysis, 1996.

[13] Ariel Caticha. Maximum entropy, fluctuations and priors, 2001.

[14] Ariel Caticha, et al. Updating Probabilities with Data and Moments, 2007, arXiv.

[15] V. Majerník. Marginal probability distribution determined by the maximum entropy method, 2000.

[16] J. Bernardo, et al. The formal definition of reference priors, 2009, arXiv:0904.0156.

[17] Francesco Palmieri, et al. Data Fusion with Entropic Priors, 2010, WIRN.

[19] R. Preuss, et al. Maximum entropy and Bayesian data analysis: Entropic prior distributions, 2003, Physical Review E.

[20] V. Novák, et al. Mathematical Principles of Fuzzy Logic, 1999.

[21] Thomas M. Cover, et al. Elements of Information Theory, 2006, Wiley Series in Telecommunications and Signal Processing.

[22] Glenn Shafer. A Mathematical Theory of Evidence, 1976.

[23] Tilman Neumann. Bayesian Inference Featuring Entropic Priors, 2007.

[25] M. G. Bellanger. Digital processing of speech signals, 1980, Proceedings of the IEEE.