Violating the Ingleton inequality with finite groups

It is well known that there is a one-to-one correspondence between the entropy vector of a collection of n random variables and a certain group-characterizable vector obtained from a finite group and n of its subgroups [14]. However, if one restricts attention to abelian groups, then not all entropy vectors can be obtained. This explains the fact, shown by Dougherty et al. [11], that linear network codes cannot achieve capacity in general network coding problems (since linear network codes form an abelian group). All abelian group-characterizable vectors, and consequently all entropy vectors generated by linear network codes, satisfy a linear inequality called the Ingleton inequality. In this paper, we study the problem of finding non-abelian finite groups that yield group-characterizable vectors which violate the Ingleton inequality. Using a refined computer search, we find the symmetric group S5 to be the smallest group that violates the Ingleton inequality. Careful study of the structure of this group and its subgroups reveals that it belongs to the Ingleton-violating family PGL(2, p) with primes p ≥ 5, i.e., the projective general linear group of 2×2 nonsingular matrices with entries in F_p, taken modulo scalar multiples of the identity. This family of groups is therefore a good candidate for constructing network codes more powerful than linear network codes.
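
To make the central objects concrete, here is a minimal Python sketch, not the authors' search code, of the group-characterizable vector h_A = log2(|G| / |intersection of G_i, i in A|) built from a finite group G and four of its subgroups, together with a check of the Ingleton inequality h_12 + h_13 + h_14 + h_23 + h_24 >= h_1 + h_2 + h_34 + h_123 + h_124. The helper names (generate_group, char_vector, ingleton_holds) and the small S3 demo are illustrative choices and assumptions of this sketch, not part of the paper.

```python
from itertools import product
from math import log2

# Permutations on {0, ..., k-1} are stored as tuples: p[i] is the image of i.
def compose(p, q):
    """Return the permutation p∘q, i.e. apply q first and then p."""
    return tuple(p[q[i]] for i in range(len(q)))

def generate_group(gens, identity):
    """Naive closure of a generating set under composition (fine for small groups)."""
    elements = {identity}
    frontier = [identity]
    while frontier:
        g = frontier.pop()
        for s in gens:
            h = compose(g, s)
            if h not in elements:
                elements.add(h)
                frontier.append(h)
    return frozenset(elements)

def char_vector(G, subgroups):
    """Group-characterizable vector for len(subgroups) variables, indexed by bitmask A:
    h_A = log2(|G| / |intersection of G_i for i in A|)."""
    n = len(subgroups)
    h = {}
    for mask in range(1, 2 ** n):
        inter = set(G)
        for i in range(n):
            if mask & (1 << i):
                inter &= subgroups[i]
        h[mask] = log2(len(G) / len(inter))
    return h

def ingleton_holds(h, tol=1e-9):
    """Ingleton inequality for 4 variables (bits 1, 2, 4, 8 stand for variables 1..4):
    h12 + h13 + h14 + h23 + h24 >= h1 + h2 + h34 + h123 + h124."""
    lhs = h[1 | 2] + h[1 | 4] + h[1 | 8] + h[2 | 4] + h[2 | 8]
    rhs = h[1] + h[2] + h[4 | 8] + h[1 | 2 | 4] + h[1 | 2 | 8]
    return lhs >= rhs - tol

# Tiny demo: S3 (order 6) and every subgroup generated by a pair of its elements.
# Every 4-tuple of subgroups should satisfy Ingleton, consistent with the paper's
# finding that no group smaller than S5 (order 120) produces a violation.
identity = (0, 1, 2)
S3 = generate_group([(1, 0, 2), (1, 2, 0)], identity)  # a transposition and a 3-cycle
subgroups = {generate_group([a, b], identity) for a, b in product(S3, repeat=2)}
violations = sum(
    not ingleton_holds(char_vector(S3, quad))
    for quad in product(subgroups, repeat=4)
)
print(f"{len(subgroups)} subgroups of S3, {violations} Ingleton violations")
```

Replacing S3 with S5 (generated, for example, by a transposition and a 5-cycle) and enumerating its subgroups would turn this sketch into a brute-force version of the search described above; the paper's refined search is presumably far more economical than such exhaustive enumeration of subgroup 4-tuples.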

[1] Hua Li et al., On Connections between Group Homomorphisms and the Ingleton Inequality, 2007 IEEE International Symposium on Information Theory.

[2] Milan Studený et al., Conditional Independences among Four Random Variables I, Combinatorics, Probability and Computing, 1995.

[3] František Matúš et al., Conditional Independences among Four Random Variables II, Combinatorics, Probability and Computing, 1995.

[4] Terence Chan et al., Group characterizable entropy functions, 2007 IEEE International Symposium on Information Theory.

[5] F. Matúš et al., Conditional Independences among Four Random Variables III: Final Conclusion, 1999.

[6] Zhen Zhang et al., The Capacity Region for Multi-source Multi-sink Network Coding, 2007 IEEE International Symposium on Information Theory.

[7] M. Lunelli et al., Representation of matroids, arXiv:math/0202294, 2002.

[8] Nikolai K. Vereshchagin et al., Inequalities for Shannon Entropy and Kolmogorov Complexity, Journal of Computer and System Sciences, 1997.

[9] Ho-Leung Chan et al., A combinatorial approach to information inequalities, 1999 Information Theory and Networking Workshop.

[10] D. L. Johnson, Presentations of Groups, 1976.

[11] Randall Dougherty et al., Insufficiency of linear coding in network information flow, IEEE Transactions on Information Theory, 2005.

[12] Randall Dougherty et al., Networks, Matroids, and Non-Shannon Information Inequalities, IEEE Transactions on Information Theory, 2007.

[13] Babak Hassibi et al., Normalized Entropy Vectors, Network Information Theory and Convex Optimization, 2007 IEEE Information Theory Workshop on Information Theory for Wireless Networks.

[14] Raymond W. Yeung et al., On a relation between information inequalities and group theory, IEEE Transactions on Information Theory, 2002.

[15] Zhen Zhang et al., On Characterization of Entropy Function via Information Inequalities, IEEE Transactions on Information Theory, 1998.

[16] B. Hassibi et al., Cayley's hyperdeterminant, the principal minors of a symmetric matrix and the entropy region of 4 Gaussian random variables, 2008 46th Annual Allerton Conference on Communication, Control, and Computing.