On Ingleton-Violating Finite Groups

Given $n$ discrete random variables, their entropy vector is the $(2^{n}-1)$-dimensional vector obtained from the joint entropies of all non-empty subsets of the random variables. It is well known that there is a close relation between such an entropy vector and a certain group-characterizable vector obtained from a finite group and $n$ of its subgroups; indeed, roughly speaking, knowing the region of all such group-characterizable vectors is equivalent to knowing the region of all entropy vectors. This correspondence may be useful for characterizing the space of entropic vectors and for designing network codes. If one restricts attention to abelian groups, then not all entropy vectors can be obtained. This explains the result of Dougherty et al. that linear network codes cannot achieve capacity in general network coding problems (since linear network codes come from abelian groups). All abelian group-characterizable vectors, and hence all entropy vectors generated by linear network codes, satisfy a linear inequality called the Ingleton inequality. General entropy vectors, however, do not necessarily satisfy it. It is therefore of interest to identify groups that violate the Ingleton inequality. In this paper, we study the problem of finding nonabelian finite groups that yield group-characterizable vectors violating the Ingleton inequality. Using a refined computer search, we find the symmetric group $S_{5}$ to be the smallest group that violates the Ingleton inequality. Careful study of the structure of this group and its subgroups reveals that it belongs to the Ingleton-violating family $PGL(2,q)$ with prime power $q \geq 5$, i.e., the projective general linear group of $2\times 2$ nonsingular matrices with entries in $\mathbb{F}_{q}$, taken modulo scalar matrices. We further interpret this family of groups, and their subgroups, using the theory of group actions, and identify the subgroups as certain stabilizers. We also extend the construction to more general groups such as $PGL(n,q)$ and $GL(n,q)$. The families of groups identified here are therefore good candidates for constructing network codes more powerful than linear network codes, and we discuss some considerations for constructing such group network codes.
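
To make the correspondence concrete, the following is a minimal sketch (in Python, not from the paper) of how a group-characterizable vector is computed from a finite group and $n$ of its subgroups, and how the four-variable Ingleton inequality is tested on it. The helper names (`compose`, `closure`, `group_entropy_vector`, `ingleton_gap`) and the Klein four-group sanity check are illustrative assumptions; the specific Ingleton-violating subgroups of $S_{5}$ found by the computer search are not reproduced here.

```python
# Sketch: group-characterizable entropy vectors and the four-variable Ingleton test.
# Assumptions: permutations are tuples of 0-based images, logs are base 2, and the
# four subgroups are indexed 0..3. This is an illustration, not the paper's code.
from itertools import combinations
from math import log2

def compose(p, q):
    # Composition of permutations given as tuples: (p*q)(i) = p[q[i]].
    return tuple(p[q[i]] for i in range(len(q)))

def closure(gens, degree):
    # Brute-force closure of a generating set; adequate for small groups such as S5.
    elems = {tuple(range(degree))} | set(gens)
    changed = True
    while changed:
        changed = False
        for g in list(elems):
            for h in list(elems):
                prod = compose(g, h)
                if prod not in elems:
                    elems.add(prod)
                    changed = True
    return elems

def group_entropy_vector(G, subgroups):
    # h_alpha = log|G| - log|G_alpha|, where G_alpha is the intersection of the
    # subgroups indexed by alpha; one entry per non-empty alpha (2^n - 1 in all).
    n = len(subgroups)
    h = {}
    for r in range(1, n + 1):
        for alpha in combinations(range(n), r):
            inter = set(G)
            for i in alpha:
                inter &= subgroups[i]
            h[alpha] = log2(len(G)) - log2(len(inter))
    return h

def ingleton_gap(h):
    # Ingleton inequality for variables 0,1,2,3:
    #   h01 + h02 + h03 + h12 + h13 >= h0 + h1 + h23 + h012 + h013.
    # A negative return value indicates an Ingleton violation.
    lhs = h[(0, 1)] + h[(0, 2)] + h[(0, 3)] + h[(1, 2)] + h[(1, 3)]
    rhs = h[(0,)] + h[(1,)] + h[(2, 3)] + h[(0, 1, 2)] + h[(0, 1, 3)]
    return lhs - rhs

# Sanity check with the (abelian) Klein four-group: the gap must be >= 0.
a, b, c = (1, 0, 3, 2), (2, 3, 0, 1), (3, 2, 1, 0)
G = closure([a, b], 4)
subgroups = [closure([a], 4), closure([b], 4), closure([c], 4), closure([], 4)]
print(ingleton_gap(group_entropy_vector(G, subgroups)))  # 2.0 for this example
```

Applied to $S_{5}$ (generated, for instance, by a 5-cycle and a transposition) with the four subgroups identified in the paper, the same routine would return a negative gap, certifying the Ingleton violation.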

[1] Randall Dougherty, et al., Non-Shannon Information Inequalities in Four Random Variables, 2011, arXiv.

[2] Wei Mao, Information-Theoretic Studies and Capacity Bounds: Group Network Codes and Energy Harvesting Communication Systems, 2015.

[3] Randall Dougherty, et al., Linear rank inequalities on five or more variables, 2009, arXiv.

[4] Randall Dougherty, Computations of linear rank inequalities on six variables, 2014 IEEE International Symposium on Information Theory.

[5] Zhen Zhang, et al., On Characterization of Entropy Function via Information Inequalities, 1998, IEEE Trans. Inf. Theory.

[6] H. O. Foulkes, Abstract Algebra, 1967, Nature.

[7] Zhen Zhang, et al., The Capacity Region for Multi-source Multi-sink Network Coding, 2007 IEEE International Symposium on Information Theory.

[8] Nigel Boston, et al., Large violations of the Ingleton inequality, 2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[9] S. Shadbakht, Entropy Region and Network Information Theory, 2011.

[10] Guido Zappa, et al., Partitions and other coverings of finite groups, 2003.

[11] F. Matúš, et al., Conditional Independences among Four Random Variables III: Final Conclusion, 1999.

[12] Babak Hassibi, et al., Normalized Entropy Vectors, Network Information Theory and Convex Optimization, 2007 IEEE Information Theory Workshop on Information Theory for Wireless Networks.

[13] Ho-Leung Chan, et al., A combinatorial approach to information inequalities, 1999 Information Theory and Networking Workshop (Cat. No.99EX371).

[14] Raymond W. Yeung, et al., On a relation between information inequalities and group theory, 2002, IEEE Trans. Inf. Theory.

[15] Alex J. Grant, et al., Truncation Technique for Characterizing Linear Polymatroids, 2011, IEEE Transactions on Information Theory.

[16] D. L. Johnson, Presentations of groups, 1976.

[17] Babak Hassibi, et al., Violating the Ingleton inequality with finite groups, 2009 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[18] Pirita Paajanen, Finite p-Groups, Entropy Vectors, and the Ingleton Inequality for Nilpotent Groups, 2014, IEEE Transactions on Information Theory.

[19] Terence Chan, et al., Group characterizable entropy functions, 2007 IEEE International Symposium on Information Theory.

[20] Terence Chan, On the optimality of group network codes, 2005, Proceedings of the International Symposium on Information Theory (ISIT 2005).

[21] Randall Dougherty, et al., Characteristic-dependent linear rank inequalities and network coding applications, 2014 IEEE International Symposium on Information Theory.

[22] Ryan Kinser, et al., New inequalities for subspace arrangements, 2009, J. Comb. Theory, Ser. A.

[23] Frédérique E. Oggier, et al., Groups and information inequalities in 5 variables, 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[24] Hua Li, et al., On Connections between Group Homomorphisms and the Ingleton Inequality, 2007 IEEE International Symposium on Information Theory.

[25] T. Chan, et al., Capacity regions for linear and abelian network codes, 2007 Information Theory and Applications Workshop.

[26] Michael Aschbacher, Finite Group Theory, 1994.

[27] Randall Dougherty, et al., Insufficiency of linear coding in network information flow, 2005, IEEE Transactions on Information Theory.

[28] Zhen Zhang, et al., A non-Shannon-type conditional inequality of information quantities, 1997, IEEE Trans. Inf. Theory.

[29] Babak Hassibi, et al., On group network codes: Ingleton-bound violations and independent sources, 2010 IEEE International Symposium on Information Theory.

[30] M. Lunelli, et al., Representation of matroids, 2002, math/0202294.

[31] Nikolai K. Vereshchagin, et al., Inequalities for Shannon Entropy and Kolmogorov Complexity, 1997, J. Comput. Syst. Sci.

[32] Randall Dougherty, et al., Networks, Matroids, and Non-Shannon Information Inequalities, 2007, IEEE Transactions on Information Theory.

[33] Babak Hassibi, et al., MCMC methods for entropy optimization and nonlinear network coding, 2010 IEEE International Symposium on Information Theory.

[34] B. Hassibi, et al., Cayley's hyperdeterminant, the principal minors of a symmetric matrix and the entropy region of 4 Gaussian random variables, 2008 46th Annual Allerton Conference on Communication, Control, and Computing.

[35] László Csirmaz, et al., Entropy Region and Convolution, 2016, IEEE Transactions on Information Theory.