A Marginal Characterization of Entropy Functions for Conditional Mutually Independent Random Variables (with Application to Wyner's Common Information)
Conference Paper
Overview
abstract
© 2015 IEEE. We prove that by imposing a conditional mutual independence constraint and a marginalisation constraint, the almost entropic region can be completely characterised by Shannon-type information inequalities. This property is applied to obtain an explicit lower bound on the generalised Wyner common information.
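For reference, the quantity bounded in the paper is the generalised Wyner common information. The definition below is the standard one from the literature (it is not stated in this abstract) and assumes jointly distributed random variables $X_1, \dots, X_n$:

\[
  C(X_1; \dots; X_n) \;=\; \min_{\substack{W \,:\, X_1, \dots, X_n \text{ conditionally} \\ \text{independent given } W}} I(X_1, \dots, X_n; W),
\]

where the minimisation is over auxiliary random variables $W$ that render $X_1, \dots, X_n$ conditionally mutually independent, matching the conditional mutual independence constraint mentioned in the abstract. For $n = 2$ this reduces to Wyner's original definition $C(X_1; X_2) = \min_{X_1 - W - X_2} I(X_1, X_2; W)$.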
name of conference
2015 IEEE International Symposium on Information Theory (ISIT)