Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality
Abstract
The entropy power inequality (EPI) and the Brascamp-Lieb inequality (BLI) are fundamental inequalities concerning the differential entropies of linear transformations of random vectors. The EPI provides lower bounds on the differential entropy of linear transformations of random vectors with independent components. The BLI, on the other hand, provides upper bounds on the differential entropy of a random vector in terms of the differential entropies of some of its linear transformations. In this paper, we define a family of entropy functionals, which we show are subadditive. We then establish that Gaussians are extremal for these functionals by adapting the approach of Geng and Nair (2014). As a consequence, we obtain a new entropy inequality that generalizes both the BLI and the EPI. By considering a variety of independence relations among the components of the random vectors appearing in these functionals, we also obtain families of inequalities that lie between the EPI and the BLI.
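For concreteness, a hedged sketch of the two classical inequalities being unified, in standard entropic notation (the symbols below are illustrative and not taken from the paper): for independent random vectors X, Y on R^n with densities, the EPI gives a lower bound on h(X+Y), while an entropic formulation of the BLI, for linear maps B_i from R^n to R^{n_i} and nonnegative weights c_i, gives an upper bound on h(X):

```latex
% EPI: entropy powers are superadditive under independent addition
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n}
\qquad \text{(EPI)}

% Entropic BLI: h(X) is bounded above by weighted entropies
% of its linear images, up to an additive constant D that
% depends only on the datum (B_1,\dots,B_m,\, c_1,\dots,c_m)
h(X) \;\le\; \sum_{i=1}^{m} c_i\, h(B_i X) \;+\; D
\qquad \text{(entropic BLI)}
```

In both cases Gaussian distributions are the extremizers, which is the shared structure the paper's family of subadditive entropy functionals is designed to exploit.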
Publication: arXiv e-prints
Pub Date: January 2019
arXiv: arXiv:1901.06619
Bibcode: 2019arXiv190106619A
Keywords:
 Computer Science - Information Theory;
 Mathematics - Probability;
 94A17
E-Print: 38 pages, 1 figure. Submitted to the IEEE Transactions on Information Theory for possible publication