The Compound Information Bottleneck Outlook

05/09/2022 · by Michael Dikshtein, et al.
We formulate and analyze the compound information bottleneck program. In this problem, a Markov chain X → Y → Z is assumed, with fixed marginal distributions P_X and P_Y, and the mutual information between X and Z is maximized over the choice of the conditional probability of Z given Y from a given class, under the worst-case choice of the joint probability of the pair (X, Y) from a different class. We consider several classes, based on extremal constraints on mutual information, minimal correlation, total variation, and relative entropy. We provide values, bounds, and various characterizations for specific instances of this problem: the binary symmetric case, the scalar Gaussian case, the vector Gaussian case, and the symmetric modulo-additive case. Finally, for the general case, we propose a Blahut–Arimoto-type alternating-iterations algorithm to find a consistent solution to this problem.
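
Read as an optimization problem, the abstract describes a max-min program. One way to write it, with placeholder class names \mathcal{A} and \mathcal{B} standing in for the classes listed above (the exact constraint sets are not specified in the abstract), is:

```latex
\max_{P_{Z|Y} \in \mathcal{A}} \;\; \min_{P_{XY} \in \mathcal{B}} \;\; I(X; Z),
\qquad X \to Y \to Z,
\qquad P_{XZ}(x, z) = \sum_{y} P_{XY}(x, y)\, P_{Z|Y}(z \mid y),
```

with both classes understood to be consistent with the fixed marginals P_X and P_Y.

The sketch below illustrates the kind of Blahut-Arimoto-style alternating scheme the abstract alludes to; it is not the paper's actual algorithm. For this example only, the uncertainty class of joint distributions is approximated by a finite candidate set, and the encoder step uses the standard information-bottleneck self-consistent update with a trade-off parameter beta.

```python
# Illustrative sketch of an alternating max-min scheme in the spirit of the
# Blahut-Arimoto-type algorithm mentioned in the abstract. The class of joint
# distributions is approximated here by a finite candidate set, and the encoder
# update uses the standard information-bottleneck self-consistent equations;
# neither choice is taken from the paper itself.
import numpy as np

def mutual_information(p_joint):
    """Mutual information (in nats) of a joint distribution given as a 2-D array."""
    p_row = p_joint.sum(axis=1, keepdims=True)
    p_col = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p_row @ p_col)[mask])))

def joint_xz(p_xy, p_z_given_y):
    """P_XZ(x,z) = sum_y P_XY(x,y) P_{Z|Y}(z|y), using the Markov chain X - Y - Z."""
    return p_xy @ p_z_given_y                       # shape (|X|, |Z|)

def ib_update(p_xy, p_z_given_y, beta):
    """One self-consistent IB-style update of P_{Z|Y} for a fixed joint P_XY."""
    p_y = p_xy.sum(axis=0)                          # (|Y|,)
    p_x_given_y = p_xy / np.maximum(p_y, 1e-12)     # (|X|, |Y|), columns sum to 1
    p_z = p_y @ p_z_given_y                         # (|Z|,)
    p_yz = p_y[:, None] * p_z_given_y               # (|Y|, |Z|)
    p_x_given_z = (p_x_given_y @ p_yz) / np.maximum(p_z, 1e-12)   # (|X|, |Z|)
    # KL(p(x|y) || p(x|z)) for every (y, z) pair.
    kl = np.zeros((p_xy.shape[1], p_z.size))
    for y in range(p_xy.shape[1]):
        for z in range(p_z.size):
            px_y = p_x_given_y[:, y]
            px_z = np.maximum(p_x_given_z[:, z], 1e-12)
            kl[y, z] = np.sum(px_y * np.log(np.maximum(px_y, 1e-12) / px_z))
    new = p_z[None, :] * np.exp(-beta * kl)
    return new / new.sum(axis=1, keepdims=True)

def compound_ib(candidate_joints, n_z, beta=5.0, iters=200, seed=0):
    """Alternate an encoder update against the worst joint in the candidate set."""
    rng = np.random.default_rng(seed)
    n_y = candidate_joints[0].shape[1]
    p_z_given_y = rng.dirichlet(np.ones(n_z), size=n_y)   # random initial encoder
    for _ in range(iters):
        # Worst-case (minimizing) joint for the current encoder.
        p_xy = min(candidate_joints,
                   key=lambda q: mutual_information(joint_xz(q, p_z_given_y)))
        # Encoder step for that joint.
        p_z_given_y = ib_update(p_xy, p_z_given_y, beta)
    worst = min(mutual_information(joint_xz(q, p_z_given_y)) for q in candidate_joints)
    return p_z_given_y, worst
```

As a usage example, candidate_joints could hold a few binary-symmetric joints with different crossover probabilities, loosely mimicking the binary symmetric instance mentioned above; the returned value is then the worst-case I(X;Z) achieved by the final encoder over that candidate set.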
