Fisher Information and Mutual Information Constraints

02/11/2021
by Leighton Pate Barnes, et al.

We consider the processing of statistical samples X ∼ P_θ by a channel p(y|x) and characterize how the statistical information the samples carry about the parameter θ ∈ ℝ^d can scale with the mutual information or capacity of the channel. We show that if the statistical model has a sub-Gaussian score function, then the trace of the Fisher information matrix for estimating θ from Y can scale at most linearly with the mutual information between X and Y. We apply this result to obtain minimax lower bounds in distributed statistical estimation problems, and obtain a tight preconstant for Gaussian mean estimation. We then show how our Fisher information bound also implies distributed strong data processing inequalities based on mutual information or Jensen-Shannon divergence.
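Concretely, the headline result takes the following schematic form. This is a sketch reconstructed from the abstract, not the paper's exact statement; the constant C(δ) is a placeholder that depends on the sub-Gaussian parameter δ of the score:

```latex
% Schematic form of the main bound. C(\delta) is a placeholder,
% not the paper's exact preconstant.
\[
  S_\theta(X) = \nabla_\theta \log f(X;\theta)
  \ \text{is } \delta^2\text{-sub-Gaussian}
  \quad\Longrightarrow\quad
  \operatorname{Tr}\!\big(I_Y(\theta)\big) \;\le\; C(\delta)\, I(X;Y),
\]
% where I_Y(\theta) is the Fisher information matrix for estimating
% \theta from the channel output Y, and I(X;Y) is the mutual
% information between the sample X and the output Y.
```

As a quick numerical illustration (our own construction, not an example from the paper), take Gaussian mean estimation X ∼ N(θ, 1), whose score X − θ is 1-sub-Gaussian, passed through the one-bit channel Y = sign(X). Both the Fisher information of Y and the mutual information I(X;Y) have closed forms, so the at-most-linear relationship can be checked directly:

```python
import numpy as np
from scipy.stats import norm

def fisher_info_sign(theta):
    """Fisher information of Y = sign(X) for X ~ N(theta, 1).

    Y is Bernoulli with P(Y = 1) = Phi(theta), giving
    I_Y(theta) = phi(theta)^2 / (Phi(theta) * (1 - Phi(theta))).
    """
    p = norm.cdf(theta)
    return norm.pdf(theta) ** 2 / (p * (1.0 - p))

def mutual_info_sign(theta):
    """I(X; Y) in nats; Y is a deterministic function of X, so I(X;Y) = H(Y)."""
    p = norm.cdf(theta)
    return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))

theta = 0.0
fi = fisher_info_sign(theta)   # 2/pi  ~= 0.637
mi = mutual_info_sign(theta)   # ln 2  ~= 0.693
print(f"Tr(I_Y) = {fi:.4f}, I(X;Y) = {mi:.4f}, ratio = {fi / mi:.4f}")
```

At θ = 0 the ratio Tr(I_Y)/I(X;Y) is 2/(π ln 2) ≈ 0.918, consistent with the Fisher information scaling at most linearly in the mutual information across the channel.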
