Asymptotic Expansions of Smooth Rényi Entropies and Their Applications

03/11/2020
by Yuta Sakai et al.

This study considers the unconditional smooth Rényi entropy, the smooth conditional Rényi entropy proposed by Kuzuoka [IEEE Trans. Inf. Theory, vol. 66, no. 3, pp. 1674–1690, 2020], and a new quantity that we term the conditional smooth Rényi entropy. In particular, we examine asymptotic expansions of these entropies when the underlying source, together with its side information, is stationary and memoryless. Using these smooth Rényi entropies, we establish one-shot coding theorems for several information-theoretic problems: Campbell's source coding, guessing problems, and task encoding problems, all allowing errors. For each problem, we consider two error formalisms, the average and maximum error criteria, where the averaging and maximization are taken with respect to the side information of the source. Applying our asymptotic expansions to the derived one-shot coding theorems, we obtain various asymptotic fundamental limits for these problems when their error probabilities are allowed to be non-vanishing. We show that, in non-degenerate settings, the first-order fundamental limits differ under the average and maximum error criteria. This contrasts with a different but related setting considered by the present authors (variable-length conditional source coding allowing errors), in which the first-order terms are identical but the second-order terms differ under these two criteria.
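For readers' reference, the following is a minimal sketch of the quantities involved, assuming the common smoothing convention of Renner and Wolf; the paper's exact definitions (in particular Kuzuoka's smooth conditional Rényi entropy and the new conditional smooth Rényi entropy) may differ in the choice of distance measure and smoothing set. The Rényi entropy of order α and its ε-smoothed variant can be written as

\[
H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_{x} P_X(x)^{\alpha},
\qquad \alpha \in (0,1) \cup (1,\infty),
\]
\[
H_\alpha^{\varepsilon}(X) = \frac{1}{1-\alpha} \log \inf_{Q \in B^{\varepsilon}(P_X)} \sum_{x} Q(x)^{\alpha},
\]

where \(B^{\varepsilon}(P_X)\) denotes a ball of distributions within statistical distance ε of \(P_X\). As one illustration of the link to coding, Campbell's classical result states that the optimal normalized exponential moment of prefix-free codeword lengths,

\[
\min_{\ell} \; \frac{1}{t} \log_2 \mathbb{E}\!\left[ 2^{\,t\,\ell(X)} \right],
\]

lies within one bit of the Rényi entropy \(H_{1/(1+t)}(X)\); the paper's one-shot theorems extend characterizations of this type to the smoothed, conditional, and error-tolerant settings.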
