Affiliation:
1. Center for Data Science, Waseda University, Tokyo 169-8050, Japan
2. Department of Computer and Network Engineering, The University of Electro-Communications, Tokyo 182-8585, Japan
Abstract
Two typical fixed-length random number generation problems in information theory are considered for general sources. One is the source resolvability problem and the other is the intrinsic randomness problem. In each of these problems, the optimum achievable rate with respect to a given approximation measure is a central quantity of interest and has been characterized using two different information quantities: the information spectrum and the smooth Rényi entropy. Recently, optimum achievable rates with respect to f-divergences have been characterized using the information spectrum quantity. The f-divergence is a general non-negative measure of the discrepancy between two probability distributions, defined on the basis of a convex function f. The class of f-divergences includes several important measures such as the variational distance, the KL divergence, and the Hellinger distance. Hence, it is meaningful to consider the random number generation problems with respect to f-divergences. However, the optimum achievable rates with respect to f-divergences have not yet been characterized in terms of the smooth Rényi entropy for either problem. In this paper, we analyze these optimum achievable rates using the smooth Rényi entropy and extend the class of f-divergences considered. To do so, we first derive general formulas for the first-order optimum achievable rates with respect to f-divergences in both problems under the same conditions as those imposed by previous studies. Next, we relax the conditions on the f-divergence and generalize the obtained formulas. We then particularize our general formulas to several specific functions f, showing that optimum achievable rates for several important measures follow easily from them. Furthermore, a kind of duality between resolvability and intrinsic randomness is revealed in terms of the smooth Rényi entropy. Second-order optimum achievable rates and optimistic achievable rates are also investigated.
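For reference, the f-divergence mentioned above is conventionally defined as follows; this is stated under one common normalization as a reminder of notation, and the paper's own conventions may differ by constant factors.

\[
  D_f(P \,\|\, Q) \;=\; \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right),
  \qquad f \ \text{convex on } (0,\infty),\ \ f(1) = 0 .
\]

For example, the choices
\[
  f(t) = |t-1|, \qquad f(t) = t \log t, \qquad f(t) = \bigl(\sqrt{t}-1\bigr)^{2}
\]
yield, respectively, the variational distance \(\sum_{x} |P(x)-Q(x)|\), the KL divergence \(D(P\|Q)\), and the squared Hellinger distance (up to a factor of \(1/2\) in some conventions).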
Funder
JSPS KAKENHI
Kayamori Foundation of Informational Science Advancement