Affiliation:
1. CentraleSupélec Rennes, Rennes, France
2. AIST, Tsukuba, Japan
3. Inria, Rennes, France
Abstract
Analysis of a probabilistic system often requires learning the joint probability distribution of its random variables. Computing the exact distribution usually amounts to an exhaustive precise analysis of all executions of the system. To avoid the high computational cost of such an exhaustive search, statistical analysis has been studied to efficiently obtain approximate estimates by analyzing only a small but representative subset of the system's behavior. In this paper we propose a hybrid statistical estimation method that combines precise and statistical analyses to estimate mutual information, Shannon entropy, and conditional entropy, together with their confidence intervals. We show how to combine analyses of different accuracy on different components of a discrete system to obtain an estimate for the whole system. The new method performs weighted statistical analysis with different sample sizes over different components and dynamically finds their optimal sample sizes. Moreover, it can reduce sample sizes by using prior knowledge about systems and a new abstraction-then-sampling technique based on qualitative analysis. To apply the method to the source code of a system, we show how to decompose the code into components and how to determine the analysis method for each component, reviewing the implementation of these techniques in the HyLeak tool. We demonstrate with case studies that the new method outperforms the state of the art in quantifying information leakage.
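To illustrate the idea of mixing precise and statistical analyses over components, here is a minimal sketch (not the paper's algorithm): one component's output distribution is computed exactly, another's is estimated from samples, and the two are mixed by component weights before taking a plug-in Shannon entropy. The names (`exact_distribution`, `sampled_distribution`, `hybrid_entropy`) and the plain plug-in estimator are illustrative assumptions; the paper additionally corrects estimation bias and derives confidence intervals.

```python
# Sketch only: hybrid estimation of Shannon entropy over two components.
from collections import Counter
import math
import random

def exact_distribution(outcomes_with_probs):
    """Precise analysis: the component's output distribution is known exactly."""
    return dict(outcomes_with_probs)

def sampled_distribution(run_component, n_samples):
    """Statistical analysis: estimate the distribution from n_samples executions."""
    counts = Counter(run_component() for _ in range(n_samples))
    return {outcome: c / n_samples for outcome, c in counts.items()}

def hybrid_entropy(components):
    """components: list of (weight, distribution) pairs with weights summing to 1.
    Mixes the per-component distributions and returns the plug-in Shannon entropy."""
    joint = Counter()
    for weight, dist in components:
        for outcome, p in dist.items():
            joint[outcome] += weight * p
    return -sum(p * math.log2(p) for p in joint.values() if p > 0)

# Example: a rarely executed component analysed precisely, a common one sampled.
precise = exact_distribution({"a": 0.5, "b": 0.5})
sampled = sampled_distribution(lambda: random.choice("abcd"), n_samples=10_000)
print(hybrid_entropy([(0.1, precise), (0.9, sampled)]))
```

In this toy setup the component weights play the role of the known probabilities of entering each component; the paper's method goes further by allocating different sample sizes to the sampled components to minimize estimation variance.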
Funder
Japan Society for the Promotion of Science
JSPS & Inria
Publisher
Association for Computing Machinery (ACM)
Subject
Theoretical Computer Science, Software
Cited by
6 articles.