Measures of Information Leakage for Incomplete Statistical Information: Application to a Binary Privacy Mechanism

Authors:

Shahnewaz Karim Sakib1, George T. Amariucai2, Yong Guan1

Affiliations:

1. Iowa State University, USA

2. Kansas State University, USA

Abstract

Information leakage is usually defined as the logarithmic increase in the adversary's probability of correctly guessing the legitimate user's private data, or some arbitrary function of the private data, when presented with the legitimate user's publicly disclosed information. However, this definition of information leakage implicitly assumes that both the privacy mechanism and the prior probability of the original data are entirely known to the attacker. In reality, the assumption that the attacker has complete knowledge of the privacy mechanism is often impractical. The attacker can usually access only an approximate version of the correct privacy mechanism, computed from a limited set of the disclosed data for which the corresponding undistorted data are also available. In this scenario, the conventional definition of leakage no longer has an operational meaning. To address this problem, in this article we propose novel, meaningful information-theoretic metrics for information leakage when the attacker has incomplete information about the privacy mechanism; we call them average subjective leakage, average confidence boost, and average objective leakage. For the simplest, binary scenario, we demonstrate how to find an optimized privacy mechanism that minimizes the worst-case value of any of these leakages.
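To illustrate the gap the abstract describes, the sketch below (in Python, with illustrative names p, W, and W_hat that are not taken from the article) contrasts, for a binary secret, the guessing success probability the attacker believes it achieves when it guesses with an estimated mechanism against the success probability it actually achieves under the true mechanism. The article's precise definitions of average subjective leakage, average confidence boost, and average objective leakage are not reproduced here; this is only a rough approximation of the underlying idea, under assumed definitions.

import numpy as np

def prior_vulnerability(p):
    # Adversary's best probability of guessing the secret before seeing any output.
    return float(np.max(p))

def believed_and_actual_success(p, W, W_hat):
    # p     : prior on the binary secret X, e.g. [P(X=0), P(X=1)]
    # W     : true privacy mechanism, W[x, y] = P(Y=y | X=x)
    # W_hat : attacker's estimate of the mechanism, learned from limited samples
    p = np.asarray(p, float)
    W = np.asarray(W, float)
    W_hat = np.asarray(W_hat, float)
    believed = actual = 0.0
    for y in range(W.shape[1]):
        joint_hat = p * W_hat[:, y]          # attacker's estimated joint P(X=x, Y=y)
        guess = int(np.argmax(joint_hat))    # guess chosen using the estimated mechanism
        believed += joint_hat[guess]         # success probability the attacker believes in
        actual += p[guess] * W[guess, y]     # success probability under the true mechanism
    return believed, actual

p = [0.6, 0.4]                    # prior on the binary secret (illustrative)
W = [[0.8, 0.2], [0.3, 0.7]]      # true binary mechanism (illustrative)
W_hat = [[0.7, 0.3], [0.4, 0.6]]  # attacker's estimate from limited paired samples
v0 = prior_vulnerability(p)
believed, actual = believed_and_actual_success(p, W, W_hat)
print("leakage the attacker believes it obtains:", np.log2(believed / v0))
print("leakage the attacker actually obtains:   ", np.log2(actual / v0))

When W_hat equals W, the two quantities coincide and reduce to the conventional log-ratio leakage; their divergence when W_hat is only an estimate is, loosely, what the article's subjective/objective distinction is meant to capture.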

Funder

NIST CSAFE

NSF

NPRP

Qatar National Research Fund

Publisher

Association for Computing Machinery (ACM)

Subject

Safety, Risk, Reliability and Quality; General Computer Science

