January 2021
This paper is now forthcoming at Theoretical Economics.
Abstract
Modern information technologies make it possible to store, analyze and trade unprecedented amounts of detailed information about individuals. This has led to public discussions on whether individuals’ privacy should be better protected by restricting the amount or the precision of information that commercial institutions collect about their participants. We contribute to this discussion by proposing a Bayesian approach to measuring the loss of privacy in a mechanism. Specifically, we define the loss of privacy associated with a mechanism as the difference between the designer’s prior and posterior beliefs about an agent’s type, where this difference is calculated using Kullback-Leibler divergence, and where the change in beliefs is triggered by actions taken by the agent in the mechanism. We consider both ex-post (for every realized type, the maximal difference in beliefs cannot exceed some threshold k) and ex-ante (the expected difference in beliefs over all type realizations cannot exceed some threshold k) measures of privacy loss. Using these notions, we study the properties of optimal privacy-constrained mechanisms and the relation between welfare/profits and privacy levels.
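In symbols, under notation not taken from the abstract (a rough sketch only; the paper's formal definitions fix the exact objects and the orientation of the divergence): if μ₀ denotes the designer's prior over the agent's type space Θ, and μ_σ(θ) the posterior belief after observing the action that the mechanism prescribes to type θ, the two constraints can be written as

\[
\text{ex-post: } D_{\mathrm{KL}}\!\left(\mu_{\sigma(\theta)} \,\|\, \mu_0\right) \le k \quad \text{for every type } \theta \in \Theta,
\]
\[
\text{ex-ante: } \mathbb{E}_{\theta \sim \mu_0}\!\left[ D_{\mathrm{KL}}\!\left(\mu_{\sigma(\theta)} \,\|\, \mu_0\right) \right] \le k.
\]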