Abstract
One recent notion of privacy protection is Differential Privacy (DP), with potential applications in several personal data protection settings. DP acts as an intermediate layer between a private dataset and data analysts, introducing privacy by injecting noise into the results of queries. Key to DP is the role of \( \varepsilon \), a parameter that controls the magnitude of the injected noise and, therefore, the trade-off between utility and privacy. Choosing a proper \( \varepsilon \) value is a key challenge and a non-trivial task, as there is no straightforward way to assess the level of privacy loss associated with a given \( \varepsilon \) value. In this study, we measure the privacy loss imposed by a given \( \varepsilon \) through an adversarial model that exploits auxiliary information. We define the adversarial model based on a differencing attack and the privacy loss based on the success probability of such an attack. We then bound the probability of a successful differencing attack by tuning \( \varepsilon \). The result is an approach for setting \( \varepsilon \) based on the probability of a successful differencing attack and, hence, of a privacy leak. Our evaluation finds that setting \( \varepsilon \) following some of the approaches presented in related work does not offer adequate protection against the adversarial model introduced in this paper. Furthermore, our analysis shows that the \( \varepsilon \) selected by our proposed approach provides privacy protection against both the adversarial model in this paper and the adversarial models in the related work.
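To make the role of \( \varepsilon \) concrete, the following is a minimal sketch of the Laplace mechanism for a count query with sensitivity 1. It is an illustration only; the function name and parameters are ours, not the paper's implementation.

import numpy as np

def laplace_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    # Laplace mechanism: noise scale = sensitivity / epsilon, so a
    # smaller epsilon injects more noise (stronger privacy, lower utility).
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# The same query answered under two budgets: epsilon = 0.1 hides the
# true count far more effectively than epsilon = 10.
print(laplace_count(42, epsilon=0.1))
print(laplace_count(42, epsilon=10.0))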
Notes
- 1.
One may consider querying directly for Alice's record as
SELECT COUNT(HashID) FROM dataset
WHERE Gender = 'female' AND CHANNEL_1 = 1 AND
Language = 'English' AND Location = 'Austria'.
A protected dataset may, however, not allow querying over a subset smaller than some threshold. As a result, most differencing attack scenarios in the literature only consider two queries (see the sketch after these notes).
- 2.
Many users tend to select the year 1900 as their birth year. Whilst this is unlikely to be the actual birth year of a user, we assume that this entry is still worth hiding, as it exposes a habit of the target person.
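The two-query differencing attack sketched in Note 1 can be illustrated as follows, assuming counts are released through the Laplace mechanism with sensitivity 1. The names noisy_count and differencing_attack are hypothetical and do not come from the paper.

import numpy as np

def noisy_count(count: int, epsilon: float) -> float:
    # Laplace mechanism for a count query with sensitivity 1.
    return count + np.random.laplace(scale=1.0 / epsilon)

def differencing_attack(count_with_target: int, count_without_target: int,
                        epsilon: float) -> bool:
    # Query 1: count over the whole dataset (includes the target's record).
    # Query 2: the same count with a predicate that excludes the target.
    # The true counts differ by exactly 1 when the target is present, so
    # the adversary guesses "present" if the noisy difference is >= 0.5.
    diff = noisy_count(count_with_target, epsilon) - noisy_count(count_without_target, epsilon)
    return diff >= 0.5

# Empirical success probability when the target is present: a larger
# epsilon lets the attack succeed more often.
for eps in (0.1, 1.0, 10.0):
    hits = sum(differencing_attack(101, 100, eps) for _ in range(10_000))
    print(eps, hits / 10_000)

Bounding this success probability from above is, in essence, how the approach in this paper turns an acceptable attack success rate into an upper bound on \( \varepsilon \).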
Acknowledgements
We thank the Swiss National Science Foundation for their partial support under contract number #407550_167177.
Copyright information
© 2021 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
About this paper
Cite this paper
Ashena, N., Dell’Aglio, D., Bernstein, A. (2021). Understanding \( \varepsilon \) for Differential Privacy in Differencing Attack Scenarios. In: Garcia-Alfaro, J., Li, S., Poovendran, R., Debar, H., Yung, M. (eds) Security and Privacy in Communication Networks. SecureComm 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 398. Springer, Cham. https://doi.org/10.1007/978-3-030-90019-9_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-90018-2
Online ISBN: 978-3-030-90019-9