
Understanding \( \varepsilon \) for Differential Privacy in Differencing Attack Scenarios

  • Conference paper
  • First Online:
Security and Privacy in Communication Networks (SecureComm 2021)

Abstract

One of the recent notions of privacy protection is Differential Privacy (DP), with potential applications in several personal data protection settings. DP acts as an intermediate layer between a private dataset and data analysts, providing privacy by injecting noise into the results of queries. Key to DP is the role of \( \varepsilon \) – a parameter that controls the magnitude of the injected noise and, therefore, the trade-off between utility and privacy. Choosing a proper \( \varepsilon \) value is a key challenge and a non-trivial task, as there is no straightforward way to assess the level of privacy loss associated with a given \( \varepsilon \) value. In this study, we measure the privacy loss imposed by a given \( \varepsilon \) through an adversarial model that exploits auxiliary information. We define the adversarial model based on a differencing attack and the privacy loss based on the success probability of such an attack. We then bound the probability of a successful differencing attack by tuning \( \varepsilon \). The result is an approach for setting \( \varepsilon \) based on the probability of a successful differencing attack and, hence, of a privacy leak. Our evaluation finds that setting \( \varepsilon \) according to some of the approaches presented in related work does not seem to offer adequate protection against the adversarial model introduced in this paper. Furthermore, our analysis shows that the \( \varepsilon \) selected by our proposed approach provides privacy protection against both the adversary model in this paper and the adversary models in related work.
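The paper's formal adversarial model and the derivation of the attack success probability appear in the body of the paper and are not reproduced on this page. Purely as an illustrative sketch of the idea in the abstract, the snippet below assumes a counting query with sensitivity 1 answered by the Laplace mechanism, an adversary who subtracts two noisy counts (one including and one excluding the target) and guesses "present" when the difference exceeds 0.5, and a Monte Carlo estimate of that adversary's success probability. The decision threshold, the bound rho, and all function names are assumptions made here for illustration, not the paper's exact construction.

    import numpy as np

    rng = np.random.default_rng(0)

    def attack_success_rate(eps, trials=100_000, sensitivity=1.0):
        # Monte Carlo estimate of a simple differencing attack's success rate.
        # Assumed setup: two COUNT queries answered with Laplace(sensitivity/eps)
        # noise, differing only in whether the target's record is included; the
        # true difference is 1 if the target is present and 0 otherwise; the
        # adversary guesses "present" when the noisy difference exceeds 0.5.
        scale = sensitivity / eps
        present = rng.integers(0, 2, size=trials)
        noisy_diff = present + rng.laplace(0, scale, trials) - rng.laplace(0, scale, trials)
        guess = (noisy_diff > 0.5).astype(int)
        return float(np.mean(guess == present))

    def largest_eps_below(rho, candidates=np.linspace(0.05, 2.0, 40)):
        # Largest candidate eps whose estimated attack success stays below rho.
        ok = [eps for eps in candidates if attack_success_rate(eps) <= rho]
        return max(ok) if ok else None

    for eps in (0.1, 0.5, 1.0):
        print(f"eps={eps:.2f}  estimated attack success = {attack_success_rate(eps):.3f}")
    print("largest eps with estimated success <= 0.6:", largest_eps_below(0.6))

With very small \( \varepsilon \) the noise dominates and the estimate stays near the 0.5 baseline of random guessing; as \( \varepsilon \) grows the attack becomes nearly certain to succeed, which is the utility-privacy trade-off the paper quantifies.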


Notes

  1. One may consider querying directly for Alice’s record as

    SELECT COUNT(HashID) FROM dataset
    WHERE Gender = ‘female’ AND CHANNEL_1 = 1 AND
    Language = ‘English’ AND Location = ‘Austria’.

    A protected dataset may, however, not allow querying over a subset smaller than some threshold. As a result, most differencing attack scenarios in the literature only consider two queries; a minimal sketch of such a two-query attack appears after these notes.

  2. Many users tend to select the year 1900 as their birth year. Whilst this is unlikely to be the actual birth year of a user, we assume that this entry is still worth hiding, as it exposes a habit of the target person.
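To make the first note concrete, here is a minimal, self-contained sketch of the two-query differencing attack it alludes to. The toy records, attribute names, and the particular query pair are invented for illustration; the only point is that neither query targets a single record, yet subtracting a count that excludes anyone matching Alice's publicly known attributes from the unrestricted count isolates exactly her contribution, which is why a DP system adds noise to both answers.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy dataset, invented for illustration only. Assume Alice is the ONLY
    # record matching (gender=female, language=English, location=Austria).
    dataset = [
        {"gender": "female", "language": "English", "location": "Austria", "channel_1": 1},  # Alice
        {"gender": "female", "language": "German",  "location": "Austria", "channel_1": 0},
        {"gender": "male",   "language": "English", "location": "Austria", "channel_1": 1},
        {"gender": "female", "language": "English", "location": "Italy",   "channel_1": 0},
        {"gender": "male",   "language": "German",  "location": "Austria", "channel_1": 1},
    ]

    def matches_alice(r):
        return (r["gender"], r["language"], r["location"]) == ("female", "English", "Austria")

    # Query 1: how many people watched channel 1?
    q1 = sum(r["channel_1"] for r in dataset)
    # Query 2: the same count, excluding anyone who matches Alice's
    # publicly known attribute combination.
    q2 = sum(r["channel_1"] for r in dataset if not matches_alice(r))

    # Neither query asks about a single record, yet their difference does.
    print("exact difference (did Alice watch channel 1?):", q1 - q2)

    # Under eps-DP, each count is released with Laplace(1/eps) noise
    # (sensitivity 1), so the observed difference no longer pins down
    # Alice's value with certainty.
    for eps in (0.1, 1.0):
        noisy_diff = (q1 + rng.laplace(0, 1 / eps)) - (q2 + rng.laplace(0, 1 / eps))
        print(f"eps={eps}: noisy difference = {noisy_diff:+.2f}")

The smaller \( \varepsilon \) is, the wider the noise on each answer and the less the observed difference reveals about Alice; the approach described in the abstract chooses \( \varepsilon \) so that the adversary's success from this kind of comparison stays below a chosen bound.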


Acknowledgements

We thank the Swiss National Science Foundation for their partial support under contract number #407550_167177.

Author information


Corresponding author

Correspondence to Narges Ashena.



Copyright information

© 2021 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper


Cite this paper

Ashena, N., Dell’Aglio, D., Bernstein, A. (2021). Understanding \( \varepsilon \) for Differential Privacy in Differencing Attack Scenarios. In: Garcia-Alfaro, J., Li, S., Poovendran, R., Debar, H., Yung, M. (eds) Security and Privacy in Communication Networks. SecureComm 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 398. Springer, Cham. https://doi.org/10.1007/978-3-030-90019-9_10


  • DOI: https://doi.org/10.1007/978-3-030-90019-9_10

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-90018-2

  • Online ISBN: 978-3-030-90019-9

  • eBook Packages: Computer Science, Computer Science (R0)

