Abstract
The use of Privacy-Enhancing Technologies in the field of data anonymisation and pseudonymisation raises many questions with respect to legal compliance under the GDPR and current international data protection legislation. In particular, the use of innovative technologies based on machine learning may increase or decrease risks to data protection. A workshop held at the IFIP Summer School on Privacy and Identity Management demonstrated the complexity of this field and the need for further interdisciplinary research based on an improved joint understanding of legal and technical concepts.
Notes
1. Note that the GDPR defines the process of “pseudonymisation”, whose outcome, “pseudonymised data”, is a subset of all kinds of “pseudonymous data” in which the identity of the data subjects is hidden to some extent (see the sketch below for one common technical realisation).
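The terminological distinction can be made concrete with a minimal sketch, not taken from the paper: keyed hashing is assumed here as one common pseudonymisation technique, where the secret key plays the role of the “additional information” that, under Art. 4(5) GDPR, must be kept separately from the pseudonymised data. All names (e.g. PSEUDONYMISATION_KEY, pseudonymise) are illustrative.

```python
# Minimal sketch (illustrative only): pseudonymisation via keyed hashing (HMAC).
# The secret key is the separately-held "additional information"; without it,
# the pseudonym cannot be linked back to the data subject by this mechanism alone.
import hmac
import hashlib
import secrets

# Hypothetical key management: in practice the key would be stored and
# access-controlled separately from the pseudonymised data set.
PSEUDONYMISATION_KEY = secrets.token_bytes(32)

def pseudonymise(identifier: str, key: bytes = PSEUDONYMISATION_KEY) -> str:
    """Replace a direct identifier with a keyed, deterministic pseudonym."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "Alice Example", "diagnosis": "J06.9"}
pseudonymised_record = {
    "subject_pseudonym": pseudonymise(record["name"]),
    # Remaining attributes may still allow re-identification, so the result
    # is pseudonymised personal data, not anonymised data.
    "diagnosis": record["diagnosis"],
}
print(pseudonymised_record)
```

Note that such a sketch only hides direct identifiers; whether the resulting data set still counts as personal data depends on the residual re-identification risk, which is precisely the kind of question the workshop discussed.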
Copyright information
© 2023 IFIP International Federation for Information Processing
Cite this paper
Fischer-Hübner, S., Hansen, M., Hoepman, J.-H., Jensen, M. (2023). Privacy-Enhancing Technologies and Anonymisation in Light of GDPR and Machine Learning. In: Bieker, F., Meyer, J., Pape, S., Schiering, I., Weich, A. (eds.) Privacy and Identity Management. Privacy and Identity 2022. IFIP Advances in Information and Communication Technology, vol. 671. Springer, Cham. https://doi.org/10.1007/978-3-031-31971-6_2
Print ISBN: 978-3-031-31970-9
Online ISBN: 978-3-031-31971-6