As soon as personal data is processed, European data protection law (especially the GDPR) imposes strict rules that data controllers and processors must observe. This leads to a variety of problems, particularly for artificial intelligence (AI) systems and smart robotics, as the GDPR insufficiently accounts for these new technologies. By introducing the method of “Anonymity Assessment,” we propose an interdisciplinary approach to classifying anonymity and measuring the degree of pseudonymization of a given data set in both a legal and a technical sense. The legal provisions of the GDPR are thereby “translated” into mathematical equations. To this end, we propose two scores: the Objective Anonymity Score (OAS), which determines the risk of (re-)identifying a natural person using objective statistical measures, and the Subjective Anonymity Score (SAS), which takes into account the subjective perspective of data controllers or processors.
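The paper's exact formulas for the OAS are not reproduced in this abstract. As an illustrative assumption only, a common objective statistical measure of re-identification risk is based on equivalence-class sizes over quasi-identifiers (the idea behind k-anonymity): a record whose combination of quasi-identifiers is shared by k records carries a risk of 1/k. The function name and data below are hypothetical and not taken from the paper:

```python
from collections import Counter

def objective_risk(records, quasi_identifiers):
    """Estimate re-identification risk from equivalence-class sizes.

    Each record's risk is 1/k, where k is the number of records sharing
    its combination of quasi-identifier values; a unique combination
    therefore carries the maximum risk of 1.0.
    Returns (worst-case risk, average risk) over the data set.
    """
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    class_sizes = Counter(keys)          # k for each quasi-identifier combination
    risks = [1 / class_sizes[k] for k in keys]
    return max(risks), sum(risks) / len(risks)

# Hypothetical example: ZIP code and age as quasi-identifiers.
records = [
    {"zip": "10115", "age": 34},
    {"zip": "10115", "age": 34},
    {"zip": "20095", "age": 51},
]
worst, avg = objective_risk(records, ["zip", "age"])
# worst = 1.0 (the third record is unique); avg = (0.5 + 0.5 + 1.0) / 3
```

This is only a sketch of one objective measure; the paper's OAS may combine such statistics with further legal criteria, and the SAS by definition depends on controller-specific context that a purely statistical function cannot capture.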
Issue Paper 18-15-5, published by the Korean Legislation Research Institute, ISBN 978-89-6684-893-5.
in: W. Prinz & P. Hoschka (Eds.), Proceedings of the 1st ERCIM Blockchain Workshop 2018, Reports of the European Society for Socially Embedded Technologies, Amsterdam 2018 (DOI: http://dx.doi.org/10.18420/blockchain2018_03).
in: Taeger, Jürgen (Ed.), Recht 4.0, Innovationen aus den rechtswissenschaftlichen Laboren, pp. 833–845, Oldenburger Verlag für Wirtschaft, Informatik und Recht, Oldenburg 2017.
in: Hill/Martini/Kugelmann (Eds.), Perspektiven der digitalen Lebenswelt, pp. 147–162, Nomos, Baden-Baden 2017.
in: 24. DGRI Drei-Länder-Treffen, Computer & Recht (CR) 2017, R88–R89.
in: Verwaltung & Management 2016, pp. 328–333.