On September 29th, 2018, Harald Zwingelberg from SPECIAL partner ULD joined a panel discussion at FIfFKon 2018.
FIfFKon is the annual conference of the ‘Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung e.V.’, covering social, political, and peace-related aspects of computer science and informatics.
The German project AppPETs organized a workshop at FIfFKon 2018 on data protection and new potentials for upcoming generations of smartphone apps, as well as a project showcase for European and German projects related to data protection in the mobile world. Harald represented SPECIAL on the introductory panel ‘More data protection for smartphone apps - will GDPR bring the breakthrough?’.
The GDPR provides a starting point for data protection by design and by default. The audience and discussants agreed that long and exhausting privacy policies are not the paradigm to aim for. The GDPR demands better information for data subjects. Providing this information may be achieved with layered policies, where a short version contains the core information and points to aspects that are specifically risky for data subjects or surprising for the average customer. This is one of the core aspects of the SPECIAL R&D work.
Art. 21 (5) GDPR and Art. 10 of the parliament's draft of the ePrivacy Regulation provide a basis for a specification that allows data subjects to exercise their right to object. But specifications can also be used to communicate transparency information, to obtain informed consent, and to demonstrate compliance with legal requirements. Such specifications are currently under development in the W3C Data Privacy Vocabularies and Controls Community Group (DPVCG), with participation of ULD via the SPECIAL project.
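The idea behind such machine-readable specifications can be sketched as follows. Note that the record structure, class names, and fields below are purely illustrative assumptions for this post, not the actual DPVCG vocabulary under development:

```python
# Illustrative sketch only: a toy machine-readable consent record and a
# simple permission check. Names and structure are hypothetical, not the
# DPVCG terms.
from dataclasses import dataclass

@dataclass(frozen=True)
class Consent:
    """What the data subject agreed to, and any objections (Art. 21 GDPR)."""
    purposes: frozenset          # e.g. {"service-provision", "marketing"}
    data_categories: frozenset   # e.g. {"email", "location"}
    objected_purposes: frozenset = frozenset()

def is_permitted(consent: Consent, purpose: str, data_category: str) -> bool:
    """Processing is permitted only if both the purpose and the data
    category are covered by the consent, and the data subject has not
    objected to that purpose."""
    return (purpose in consent.purposes
            and purpose not in consent.objected_purposes
            and data_category in consent.data_categories)

consent = Consent(purposes=frozenset({"service-provision", "marketing"}),
                  data_categories=frozenset({"email", "location"}),
                  objected_purposes=frozenset({"marketing"}))

print(is_permitted(consent, "service-provision", "location"))  # True
print(is_permitted(consent, "marketing", "email"))             # False
```

Encoding consent and objections in such a structured form is what makes automated transparency and compliance checking possible in the first place, which is the motivation behind standardizing a shared vocabulary.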
Another aspect of the discussion addressed the transparency of algorithms. Controllers must provide transparency-relevant information about their processing activities, are responsible for data protection compliance, and must be able to demonstrate their compliance according to Art. 5 (2) GDPR. Yet there are many open legal and practical questions as to how controllers of self-learning systems can demonstrate transparency. How to explain the decision logic of such systems, for example, is a vivid and ongoing discussion between practitioners and legal experts.