The full title of the workshop was ‘Towards Value-Centric Big Data: Connect People, Processes and Technology’ and it dealt with the question of how to make data processing privacy-preserving, or, more generally, how to collect data in an ethical way. The workshop was held at the Vrije Universiteit Brussel (VUB, Free University of Brussels). It consisted of two parts: invited talks in the morning and discussion sessions in the afternoon.
Seven 15-minute presentations made up the morning session.
Saila Rinne – Privacy-enhancing technologies, data sharing and ethics
The first talk was by Saila Rinne, Programme Officer in the Data Policy and Innovation Unit of the European Commission's ‘DG Connect’. She stressed the importance the EU attaches to ethics and explained that, as a result, the EC favours funding projects that address ethical questions in their research. Those questions go beyond privacy alone: they include the treatment of humans in general, but also of animals, including in non-EU countries. AI, of course, is a topic that receives special attention at the moment.
Duncan Brown – “You’re monitoring my what…?!” - Balancing privacy against enhanced security outcomes
Duncan Brown is Chief Security Officer of Forcepoint. His talk was about measuring security risks. Every data-processing step carries risks, and they add up. More than half of all data breaches are not the result of active outside attacks or malicious actions by employees, but of employees trying to do the right thing and failing, because processes are unclear or ill-adapted. His advice is to keep security monitoring strictly separate from other measurements and not to try to use security monitoring at the same time for, e.g., performance monitoring.
Marina da Bromida – A “win-win” initiative for value-centric big data and safeguarding “privacy and ethical values” in the PSPS domain: AEGIS project
Marina da Bromida, a lawyer in the AEGIS project, gave the third talk. The AEGIS project, among other things, looks at ethics in data sharing. Her advice is that every project or company that collects personal data needs an Ethics Advisory Board with at least three members.
Alessandro Bruni – Safe and secure data marketplaces for innovation: SAFE-DEED
Alessandro Bruni, Legal Researcher at KU Leuven, presented the SAFE-DEED project. He studies how to measure the value of privacy-preserving data-sharing technologies. There are technical aspects (the tools), but also legal and ethical aspects (education). Providing ways to trust data makes economic sense; the challenge is to allow competition.
Barbara Giovanelli – The dangers of tech-determinism: Demystifying AI and reclaiming the future
The fifth talk was given by Barbara Giovanelli, who works for the EDPS. She called attention to the ethical and moral aspects of using AI. Current AI is statistics over big data: certainly a useful tool, but, like all statistics, it is basically a tool for making categories, and thus we should be clear about which categories we want. An ethics committee may help, if we define its role well and have a process to make it effective. With that, could privacy and trust in data be the competitive edge for European companies?
Ansar Yasar – Responsible Research: Analytics when dealing with personal and personalised mobility data: Track&Know
Ansar Yasar, of the Track&Know project, talked about ‘Responsible Research’. The Track&Know project is about improving the efficiency of big data applications. To ensure security and data protection, it is necessary to train the staff who handle the data, and to watch out for biases in the processes.
Tjerk Timan – Big Data Value Association Position Paper on PPTs
Tjerk Timan works for TNO and also for the BDVe project and the BDVA. He talked about an old problem, ‘sticky policies’ and the question of how they can be changed over time, but also about a new problem: distributed identities (or self-sovereign identities) and whether people are able to understand and manage them. Common to those and other technologies is the question of maintaining trust in data when the data is processed by an untrusted third party. The BDVA is working on a white paper around these topics.
The afternoon was dedicated to discussions in small groups around four topics. Each group of four to six people sat at a table with two discussion leaders, who asked them questions. After 30 minutes each group moved to the next table, so that after two hours all groups had discussed all topics. At the end, the discussion leaders summarized the answers that each group had given them. The organisers referred to this scheme as a ‘knowledge café’.
‘Develop a common ethical and legal framework for responsible innovation by privacy-preserving technologies’
The discussions led to some suggestions, including setting up certification programmes and incentives to obtain certification, which in turn requires a set of standards to certify against. A similar idea is regular audits, whose reports should be public, for more trust and verifiability.
It will take time, though, for people to understand the issues and the guarantees given by such certifications. Maybe there have to be a few high-profile court cases first.
‘Define design requirements for big data solutions that lead to a more responsible use of big data’
The first requirement is that the roles of the various people involved in the development of applications are clear, from the software developers to the managers. Companies also need a way to put a value on privacy-preserving technology.
Research is needed, and thus incentives for doing research. One question is how to keep metadata (provenance, ‘sticky policies’) from being lost. Part of the answer may lie in better usability.
It helps to have diversity in the company: legal, technical and social skills. It also helps to have people, not necessarily experts in any of these fields, who are able to translate between them.
‘How to embed accountability, transparency and responsibility in company processes?’
An ethics committee (ethics board) can be useful. It can be internal or external to the company. The ethics committee needs to evaluate practice against standards, but standards are difficult to make: what is considered ethical varies over time and between countries. A European standard might be possible, though.
The ethics committee, or whoever applies the standards, might work with a checklist or some form of template. It would also be good to have an easy way to share best practices.
There may also be a role for NGOs as watchdogs. The legal environment should be such that they can easily do their work.
‘What about the business? How to balance business and ethical objectives?’
This discussion was about the economics: the costs and the benefits. Standards, again, may reduce the costs. A way to measure the value of data and of data protection is needed, but that value is not necessarily the same in all branches of industry.
‘Naming and shaming’ is a way to raise the cost of unethical processes. Publicizing success stories increases their value.
Shared data platforms that implement good practice can provide an infrastructure and increase the value of applications due to a network effect. They can also help to introduce competition.