Does the GDPR have trust issues?

Trix Mulder, Reinder Broekstra

Research output: Contribution to conference › Abstract › Academic


Until now, the processing of personal data in medical practice has mostly taken place in a closed context, meaning it was clear how the data flowed. The arrival of new technologies, such as eHealth and mobile health, has impacted the way personal data is collected in medical practice. A first review of the General Data Protection Regulation (GDPR) shows that it is mostly drafted for processing health data in a closed medical context. Therefore, this paper analyses where the GDPR falls short in meeting the new, more open context of data processing in medical practice. This research is important because these shortcomings may affect a core element of the health care system: the trust of patients. Furthermore, they affect medical paradigms such as medical confidentiality and informed consent (Ioannidis 2013). Whereas medical information used to exist in a closed medical context controlled by governmental institutes, health data now exists in a medical context in which commercial organisations play an important role. Medical practice increasingly uses commercial information technologies for treatment plans and data collection, such as decision support systems, apps and wearables. Moreover, citizens are able to track and acquire their own health data via commercial organisations (e.g. Fitbit). The use of heterogeneous data sources breaks boundaries, creating new information flows and risks for privacy (Barocas & Nissenbaum 2014). The question, however, is whether the privacy of these data flows is sufficiently protected by the new legal framework of the GDPR, which came into effect on 25 May 2018. Article 9 GDPR determines that personal data which are, by their nature, particularly sensitive in relation to these fundamental rights and freedoms merit specific protection and are referred to as ‘sensitive data’.
Health data falls within these special categories of data, since it comes within a person’s most intimate sphere. Unauthorised disclosure may, for example, lead to various forms of discrimination and violation of fundamental rights. The GDPR in principle prohibits the processing of sensitive health data, unless one of the exemptions mentioned in paragraph 2 applies. One of these exemptions is informed consent, but the privacy policies that provide the relevant information are ineffective in giving clear information about disclosure (Mulder 2019). This requires more trust in organisations and the medical context, in particular for health data collected via (commercial) apps and wearables. Such trust could, for instance, be achieved through an adjustment of the current informed consent practice. One option is a more dynamic consent mechanism (Broekstra 2017), which this paper will investigate. Although the open, heterogeneous context for data collection brings new opportunities for improving health, the potential lack of clarity in the GDPR and privacy policies puts medical paradigms at risk. The aim and use of data collections and the limitations on disclosure of information are crucial for the formation of trust in the medical context (Broekstra, Aris-Meijer, Maeckelberghe, Otten & Stolk 2019). As long as the necessary trust is not accompanied by clear checks and balances defining the trustworthiness of organisations and technologies within the medical context, a dangerous game of trust is played that can have unpredictable outcomes for the medical context itself.
Original language: English
Publication status: Published - 2019
Event: British and Irish Law Education and Technology Association Annual conference 2019: Back to the futures: law without frontiers - Queen's University Belfast, Belfast, Ireland
Duration: 15-Apr-2019 – 16-Apr-2019


Conference: British and Irish Law Education and Technology Association Annual conference 2019
Abbreviated title: BILETA Annual Conference 2019
