Looking Through Black Boxes in Medical Diagnosis: Is the Upcoming Three-Dimensional European Regulatory Framework Ready, Willing and Able?

Christina Varytimidou

DOI https://doi.org/10.21552/EHPL/2022/1/5



Achieving transparency in AI-based systems is a key step towards trustworthy artificial intelligence that will benefit humanity. Especially in deep learning systems that will soon be deployed in medical diagnosis to assist physicians, the opacity of ‘black-box’ systems endangers the fiduciary relationship between the physician and the patient, risks the individual right to personal data protection and adds a further layer of complexity to the allocation of medical liability. This paper examines transparency and explainability requirements and aims to assess the three-dimensional regulatory framework that will soon be in force within the EU. After identifying the transparency requirements established under the General Data Protection Regulation, the Medical Devices Regulation and the proposed Artificial Intelligence Act, and critically analysing them, the paper identifies the root of the problem in the absence of legal design and proposes ways towards sustainable compliance and meaningful transparency.

Christina Varytimidou, Data Protection and Digital Tech Associate, KBVL (Deloitte Legal), Greece. The views expressed in this article are the author's own. For correspondence: <cvarytimidou@outlook.com>
