The lack of specifications on human involvement in the General Data Protection Regulation is problematic when taking into account algorithmic bias
The approach taken by EU data protection law, and in particular the GDPR, towards preventing discrimination is, furthermore, substantially different from that of non-discrimination law and pertains to the degree of automation of the processing of sensitive data. The GDPR treats the presence of a human in the loop as a preventive safeguard, as Recital 71 makes clear: ‘The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention. Such processing includes ‘profiling’ that consists of any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her.’ However, the GDPR does not specify what form this human involvement must take. This is problematic in view of so-called automation biases, as explained above.156 Prohibiting full automation does not in itself ensure the absence of discrimination. In addition, Article 5(1) of the GDPR lists ‘lawfulness, fairness and transparency’ among the principles underpinning the processing of personal data, while Article 5(2) establishes the principle of ‘accountability’.
By contrast, the notion of ‘discrimination’ is mentioned only three times, to describe the risks posed by the processing of sensitive personal data.157 In turn, ‘equality’ is mentioned only twice, in relation to processing data in the context of employment.158 As a result, the concepts on which the GDPR relies in relation to algorithmic discrimination are quite different from those central to gender equality and non-discrimination law, and the link between the two areas is not made explicit by the GDPR. The GDPR’s approach to sensitive data offers some guarantees regarding some of the protected grounds covered by EU non-discrimination law, but it also evidences gaps, not least in relation to the protection of gender equality. Despite these different conceptual approaches to algorithmic discrimination, EU data protection law, and in particular the GDPR, can provide important complements to EU gender equality and non-discrimination law.