Trends in Security Information
The HSD Trendmonitor is designed to provide access to relevant content on various subjects in the safety and security domain, to identify relevant developments, and to connect knowledge and organisations. Because the safety and security domain encompasses a vast number of subjects, four taxonomies (type of threat or opportunity, victim, source of threat, and domain of application) have been constructed to visualise them. The taxonomies and their category descriptions have been carefully composed, drawing on other taxonomies, European and international standards, and our own expertise.
To identify safety- and security-related trends, relevant reports and HSD news articles are continuously scanned, analysed, and classified by hand according to the four taxonomies. This results in a wide array of observations, which we call 'Trend Snippets'. Multiple Trend Snippets combined can provide insights into safety and security trends. The size of each circle shows the relative weight of its topic, and the filters can be used to select the content most relevant to you. If you have an addition, question, or remark, drop us a line at info@securitydelta.nl.
The increasing use of algorithms in human resource recruitment processes means that algorithmic discrimination could present a risk for the labour market
Algorithmic discrimination, employment and platform work
The increasing involvement of algorithms in human resources recruitment processes means that algorithmic discrimination could represent an important risk in the realm of the labour market. If uncorrected, algorithms trained on past data about promotions and recruitment will inevitably reproduce the current discriminatory status quo, thus disadvantaging legally protected groups. [179] While such forms of algorithmic discrimination would most likely fall under the scope of EU gender equality and non-discrimination law, algorithmic discrimination could be particularly pervasive in the context of platform work. However, the very applicability of the equal pay principle, as well as further gender equality and non-discrimination guarantees linked to employment and working conditions, will depend on the existence of an employment relationship between de facto workers and platforms or goods and services providers. [180]