Trends in Security Information
The HSD Trendmonitor is designed to provide access to relevant content on various subjects in the safety and security domain, to identify relevant developments, and to connect knowledge and organisations. Because the safety and security domain encompasses a vast number of subjects, four taxonomies (type of threat or opportunity, victim, source of threat, and domain of application) have been constructed to visualise them. The taxonomies and related category descriptions have been carefully composed, drawing on existing taxonomies, European and international standards, and our own expertise.
To identify safety- and security-related trends, relevant reports and HSD news articles are continuously scanned, analysed, and classified by hand according to the four taxonomies. This results in a wide array of observations, which we call 'Trend Snippets'. Multiple Trend Snippets combined can provide insights into safety and security trends. The size of the circles shows the relative weight of each topic; the filters can be used to further select the content most relevant to you. If you have an addition, question, or remark, drop us a line at info@securitydelta.nl.
National experts report that algorithms can cause direct and indirect discrimination
3.2.2 Discriminatory effects

Mostly as a result of in-built biases and stereotypes, national experts report that algorithms can easily cause direct and indirect discrimination. More specifically, algorithms can sometimes lead to sexist decisions, as was observed by the High Council on Equality of France in its report on sexism in the use of algorithms by the media and on the internet [357]. In addition, it was mentioned that in France personalised price-setting for goods and services could cause discrimination by raising prices for gendered products for menstruation [358].

Likewise, in Germany, the example has been given of the Berlin Public Transport Company offering targeted discounts to women on International Women's Day, using facial recognition [359]. This particular use relied on a binary distinction between men and women and was strongly based on stereotypes.

It has been reported that personalised behavioural advertising in the real estate market in Italy can have the effect of discriminating according to ethnicity and social class, thus contributing to creating 'new ghettos' on the one hand, and luxury districts on the other [360]. Another problem in Italy is that profiling through personal data can lead to the determination of credit ratings or insurance premiums according to lifestyle or driving, or to inclusion in certain 'social clusters' based on ethnic or geographical origin [361].

Finally, the fraud detection system SyRI, in the Netherlands, might have a discriminatory and stigmatising effect to the extent that the system led to heightened supervision in certain 'problem neighbourhoods' [362].