An urgent wake-up call to ban Dutch racist algorithms

October 28, 2021

Human Lives Human Rights: The continued use of unregulated algorithms in the public sector by the Dutch government risks exacerbating racial discrimination, a new analysis of the country’s childcare benefit scandal says.

The report Xenophobic Machines exposes how racial profiling was baked into the design of the algorithmic system used to determine whether claims for childcare benefit were flagged as incorrect and potentially fraudulent.

Tens of thousands of parents and caregivers from mostly low-income families were falsely accused of fraud by the Dutch tax authorities as a result, with people from ethnic minorities disproportionately impacted.

While the scandal brought down the Dutch government in January, sufficient lessons have not been learnt despite multiple investigations.

Thousands of lives were ruined by a disgraceful process which included a xenophobic algorithm based on racial profiling.

The Dutch authorities risk repeating these catastrophic mistakes as human rights protections are still lacking in the use of algorithmic systems.

Alarmingly, the Dutch are not alone. Governments around the world are rushing to automate the delivery of public services, but it is the most marginalized people in society who are paying the highest price.

Human rights organizations are calling on all governments to implement an immediate ban on the use of data on nationality and ethnicity in risk-scoring for law enforcement purposes when searching for potential crime or fraud suspects.

From the start, racial and ethnic discrimination was central to the design of the algorithmic system introduced in 2013 by the Dutch tax authorities to detect incorrect and potentially fraudulent applications for childcare benefits.

The tax authorities used whether an applicant held Dutch nationality as a risk factor, and non-Dutch nationals received higher risk scores.
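
The report does not publish the actual scoring formula. As a rough illustration of the mechanism it describes, the sketch below (in Python, with invented feature names and weights) shows how treating nationality itself as a risk factor raises a non-Dutch applicant's score before any behavior is considered.

# Hypothetical sketch of a nationality-weighted risk score.
# Feature names and weights are illustrative, not the actual Dutch system.

def risk_score(application: dict) -> float:
    score = 0.0
    # Behavioral signals (hypothetical): inconsistencies in the claim.
    score += 2.0 * application.get("income_mismatch", 0)
    score += 1.5 * application.get("missing_documents", 0)
    # Discriminatory design flaw: nationality itself raises the score,
    # so non-Dutch applicants are penalized regardless of behavior.
    if not application.get("dutch_nationality", True):
        score += 3.0
    return score

# Two applications identical in every respect except nationality:
dutch = {"income_mismatch": 0, "missing_documents": 1, "dutch_nationality": True}
non_dutch = {"income_mismatch": 0, "missing_documents": 1, "dutch_nationality": False}

print(risk_score(dutch))      # 1.5
print(risk_score(non_dutch))  # 4.5 -> far more likely to cross a fraud-review threshold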

Parents and caregivers who were selected by the system had their benefits suspended and were subjected to hostile investigations, characterized by harsh rules and policies, rigid interpretations of laws, and ruthless benefits recovery policies.

This led to devastating financial problems for the families affected, ranging from debt and unemployment to forced evictions because people were unable to pay their rent or make payments on their mortgages. Others were left with mental health issues and stress on their personal relationships, leading to divorces and broken homes.

The design of the algorithm reinforced an existing institutional bias linking race and ethnicity to crime, and generalized the behavior of individuals to entire racial or ethnic groups.

These discriminatory design flaws were reproduced by a self-learning mechanism that meant the algorithm adapted over time based on experience, with no meaningful human oversight.

The result was a discriminatory loop with non-Dutch nationals flagged as potentially committing fraud more frequently than those with Dutch nationality.
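
Neither the original weights nor the retraining rule have been made public. The toy simulation below, with invented numbers and a deliberately simplified update rule, is only meant to show how such a loop can amplify a small initial bias when the system never corrects for the fact that it investigates one group more than the other.

# Toy simulation of the self-reinforcing "discriminatory loop" (invented numbers).
# Both groups have the same true irregularity rate, but the system audits the group
# it already scores as riskier more often, then reads the extra confirmed cases as
# proof that the group really is riskier.

TRUE_RATE = 0.02          # identical underlying irregularity rate in both groups
AUDITS_PER_ROUND = 400    # fixed investigation capacity

risk_weight = {"dutch": 1.0, "non_dutch": 1.2}   # small initial bias

for round_ in range(1, 6):
    total_w = sum(risk_weight.values())
    # Audits are allocated in proportion to each group's current risk weight.
    audits = {g: AUDITS_PER_ROUND * w / total_w for g, w in risk_weight.items()}
    # Confirmed cases scale with how hard each group is looked at, not with behavior.
    confirmed = {g: n * TRUE_RATE for g, n in audits.items()}
    total_c = sum(confirmed.values())
    # Flawed "retraining": a group's weight is raised whenever its share of confirmed
    # cases exceeds its share of applicants (50%), with no correction for the fact
    # that it was audited more in the first place.
    for g in risk_weight:
        risk_weight[g] *= (confirmed[g] / total_c) / 0.5
    print(round_, {g: round(w, 2) for g, w in risk_weight.items()})

Within a handful of rounds the gap between the two groups' risk weights grows from 20 percent to several hundred percent, even though their underlying behavior never differs.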

When an individual was flagged as a fraud risk, a civil servant was required to conduct a manual review but was given no information as to why the system had generated a higher-risk score.

Such opaque ‘black box’ systems, in which the inputs and calculations of the system are not visible, led to an absence of accountability and oversight.

The black box system resulted in a black hole of accountability, with the Dutch tax authorities trusting an algorithm to help in decision-making without proper oversight.
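
By contrast, a transparent system would show a reviewer which inputs drove a flag. The sketch below extends the hypothetical scoring function above with per-factor contributions; it is an assumption about what meaningful oversight could look like, not a description of anything the Dutch system provided.

# Hypothetical "glass box" variant: return per-factor contributions alongside the
# score, so the human reviewer can see why a case was flagged.

WEIGHTS = {"income_mismatch": 2.0, "missing_documents": 1.5, "non_dutch_nationality": 3.0}

def explainable_risk_score(application: dict) -> tuple[float, dict]:
    features = {
        "income_mismatch": application.get("income_mismatch", 0),
        "missing_documents": application.get("missing_documents", 0),
        "non_dutch_nationality": 0 if application.get("dutch_nationality", True) else 1,
    }
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    return sum(contributions.values()), contributions

score, reasons = explainable_risk_score(
    {"income_mismatch": 0, "missing_documents": 1, "dutch_nationality": False}
)
print(score)    # 4.5
print(reasons)  # {'income_mismatch': 0.0, 'missing_documents': 1.5, 'non_dutch_nationality': 3.0}
# A reviewer seeing that most of the score comes from nationality could challenge the flag;
# in the actual scandal, reviewers were given no such breakdown.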

The tax authorities had a perverse incentive to seize as many funds as possible, regardless of the veracity of the fraud accusations, because they needed to prove the efficiency of the algorithmic decision-making system.

Parents and caregivers who were identified by the tax authorities as fraudsters were for years given no answers to questions about what they had done wrong.

The findings of Xenophobic Machines are to be presented at a United Nations General Assembly side event on algorithmic discrimination on 26 October.

Tags: Dutch government, Merel Koning, racial discrimination, Senior Advisor, Xenophobic Machines
