Is it inevitable that the “Homicide Prediction” algorithm of the United Kingdom is racist?

by Andrea

Who is most likely to become a killer? The UK will ask the technology. A “chilling” British initiative, “Minority Report” style, threatens to be discriminatory.

In “Minority Report” style (the film set in 2054 in which Tom Cruise heads the “pre-crime” police unit), the UK is preparing to ask an algorithm who is most likely to become a killer.

The project already has an official name, “Sharing Data to Improve Risk Assessment”, which replaces “Homicide Prediction Project”. It will use technology to analyze data on thousands of individuals known to the criminal courts (allegedly from 100 to 500,000 people) in order to increase public safety through better risk assessment, according to the Ministry of Justice. But it is already provoking alarm, especially among activists and the police, who describe it as “chilling” and “dystopian”.

The new UK government program will use this tool to identify the individuals most likely to commit homicide. The government says advanced data science techniques will reinforce the current risk assessment tools already used by the prison and probation services. According to the ministry, it is still in the test phase and only uses data from individuals with at least one criminal conviction, but some say otherwise.

According to a civil liberties defense group, the data-sharing agreement between the Ministry of Justice and Greater Manchester Police (GMP) includes information about people who have never been convicted of any crime, such as victims of domestic violence and individuals who self-harmed or had mental health problems.

In fact, the project will allegedly use data from various official sources, including the probation service and GMP data collected before 2015. We are talking about names, dates of birth, gender, ethnicity and identifiers linked to police databases, classified in official documents as “special categories of personal data”.

This information includes health-related indicators such as drug addiction, suicide, mental illness, self-harm and disability.

“Scary and alarming”

Many activists and researchers warn that the tool has great potential to reinforce systemic prejudice, targeting ethnic minorities and low-income communities and, overall, strengthening structural discrimination in the country.

“The Ministry of Justice’s attempt to build this homicide prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Sofia Lyall, a researcher at Statewatch.

“Research repeatedly shows that algorithmic crime ‘prediction’ systems are intrinsically flawed,” she warns. “This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underlying the criminal justice system.”

“Like other systems of this kind, it will codify prejudice against racialized and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming.”

The truth is that these algorithms are not impartial by nature: they are based on statistical probability, not actual behavior.
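The mechanism behind this criticism can be made concrete with a toy simulation (entirely hypothetical, not based on the Ministry of Justice's actual model or data): if two groups offend at the same true rate but one is policed twice as heavily, a score learned from recorded convictions will rate the over-policed group as roughly twice as risky, even though real behavior is identical.

```python
import random

random.seed(0)

# Hypothetical toy model: both groups have the SAME true offending
# rate, but group "B" is over-policed, so offences there are
# detected and recorded twice as often as in group "A".
TRUE_OFFEND_RATE = 0.05
DETECTION_RATE = {"A": 0.3, "B": 0.6}

def simulate(group: str, n: int = 100_000) -> float:
    """Return the fraction of people with a *recorded* offence."""
    recorded = 0
    for _ in range(n):
        offended = random.random() < TRUE_OFFEND_RATE
        detected = random.random() < DETECTION_RATE[group]
        if offended and detected:
            recorded += 1
    return recorded / n

# A naive "risk score" that simply uses the recorded base rate
# per group, as a stand-in for a model trained on recorded data.
risk = {g: simulate(g) for g in ("A", "B")}

# Although true behavior is identical, the score for group B is
# roughly double the score for group A: the bias in the data
# becomes the bias in the prediction.
print(risk["B"] / risk["A"])
```

The point of the sketch is that nothing in the scoring step is malicious; the discrimination enters upstream, in which offences get recorded at all, and any model trained on that record faithfully reproduces it.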

There are many known cases in which technology and artificial intelligence were given autonomy to make decisions in justice, policing and criminal investigation, and proved to be biased, unjust and discriminatory.

The case of the Black man detained in 2020 on false charges by the Detroit police, due to an incorrect identification by facial recognition technology, comes to mind.

In some US courts the discussion was even broader, due to the use of a tool to evaluate defendants’ risk of recidivism that was highly criticized for displaying clear racial bias.
