Not all negative outcomes are the result of faulty data. Artificial intelligence can also work to suppress certain groups within society, such as minorities and other vulnerable people.

This is the central thesis of my latest book, which explores the connection between artificial intelligence (AI) and various forms of racism and sexism. The issue is severe. For algorithms to get better at whatever they do, whether reviewing resumes or underwriting mortgages, they typically need to be trained on data, much of which is gathered from the internet.

That training data, however, often carries the biases of the real world. For instance, a system may learn that most people currently holding a given job are men and, as a result, favor applications from men.
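To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python. It is not any real hiring system; the data and group labels are invented. The "model" simply learns hiring rates from a skewed historical record and scores new, equally qualified applicants by the rate for their group, reproducing the historical skew.

```python
# Toy illustration only: a naive "model" that learns hiring rates from
# (invented) historical records and scores new applicants by the rate
# observed for their group.
from collections import Counter

# Hypothetical historical records: (gender, was_hired)
history = [
    ("male", True), ("male", True), ("male", True), ("male", False),
    ("female", True), ("female", False), ("female", False), ("female", False),
]

hires = Counter(gender for gender, hired in history if hired)
totals = Counter(gender for gender, _ in history)

# "Training": the learned score is just the historical hiring rate per group.
learned_score = {gender: hires[gender] / totals[gender] for gender in totals}

# Equally qualified applicants are ranked purely by the learned score,
# so the historical skew toward men carries straight through.
for gender in ("male", "female"):
    print(gender, round(learned_score[gender], 2))
# male 0.75
# female 0.25
```

Real systems are far more complex, but the underlying dynamic is the same: if the past record is skewed, a model trained to reproduce the past will reproduce the skew.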

A number of myths dating back to the Enlightenment have distorted our data, including prejudices that now result in discrimination based on gender and sexual orientation.

Given the history of societies in which racism helped establish the social and political order and conferred privileges on white men, in Europe, North America, and Australia, for example, it is only common sense to assume that remnants of that discrimination permeate our technology.

During my research for the book, I compiled a list of well-known instances. Facial recognition software misidentified Black and Asian minorities more often than white people, leading to wrongful arrests in the US and elsewhere.

Software used in the criminal justice system predicted that Black defendants would reoffend at higher rates than they actually did. Healthcare decisions have gone wrong as well. One study found that an algorithm used in US health management assigned the same health risk score to Black patients who were considerably sicker than their white counterparts.

As a result, the number of Black patients flagged as needing additional treatment fell by more than half. The algorithm assumed Black patients were healthier than equally sick white patients because less money had been spent on their care, even though their level of need was the same. Biased data sets have likewise led to minorities being denied mortgages.
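A small numerical sketch shows why using past spending as a stand-in for health need goes wrong. The figures below are invented for illustration; the point is only that a score built on spending, rather than on illness itself, ranks the lower-spending patient as lower risk even when both patients are equally sick.

```python
# Invented numbers, illustrative only: two equally sick patients, one of
# whom has historically had less money spent on their care.
patients = [
    # (label, chronic_conditions, past_annual_spending_usd)
    ("white patient", 4, 12_000),
    ("Black patient", 4, 7_000),  # same illness burden, less spending on care
]

# Hypothetical risk score that treats spending as a proxy for need.
def risk_score_from_spending(spending):
    return spending / 12_000  # normalised against the higher spender

ENROLL_THRESHOLD = 0.8  # toy cut-off for enrolment in an extra-care programme

for label, conditions, spending in patients:
    score = risk_score_from_spending(spending)
    flagged = score >= ENROLL_THRESHOLD
    print(f"{label}: {conditions} conditions, score {score:.2f}, extra care: {flagged}")
# white patient: 4 conditions, score 1.00, extra care: True
# Black patient: 4 conditions, score 0.58, extra care: False
```

The proxy looks neutral, but because spending reflects unequal access to care rather than illness alone, the score systematically understates the needs of the group that historically received less.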