- IBM's CEO said the company no longer offers general-purpose IBM facial recognition or analysis software. The move comes after George Floyd's death.
- IBM has taken this stance because facial recognition systems could potentially be used to target minorities.
- Facial recognition systems are more error-prone on darker skin tones, according to reports.
Tech company IBM (International Business Machines Corporation) will stop selling facial recognition software, citing concerns over its use for racial profiling and mass surveillance. The move comes after George Floyd's death at the hands of a Minneapolis police officer. The debate over facial recognition in law enforcement has gained traction amid recent protests and demonstrations in the US.
In a letter addressed to the US Congress, IBM Chief Executive Officer Arvind Krishna said the company decided to back out of the business "to work with Congress in pursuit of justice and racial equity focused initially in three key policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities."
The company, which is more than a hundred years old, has taken this stance because facial recognition systems could potentially be used to target minorities or violate human rights.
In the letter, Krishna wrote, "IBM no longer offers general purpose IBM facial recognition or analysis software. IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms."
"Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe but vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularity [sic] when used in law enforcement, and that such bias testing is audited and reported," he added.
According to a report by Datanami, facial recognition was already blocked in some US cities back in 2019. California's Body Camera Accountability Act bans facial recognition technology from being used on body cameras worn by police.
Facial recognition systems depend on the datasets used to train them. The report states that many of these datasets lack ethnic diversity, which creates a problem when the technology is applied to people who do not look like those in the training set. As a result, the systems are noticeably less accurate on dark-skinned people than on those with lighter skin.
An MIT Media Lab graduate revealed that the error rates for gender-classification algorithms exceeded 34 per cent for darker-skinned women, while the maximum error rate for lighter-skinned males was less than 1 per cent. According to the report, the researcher used facial recognition technology from Microsoft, IBM, and China's Face Plus Plus to draw these conclusions.
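The kind of bias audit Krishna calls for boils down to comparing error rates across demographic subgroups rather than reporting a single overall accuracy. As a minimal sketch (the function name, tuple layout, and toy numbers below are illustrative assumptions, not taken from any real audit toolkit or the MIT study's data), per-group error rates can be computed like this:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the classification error rate per demographic group.

    `records` is an iterable of (group, predicted_label, true_label)
    tuples; the structure is hypothetical, for illustration only.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    # Error rate = misclassified samples / total samples, per group.
    return {g: errors[g] / totals[g] for g in totals}

# Toy data loosely mirroring the disparity described in the article:
sample = (
    [("darker_female", "male", "female")] * 34 +    # misclassified
    [("darker_female", "female", "female")] * 66 +  # correct
    [("lighter_male", "female", "male")] * 1 +      # misclassified
    [("lighter_male", "male", "male")] * 99         # correct
)
rates = error_rates_by_group(sample)
print(rates)  # darker_female: 0.34, lighter_male: 0.01
```

The point of reporting per-group rates is that an aggregate accuracy of roughly 82 per cent on this toy data would hide a 34-fold gap between the two groups.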
However, IBM may still allow the sale or use of facial recognition systems for certain specific purposes, and it can continue to re-sell the same technology from other vendors as part of its large consulting business, a report by The Guardian stated.