“Face recognition and other forms of biometric identification technologies are a threat to core constitutional rights and have a disturbing record of racial bias and inaccuracy that endangers people of color and other marginalized groups,” argues the organization. “In keeping with President Biden’s commitment to racial equity and civil liberties for all, he must take immediate action to halt the use and funding of these dangerous technologies by the federal government.”
To that end, a group of more than 35 signatories penned a letter to the President urging “swift action,” with the ACLU specifically expressing concern that the technology has major shortcomings with respect to individuals with darker skin tones, women, LGBTQ people and young people.
“This bias already has caused irreparable harm. Robert Williams, Michael Oliver, and Nijeer Parks are all Black men wrongly arrested and incarcerated after police falsely identified them using a face recognition system,” reads the letter. “While disturbing, these wrongful arrests of Black men are not surprising.”
According to the letter, the impetus for the call to action includes a number of scholarly initiatives, among them work by Joy Buolamwini, a Black doctoral candidate at the Massachusetts Institute of Technology, who found that “commercially available facial recognition systems did not detect her face until she placed a white mask over it.”
“In her landmark 2018 study, Buolamwini and her colleagues reported alarming racial and gender disparities in a range of facial recognition technologies marketed by some of the most prominent technology companies in the world,” it continues.
The ACLU specifically points to Amazon’s Rekognition software as flawed and, along with 80 other organizations, has called on Amazon, Microsoft and Google to stop selling their recognition tech to law enforcement agencies.
In June, Amazon announced its own one-year moratorium on police usage of Rekognition, asking governments to take time to “put in place stronger regulations to govern the ethical use of facial recognition technology.” However, it also said it would continue to allow groups combating human trafficking to use the tech.
Privacy rights, technology and the law also intersected separately in recent weeks, as Virginia is reportedly on the cusp of passing its own Consumer Data Protection Act, which, according to Roll Call, would make it only the second state after California to enact such legislation.
The bill, which is awaiting the signature of Gov. Ralph Northam, would establish a framework for processing and controlling individuals' personal data and outline the protection standards and responsibilities for processors and controllers.
It applies to “all persons that conduct business in the Commonwealth and either (i) control or process personal data of at least 100,000 consumers or (ii) derive over 50% of gross revenue from the sale of personal data and control or process personal data of at least 25,000 consumers,” although there are some exceptions.
If signed into law, it would take effect Jan. 1, 2023.