Facial Recognition Software: A History of Inaccuracy and Injustice

The history of facial recognition software has been marred by a troubling pattern of false matches, particularly for nonwhite populations. A recent case lays the technology's flaws bare. The New York Times reported on Porcha Woodruff, a visibly pregnant woman who was wrongfully arrested and detained for over 11 hours by the Detroit Police Department over a robbery and carjacking she had no involvement in.

Injustice Unveiled: The Woodruff Case

Woodruff was arrested on February 16. Despite her evident pregnancy and her insistence that she could not have physically committed the crimes, six police officers handcuffed her in front of her children and neighbors. Notably, surveillance footage of the January 29 robbery clearly showed a woman who was not pregnant. Woodruff’s ordeal extended beyond the arrest: she reportedly suffered contractions and back spasms while in custody and required intravenous fluids at a local hospital for dehydration. She was released on a $100,000 personal bond later that night, and the charges were eventually dismissed for insufficient evidence.

A Pattern of Misidentification

Woodruff’s case is not an isolated incident. According to the ACLU, she is the sixth person known to report being falsely accused of a crime because of a facial recognition error, and the first woman. Every person falsely accused in these cases has been Black. The ACLU has represented several of those affected, including a man who is suing the Detroit Police Department over a similar facial recognition misidentification in 2020. Woodruff’s is the third known wrongful arrest tied to the DPD’s use of the technology in recent years.

A Call for Change and Accountability

The ACLU’s Phil Mayor emphasized the urgent need for reform, stating, “It’s deeply concerning that the Detroit Police Department knows the devastating consequences of using flawed facial recognition technology as the basis for someone’s arrest and continues to rely on it anyway.” His remarks reflect a growing demand to end police use of a technology that has perpetuated injustice and eroded trust in law enforcement.

Expanding Impact: Beyond Law Enforcement

The harms of false facial recognition matches extend beyond law enforcement. Misidentifications have surfaced in a range of settings, highlighting the bias built into the technology. In 2021, a Detroit-area roller skating rink wrongly flagged a Black teenager as someone banned from the establishment. Public housing officials, meanwhile, have used facial ID systems to surveil and evict residents, often with little oversight, exposing the far-reaching consequences of unchecked use.

Looking Forward: A Call for Accountability and Reform

The unsettling reality of facial recognition software’s inaccuracies demands immediate action. The Woodruff case and others like it underscore the need for accountability, transparency, and comprehensive reform, so that a technology with this much power does not perpetuate injustice or violate individual rights.

By Impact Lab