Face Recognition Software Led to His Arrest. It Was Dead Wrong

Alonzo Sawyer’s misidentification by algorithm made him a suspect for a crime police now say was committed by someone else—feeding debate over regulation.
ILLUSTRATION: JAMES MARSHALL; GETTY IMAGES

Carronne Sawyer took the week off work to get her husband Alonzo out of jail. She knew he was asleep on the couch with her at the time police alleged he assaulted a bus driver near Baltimore and stole their smartphone. But an intelligence analyst using face recognition software had labeled him a possible match with the suspect seen on CCTV footage from the bus, police records show, and an officer had confirmed it.

At a police station and in a meeting with her husband’s former parole officer, the person who had confirmed the software’s suggested match, Carronne drew attention to details in photos on her phone taken recently by her daughter. Her husband is taller than the suspect in the video, she explained, and has facial hair and gaps between his teeth. His right foot slews out when he walks, something she did not see in video footage of the attack.

“I said my husband is 54 years old. This guy looks like he could be our son,” Carronne says. Alonzo was eventually released after nine days in jail, she says, during which time he missed his wife’s Gladys Knight tribute show and his work as a barber, and could not complete a construction contract he had secured. “I’m just grateful I was able to do all the labor and running around, because had I not he would still be sitting there for something he didn’t do,” Carronne says. The Sawyers’ ordeal took place in spring 2022 but has not previously been reported.

Around the time Alonzo was released, the victim in the bus incident identified another man, Deon Ballard, as the suspect in the video. Ballard is 7 inches shorter and more than 20 years younger than Sawyer, according to charging documents. Ballard’s mother and a police officer who arrested him confirmed that identification, one document shows, and he is due to stand trial in April.

Maryland Transit Administration Police did not respond to repeated requests for comment and deputy state’s attorney for Baltimore County John Cox declined to confirm Ballard and Sawyer were arrested for the same crime. WIRED was unable to speak with Alonzo Sawyer, who is serving time in a Maryland jail on a charge unrelated to the bus incident.

The Alonzo Sawyer case adds to just a handful of known instances of innocent people getting arrested following investigations that involved face recognition misidentification—all have been Black men. Three cases came to light in 2019 and 2020, and another last month, in which Georgia resident Randal Reid was released from jail after a judge recalled an arrest warrant linking him to thefts of designer purses in Louisiana.

Carronne Sawyer recalled her husband’s experience in public this month, calling in to the Maryland State House by video chat to speak in support of a proposed law to restrict police use of face recognition. The technology is largely unregulated in the US, although a wave of local restrictions and even outright bans has been enacted in recent years.

Debates that led to those policies have often focused on harms from police use of face recognition algorithms, such as chilling effects on free speech and protest, or the disproportionate use of surveillance tools against communities of color. In Baltimore, Sawyer’s case provided a more tangible reminder of the reasons to restrict the technology.

Charles Sydnor, a Maryland state senator for Baltimore County, says that learning of Alonzo Sawyer’s case in fall 2022 inspired him to reintroduce the senate version of the proposed bill regulating face recognition, after a version failed to pass last year. “Not only is it in Maryland, but it’s in my backyard, my home jurisdiction,” Sydnor says. “My suspicion is you may have some in law enforcement say, well, the man got freed in nine days, so the system works. If face recognition kicks off investigations that land innocent people in jail, there’s a problem.”

Sydnor has been trying to put guardrails on face recognition for years. In 2020 he introduced a bill that would have placed a one-year moratorium on state and local government use of the technology. False arrests of Black men after incorrect matches by face recognition software that began to come to light later that year brought a renewed sense of urgency.

Face recognition systems have a history of misidentifying people with dark skin, and more than 60 percent of Baltimore residents identify as Black. Sydnor says he feels a pressing need to get regulation into place to protect their rights. He pivoted to proposing restrictions on face recognition that fall short of a ban after concluding that the technology was too widespread for a ban to be practical.

Sydnor’s proposed bill and an equivalent introduced in the Maryland legislature’s other chamber, the House of Delegates, would limit police use of face recognition to cases involving violent crimes, human trafficking, or “ongoing threat to public safety or national security.” They would also restrict police to searching for face matches in only databases of driver’s license and mug shot photos, putting off-limits services like that of startup Clearview AI, which scraped billions of face images from the web, including from social media.

The bills would also require annual reports detailing police use of the technology, proficiency tests for the human analysts who pick possible matches from a list generated by an algorithm, and evidence beyond a face recognition match before police can make an arrest.

Sydnor concedes that the proposed bill may not prevent the next case like that of Alonzo Sawyer, but he hopes it will still lead to better outcomes. “This bill was introduced as a compromise. It certainly isn’t as strong as I wanted it to be,” Sydnor says. “They’re not going to stop using [face recognition]. So long as there’s nothing in place, they’re going to continue using it unregulated.”

The proposed Maryland bills were developed with input from a working group that saw state lawmakers meet with prosecutors and public defenders, law enforcement agencies, and civil liberties groups like the ACLU and the Innocence Project.

Maryland is a unique place to debate face recognition regulation, says Andrew Northrup, an attorney in the forensics division of the Maryland Office of the Public Defender. He calls Baltimore “a petri dish for surveillance technology,” because the city ranks highest in per capita police spending among 72 major US cities, according to a 2021 analysis by the nonprofit Vera Institute of Justice, and has a long history of using surveillance technology in policing.

The use of invasive surveillance technology including face recognition in Baltimore during protests following the 2015 death of Freddie Gray led former House Oversight and Reform Committee chair Elijah Cummings to interrogate the issue in Congress. And in 2021, the Baltimore City Council voted to place a one-year moratorium on face recognition use by public and private actors, but not police, that expired in December.

Northrup spoke in favor of the bill and its requirement for proficiency testing at the same House of Delegates Judiciary Committee hearing addressed by Carronne Sawyer this month. He warned that as use of the technology becomes more common, bad face recognition could replace bad eyewitness identification as a major source of wrongful convictions. Most people are bad at recognizing strangers, Northrup says, even when assisted by an algorithm.

Organizations representing Maryland police and prosecutors took part in the working group that shaped the proposed bill but have nonetheless raised objections. In the Judiciary Committee hearing, Maryland Chiefs of Police Association president Russ Hamill said that what happened to Alonzo Sawyer was horrifying, but he spoke in opposition to the bill, saying it too tightly restricted the types of cases in which face recognition could be used and complaining about its limits on which photo databases police could search.

Nick Picerno, a police captain for Montgomery County, a suburban jurisdiction bordering Washington, DC, also said those parts of the bill would hinder law enforcement. He said officers in his department have previously used the technology to identify an indecent exposure suspect caught on a doorbell camera and to identify a child abuse victim in a TikTok video. He asked that the proposal be modified to allow use of face recognition to identify both suspects and witnesses in many more categories of crime, including firearm possession, child pornography, domestic violence, and cruelty to animals.

Deborah Levi, a public defender in Baltimore, told the hearing that her public records requests indicated that the Baltimore Police Department alone used face recognition more than 800 times in 2022. In one case, police ran an Instagram photo of a person holding a gun through face recognition software, then secured a no-knock warrant for the address of the person suggested as a match, she said.

Carronne Sawyer supports the proposed law because she believes its stipulation that face recognition “may not serve as the sole basis for positive identification” would have made a difference in her husband’s case. His ordeal changed how she feels around police and took away her faith in due process, she says, leaving her convinced that society urgently needs regulation like that under discussion in Maryland.

“I’m just thinking about how many other people have gone through what my husband had to go through and didn’t have anybody to fight for them,” she says. “How many people are sitting in jail now for something they didn’t do because of facial recognition and law enforcement agencies not doing their due diligence?”

The Maryland state legislature adjourns in April and won’t meet again until January 2024. If the proposed bills do not pass before then, police use of face recognition will remain unregulated in the state for at least another year.