Nate Freed Wessler
@NateWessler
Followers: 3K · Following: 61 · Media: 13 · Statuses: 746
Deputy director of @ACLU Speech, Privacy, and Technology Project. Views expressed here are my own.
Joined June 2012
Congratulations to reporter @dmac1 for important and original reporting on a pervasive but hard-to-quantify problem. Secrecy in the criminal enforcement system can be extremely hard to root out. The WashPost article connects the dots in a powerful way.
0
0
4
@washingtonpost @ACLU Face recognition tech is dangerous in a number of ways, and lawmakers’ best option is to bar police from using it. But at a bare minimum, the government must inform people when FRT has been used against them. Failing to do so can have dire results.
1
0
2
@washingtonpost @ACLU Prosecutors must disclose that information to defendants. In the language of the Supreme Court's decision in Brady v. Maryland, information about use of FRT is “material” to the defendant’s case, even if police don’t intend to introduce the FRT result as evidence in court.
1
0
1
@washingtonpost @ACLU That’s why a major component of the pathbreaking settlement agreement in the Robert Williams wrongful arrest case requires police to keep records about every use of face recognition tech, and to send those records to the prosecutor in any case that results in criminal charges.
1
0
0
@washingtonpost @ACLU When police hide their use of face recognition technology from the defense, it strips people of the ability to explain why a photo lineup or other identification procedure is tainted by the particular failures of the technology and is therefore unreliable.
1
0
0
@washingtonpost @ACLU That’s what happened to our client Robert Williams, and to others like Porcha Woodruff, Michael Oliver, and Nijeer Parks. You can read about Mr. Williams’s case, which resulted in a landmark settlement with the Detroit Police Department, here:
aclu.org
This case seeks to hold Detroit police accountable for the wrongful arrest of our client due to officers’ reliance on a false match from face recognition technology.
1
1
0
@washingtonpost @ACLU The witness is presented with an image of an innocent doppelganger. It’s no surprise that in case after case, witnesses have incorrectly said the innocent person flagged by face recognition tech looks like a match to the suspect. The FRT result taints the investigation.
1
0
1
@washingtonpost @ACLU The known face recognition tech wrongful arrests have mostly involved police "confirming" the FRT result with a photo lineup or other ID procedure. But because FRT is designed to find similar-looking faces, when it generates a false match it'll often be a lookalike to the suspect.
1
0
1
@washingtonpost @ACLU One reason disclosure matters is that erroneous face recognition technology results can taint the rest of an investigation, even if the FRT result is never shown to a judge or jury. I wrote about that problem here:
aclu.org
Even when police heed warnings to take additional investigative steps, those steps can exacerbate the unreliability of face recognition results.
1
2
2
@washingtonpost Police & prosecutors argue they are only required to inform criminal defendants of evidence that will be used at trial. They say they only use face recognition tech to generate a lead, not as evidence for trial. As @ACLU has explained, that's no excuse.
1
1
1
@washingtonpost To date, we know of at least 7 wrongful arrests due to police reliance on erroneous face recognition results. That is likely a radical undercount, and this reporting shows why. When police hide even the fact that they used this tech, the reason for a wrongful arrest stays secret.
1
1
1
@washingtonpost This violates constitutional due process requirements, leaving people unable to adequately defend themselves against false identifications. Face recognition tech is unreliable and often produces false matches. Secrecy prevents people from showing where an investigation went wrong.
1
2
3
Phenomenal reporting in @washingtonpost shows an epidemic of police hiding their use of face recognition technology from people accused of crimes and their defense attorneys. (1/x)
washingtonpost.com
A Post investigation found that many defendants were unaware of the technology’s role in linking them to crimes, leading to questions of fairness.
3
13
18
In November, a Georgia man was arrested for a crime in Louisiana, a state he said he'd never been to. He spent 6 days in jail. We found his arrest was based on a wrong facial recognition match and supported by a cascade of technologies intended to make policing easier. https://t.co/HnL0POZGNq
nytimes.com
Because of a bad facial recognition match and other hidden technology, Randal Reid spent nearly a week in jail, falsely accused of stealing purses in a state he said he had never even visited.
19
334
675
For the last three months, @RMac18 & I have been trying to get to the bottom of exactly why a man in Georgia was jailed for stealing purses in Louisiana, a state he'd never been to. It started with facial recognition and snowballed from there. https://t.co/GQNEWUTlyE
15
199
387
BREAKING: We just released hundreds of documents showing how Arizona created a nationwide surveillance program to track Americans’ personal money transfers. It's one of the largest government surveillance programs in recent history.
328
7K
25K
“Ordinary people’s private financial records are being siphoned indiscriminately into a massive database, with access given to virtually any cop who wants it,” said @NateWessler of @aclu, which obtained files revealing the breadth of access among local, state and federal agencies.
2
6
12
BREAKING: Today we asked the Supreme Court to hear Moore v. US as we challenge long-term, warrantless camera surveillance of people’s homes by police. We have a Fourth Amendment right to privacy — and we need a clear ruling to protect us against this invasive surveillance.
7
102
351
When states try to pass biometric privacy laws, the constant refrain from opposition is: wHaT aBoUt SmAlL bUsInEsSeS? Today, in @PressHerald, two bookstore owners dispelled the myth that small businesses are against Maine's biometric privacy bill.
pressherald.com
It’s in the interest of Big Tech to defeat L.D. 1945, which would create guardrails on how companies can use our personal identifiers – our fingerprints, faces and voices.
1
5
8
We just filed public records requests asking the Phoenix field office of ICE’s Homeland Security Investigations and the Arizona attorney general for records regarding the illegal mass collection of people’s financial records.
5
63
206