Discussion
Police used AI facial recognition to arrest a Tennessee woman for crimes committed in a state she says she’s never visited
jqpabc123: AI is a liability issue waiting to happen. And this is just another example.
garyfirestorm: It’s a tool. Used incorrectly, it will lead to errors. Just like a hammer, used incorrectly, could hit the user's finger.
happytoexplain: There is enormous variability in how hard a tool is to use correctly, how likely it is to go wrong, and how severe the consequences are. AI has a wide range on all those variables because its use cases vary so widely compared to a hammer. The use case here is police facial recognition. Not hitting nails. The parent wasn't saying "AI is a liability" with no context.
suzzer99: Dynamite is a tool. But we don't hand it out to anyone who wants to play with it.
mikkupikku: We used to until quite recently. Anybody could buy dynamite at the hardware store. We had to end this because of criminals using it to hurt people.
jqpabc123: Look for AI to follow a similar trajectory over time.
mikkupikku: When somebody uses a tool to hurt somebody, they need to be held accountable. If I smack you with a hammer, that needs to be prosecuted. Using AI is no different. The problem here is incidental to the tool; it was done by the cops and therefore nobody will be held accountable.
jqpabc123: Only one small little problem --- there is no way to tell if you are using it "correctly". The only way to be sure is to not use it. Using it basically boils down to, "Do you feel lucky?" The Fargo police didn't get lucky in this case.
skeeter2020: AI feels closer to a firearm than a hammer when assessing law enforcement's ability to quickly do massive, unrecoverable harm.
GaryBluto: Impossible at this point. You cannot download dynamite.
tgv: This tool, however, is specifically built for mass surveillance. It serves no other purpose. The tool is broken, and everybody knows it. The tool makers are at least as guilty as those who use it.
cyanydeez: The tool, like Google search, is likely biased towards returning results regardless of confidence.
casey2: Now cruel people wield a two-tiered shield. It's not an accident that this happened to a woman, but make no mistake they are coming for men next.
jstanley: You think they deliberately chose to do this to a woman? Why?
cyanydeez: Probably just reading the room, with states like Texas making abortion illegal and allowing random citizens to enforce that. Famously, abortion is a woman thing. Anyway, looking through the facts, it's just some random woman. There's better evidence that these facial recognition systems are much worse with minorities than across genders. Interesting biases: own-gender bias: https://pmc.ncbi.nlm.nih.gov/articles/PMC11841357/ Racial bias: https://mitsloan.mit.edu/ideas-made-to-matter/unmasking-bias... Miss rates: https://par.nsf.gov/servlets/purl/10358566 Although you can probably interpret the facts differently, we've seen how any search function gets enshittified: once people get used to searching for things, they tend to pick a system that returns results over one that fails to return results. Rather than blaming themselves, users blame the search system. As such, any search system over time will bias towards returning something (e.g., Outlook), rather than towards accuracy. So if these systems more easily miss certain classes of people, such as women and minorities, those people will more likely be surfaced as inaccurate matches, while men will have a higher confidence of being screened out. That's how I interpret this two-second comment.
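The mechanism described above can be sketched with a toy example. This is not the FaceSketchID algorithm, and all names, scores, and the 0.5 threshold are hypothetical, purely illustrative values: the point is only that a matcher which always surfaces its top-scoring candidate produces a false identification for every out-of-gallery probe, and that even a confidence threshold fails for a group whose spurious similarity scores run systematically high.

```python
def identify(gallery_scores, threshold=None):
    """Return the best-scoring gallery identity, or None when a
    threshold is set and no candidate's score clears it."""
    best_id, best_score = max(gallery_scores.items(), key=lambda kv: kv[1])
    if threshold is not None and best_score < threshold:
        return None
    return best_id

# Both probes are NOT in the gallery, so every score below is
# similarity to the wrong person. Hypothetical assumption: the model
# produces inflated spurious scores for an under-modeled group.
scores_well_modeled = {"alice": 0.31, "bob": 0.28, "carol": 0.35}
scores_under_modeled = {"alice": 0.62, "bob": 0.55, "carol": 0.71}

# With no threshold, both probes get "matched" to an innocent person.
print(identify(scores_well_modeled))    # prints "carol"
print(identify(scores_under_modeled))   # prints "carol"

# A confidence threshold correctly rejects the low-score probe, but the
# under-modeled group's inflated scores still clear it.
print(identify(scores_well_modeled, threshold=0.5))   # prints "None"
print(identify(scores_under_modeled, threshold=0.5))  # prints "carol"
```

In other words, thresholding only screens out false matches for the populations the model scores conservatively; groups it scores erratically keep surfacing as confident-looking hits.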
gtowey: It's the opposite, it's absolution from liability. "The AI did it" is the ultimate excuse to avoid accepting responsibility and consequences.
jqpabc123: Courts are already refusing to accept this excuse. https://pub.towardsai.net/the-air-gapped-chronicles-the-cour...
MattDaEskimo: What kind of outcome results from misuse? Clearly a hammer's misuse has very little in common with a global, hivemind network used in high-stakes campaigns. Now, if I misused a hammer and it hurt everyone's thumb in my country, then maybe what you said would have some merit. Otherwise, I'd say it's an extremely lazy argument.
jfengel: Now the "qualified" immunity kicks in.
jqpabc123: We will find out. But relying on AI is likely to cost the Fargo police in one way or another. https://www.lawlegalhub.com/how-much-is-a-wrongful-arrest-la...
oopsiremembered: Money quote from someone quoted in the article:"[I]t’s not just a technology problem, it’s a technology and people problem."I can't. I just can't.
nkrisc: Some basic investigatory police work (the kind they did before AI) would have revealed the mistake before an innocent woman’s life was destroyed.
mikkupikku: Yes, regulation is inevitable.
jfengel: Regulation is impossible. The AI barons literally control the federal government, so not even state regulations get tried.
jqpabc123: Yes. But doing the investigation negates much of the incentive for using AI. Look for something similar to play out elsewhere --- using unreliable tools is not a good, responsible business plan. And lawyers are just waiting to press the point.
bornfreddy: AI can provide leads. Someone still needs to verify them and decide.
tlogan: This is a weak or misleading story about AI. First, the detective used the FaceSketchID system, which has been around since around 2014. It is not new or uniquely tied to modern AI. Second, the system only suggests possible matches. It is still up to the detective to investigate further and decide whether to pursue charges. And then it is up to the court to issue the warrant. The real question is why she was held in jail for four months. That is the part that I do not understand. My understanding is that there is a 30-day limit (the requesting state must pick up the defendant within 30 days). Regarding the individual involved, Angela Lipps, she has reportedly been arrested before, so it is possible she was on parole. So maybe they were holding her because of that? Can someone clarify how that process works?
suzzer99: I admit I was surprised to see you could buy dynamite in a hardware store until 1970.
suzzer99: In the US there are no consequences for people in power failing to follow procedures, laws or regulations - except for being told to stop doing whatever illegal thing they're doing, and possibly getting sued way down the line, which gets paid by taxpayers.
suzzer99: I would say much more likely that it was because she was poor and couldn't afford a good lawyer.
IncreasePosts: What? Women are much more sympathetic figures when it comes to crime and punishment. And there are 10x more men in prison in America than women. If you were trying to "introduce" some nefarious law enforcement system to the US, you would use it on undesirable men first (drug addicts and gang members).