Discussion
mvrckhckr: AI is a tool. It is humans who abdicate their responsibility (and thinking).
bradley13: Really, it's more about the police not doing their job. Face recognition pointed her out, the police saw she had a rap sheet, and therefore they didn't check further. She apparently could not afford a lawyer, who would have pointed out that she was provably at home (transactions, etc.) at the time the crime was committed in another state. Really it's not specifically the AI's fault, though it made the error easier.
mkoubaa: Give them a hammer and everything becomes a nail
righthand: There's no better comparison to chimps with a gun than cops with technology.
mft_: Quite; AI contributed to a (criminally?) inept and negligent "justice" system ruining an innocent woman's life. The AI was akin to an unreliable eye-witness in this case, although people's trust in the AI's judgement may have been higher than a human eyewitness?
santoshalper: I still wouldn't let AI off the hook here. Every link in the chain has to be accountable for fuckups. You don't get to pass it along to the supposed "human in the loop" when you fail spectacularly. That's how we end up with shitty "almost works" AI.
odshoifsdhfs: But have they tried the latest models? I understand this is from October last year, but Opus 4.6 is night and day. I wasn't a believer, but this latest model changed everything. It hasn't sent any innocent person to jail yet and identified all my neighborhood creeps 100%. /s
righthand: I’m sure the cops got a slap on the wrist and their lives are fine. ACAB.
wat10000: Computers often serve as a tool for the avoidance of responsibility.
ahazred8ta: Ditto the 1982 Lenell Geter case -- he was sent to prison based on a faulty witness ID. https://www.LenellGeter.com/Content/About/ -- https://exonerationregistry.org/cases/4406
mft_: Sure, the AI contributed, but it was far less responsible overall than the humans in this case. Don't let the AI system off the hook by all means, but by focusing on it to this extent, the narrative ignores (deliberately?) the hugely negligent actions of the police et al involved.
jjj123: I agree, but I think the broader point here is that any automated system is a way to offload accountability. And it will be used for that without a doubt, no matter how "good" the officers or human processes are. So it's still reasonable to be skeptical of (or outright reject) the use of the technology in systems that can ruin or end people's lives.
rectang: Howitzers are also tools, but we don't let just anyone own and operate them.
alcomatt: AI, or more precisely the way it is being sold to us, is the most responsible factor here. People by nature are lazy and will take shortcuts given an opportunity. AI is the ultimate shortcut these days, a "mental crutch" that the majority of people using it are leaning on. Humans just did what they always do: be lazy. AI should never have been used for processes with this level of life-altering impact, because what happened here was bound to happen.