A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given that too many people seem to think applying an AI filter can give them access to secret visual data.
No computer algorithm can accurately reconstruct data that was never there in the first place.
Ever.
This is an ironclad law, just like the speed of light and the acceleration of gravity. No new technology, no clever tricks, no buzzwords, no software will ever be able to do this.
Ever.
If the data was not there, anything created to fill it in is by its very nature not actually reality. This includes digital zoom, pixel interpolation, movement interpolation, and AI upscaling. It preemptively also includes any other future technology that aims to try the same thing, regardless of what it’s called.
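To make the point concrete, here’s a minimal sketch (plain Python, no libraries, toy 2×2 “image” I made up) of the simplest of those techniques: bilinear interpolation, the math behind basic digital zoom. Every new pixel is just a weighted average of its real neighbors — a plausible-looking value the camera never actually recorded.

```python
# Minimal sketch: 2x "zoom" of a tiny grayscale image via bilinear
# interpolation. The in-between pixels are averages the sensor never
# recorded -- plausible-looking, but invented.

def bilinear_upscale_2x(img):
    """Double the resolution of a 2D list of pixel values."""
    h, w = len(img), len(img[0])
    out_h, out_w = h * 2, w * 2
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map output coords back to (fractional) source coords.
            sy = y * (h - 1) / (out_h - 1)
            sx = x * (w - 1) / (out_w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Weighted blend of the four nearest *real* pixels.
            out[y][x] = (img[y0][x0] * (1 - fy) * (1 - fx)
                         + img[y0][x1] * (1 - fy) * fx
                         + img[y1][x0] * fy * (1 - fx)
                         + img[y1][x1] * fy * fx)
    return out

src = [[0, 100],
       [100, 0]]   # 2x2 "image": the only values that exist are 0 and 100
big = bilinear_upscale_2x(src)
invented = {round(v) for row in big for v in row} - {0, 100}
print(sorted(invented))  # -> [33, 44, 56, 67] -- none of these were in the source
```

Twelve of the sixteen output pixels hold values that appear nowhere in the source. That’s fine when you just want a smoother-looking photo; it’s a problem the moment anyone treats those pixels as evidence. AI upscaling does the same thing, except the fill-in comes from its training data instead of a weighted average, so the invented detail looks even more convincing.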
No computer algorithm can accurately reconstruct data that was never there in the first place.
Okay, but what if we’ve got a computer program that can just kinda insert red eyes, joints, and plumes of chum smoke on all our suspects?
Imagine a prosecution or law enforcement bureau that has trained an AI from scratch on specific stimuli to enhance and clarify grainy images. Even if they were all totally on the up-and-up (they aren’t, ACAB), training a generative AI or similar on pictures of guns, drugs, masks, etc. for years will bake in bias. And since AI makers pretend the logic can’t be deciphered (I’ve literally seen compositional/generative AI that shows its work), they’ll never realize what it’s actually doing.
So then you get innocent CCTV footage this AI “clarifies” and it pattern-matches every dark blob into a gun. Black iPhone? Maybe a pistol. Black umbrella folded up at a weird angle? Clearly a rifle. And so on. I’m sure everyone else can think of far more frightening ideas, like auto-completing a face based on previously searched ones, or just plain-old institutional racism bias.
just plain-old institutional racism bias
Every crime attributed to this one black guy in our training data.