The notion of “deepfake evidence” — evidence doctored by artificial intelligence — is not something from a science-fiction novel. Such falsified evidence can appear in a Georgia courtroom and, for that matter, anywhere in the world. Advances in artificial intelligence have made it possible to convincingly alter video, audio and text.
In some instances, the altered material comes off as patently fraudulent. Other fabrications, however, are made with skill and precision, and such deepfake evidence can prove far more compelling and believable.
In motion pictures, computer-generated imagery can make almost anything appear real onscreen. Creating fake evidence, however, does not require a multimillion-dollar budget. Even a simply altered photograph could lead to criminal charges and, possibly, a conviction.
Imagine a photograph of someone leaving a crime scene. A shadowy figure’s face appears exposed in the image. After a positive identification of the individual, criminal charges may follow. Such a course of legal events seems straightforward and difficult to refute in court. The problem is that the photo isn’t real.
The reasons for going to such lengths to falsify evidence vary. Perhaps one person has much to gain from seeing another face criminal charges and a conviction. Think of a couple involved in a bitter, costly divorce in which the division of assets is significant. If one party spends substantial sums to have the spouse’s face placed over an actor’s in a photo to “prove” stalking or trespassing, that fabrication could influence the court.
Of course, falsifying evidence is illegal and could land the guilty party in prison. Still, some likely believe they can get away with it. If the fabricated evidence looks real enough, why wouldn’t it work?
For one, a criminal defense attorney can cast doubt on the evidence. If the accused has a clear-cut alibi, such as being at a business meeting with 10 other people in another town at the time of the alleged incident, the evidence looks weak. A lawyer may also undermine the evidence’s validity by asking for corroborating proof, which never turns up. While deepfake evidence may be compelling, it isn’t foolproof.