
Deepfake Distress

Ten years ago, if someone was discussing “deep fakes,” they might be discussing the threat of Tom Brady pump-faking a throw to Julian Edelman flying down the sidelines, before dumping the football off to Wes Welker underneath for a first down. However, today’s “deepfakes” raise an entirely different threat.

“Deepfake” is the common term for an audio or video recording that appears to show someone saying or doing something they never actually said or did. By 2018, technology had advanced to the point where a non-expert could create convincing deepfakes in a matter of hours using freely available software. Indeed, it was around that time that we began to see deepfakes appear on social media, including one retweeted by then-President Trump purporting to show Nancy Pelosi slurring her words. More recently, in 2022, a deepfake video of Ukrainian President Volodymyr Zelenskyy circulated on social media, falsely showing Zelenskyy telling his soldiers to lay down their arms and surrender in the ongoing conflict with Russia. The linked article by Cassandre Coyer includes my comments on some of the impacts of deepfakes on our litigation system.

How can the litigation system cope? One answer is that lawyers have always been required to lay a proper foundation for any evidence they wish to present, and that includes authenticating the evidence. Opposing counsel have long been expected to challenge questionable evidence, and judges are required to perform a “gatekeeping” function to help ensure that fake evidence is not used. While none of those requirements has changed, meeting them has become more difficult now that we have to worry about deepfakes. Expect an increase in discovery aimed at the authenticity of digital evidence, and an increase in the retention of experts to try to differentiate genuine evidence from fake evidence. Yesterday’s handwriting experts are giving way to today’s and tomorrow’s forensic analysts.

As I noted in my comments to Ms. Coyer, the biggest negative impact may not be the extra vigilance required to evaluate audio and video evidence – instead, it may be a growing distrust of genuine evidence as the general public realizes that digital evidence is not very difficult to fake. In an era where “truth” is increasingly treated as subjective rather than objective, this should be an issue of great concern for all of us who participate in—or care about—our justice system.


ediscovery, deepfake, evidence, fake, authentication, digital, audio, visual, e-discovery