Courts Are Facing an AI Tidal Wave
May 22, 2025
Here is my recent Daily Record column. My past Daily Record articles can be accessed here.
****
Courts Are Facing an AI Tidal Wave
If you haven’t noticed that artificial intelligence (AI) is already impacting the practice of law, then you’re not paying attention. Case in point: Judges are increasingly facing decisions about the admission of AI-created evidence at trial, and this is just the beginning. Courts will face a tsunami of AI-generated evidence, and I’m not sure we’re prepared for what’s coming.
That this is occurring isn’t surprising. The justice system has grappled with digital evidence for decades now. The first big change occurred in 2006 when the Federal Rules of Civil Procedure were amended to explicitly address electronic discovery. Another major trend was social media, which had a significant impact on trials as lawyers and judges struggled to keep up with the rapid influx of evidence mined from new and emerging social media sites.
AI is the next wave of technology that will leave its mark on trials. The difference is that it’s occurring at a much faster rate, and it has the potential to have a far more dramatic impact on the administration of justice than any of the technologies that preceded it.
It’s already cropping up in courts across the country in many different contexts. Consider the following recent examples.
Most recently, earlier this month, at an Arizona sentencing hearing, a victim impact statement from the deceased victim of a road rage incident was presented in the form of an AI-generated video. The dialogue was written by his sister. The judge permitted the statement and considered it when sentencing the defendant to the maximum, which exceeded the prosecution’s recommendation.
In April, at an argument before the New York State Supreme Court Appellate Division’s First Judicial Department, a pro se litigant whose voice had been impacted by throat cancer attempted to argue his appeal via a video of an AI-generated lawyer. The Court quickly put a stop to it and ordered the litigant to proceed on his own.
In another case, State of Washington v. Puloka, the defense sought to introduce AI-enhanced versions of a bystander’s iPhone video of the alleged criminal incident. The judge declined to admit the videos into evidence, deeming them unduly prejudicial.
In a California case arising from the death of a man killed while riding in his Tesla with its self-driving feature engaged, his family sought to introduce a video of Elon Musk discussing the safety of that feature. The defense objected, claiming that the video could have been a deepfake. The court rejected that challenge.
In January, a Florida judge in the 17th Judicial Circuit donned a VR headset in a criminal case. The headset was provided by the defense in a stand-your-ground case and allowed the judge to “view” the wedding reception where the alleged assault occurred from the “perspective” of the defendant.
Finally, another notable intersection of judges and AI occurred just last week. It was reported that a newly elected Broward County Judge, Lauren Peffer, faces a judicial ethics complaint alleging that she promoted an AI-generated audio recording during her campaign that falsely depicted her opponent making sexually explicit and inappropriate remarks.
In other words, AI isn’t coming—it’s already here, and it’s changing how evidence is created, presented, and evaluated in courtrooms across the country. From deepfakes to virtual reality to synthetic speech, these tools are influencing decisions in ways we’ve never seen before.
Ready or not, the volume and complexity of AI-generated evidence will only grow. To maintain the integrity of our justice system, we need to start thinking now about how to address it, both practically and ethically. Otherwise, we risk letting untested, unreliable, or manipulative AI-generated outputs undermine the very system we rely on to deliver fair and just outcomes.
Nicole Black is a Rochester, New York attorney, author, journalist, and Principal Legal Insight Strategist at MyCase, CASEpeer, Docketwise, and LawPay, practice management and payment processing tools for lawyers (AffiniPay companies). She is the nationally recognized author of "Cloud Computing for Lawyers" (2012) and co-author of "Social Media for Lawyers: The Next Frontier" (2010), both published by the American Bar Association. She also co-authors "Criminal Law in New York," a Thomson Reuters treatise. She writes regular columns for Above the Law, ABA Journal, and The Daily Record, has authored hundreds of articles for other publications, and regularly speaks at conferences regarding the intersection of law and emerging technologies. She is an ABA Legal Rebel, and is listed on the Fastcase 50 and ABA LTRC Women in Legal Tech. She can be contacted at [email protected].