You may have heard about the allegations that began in January 2024 against the principal of Pikesville High School in Baltimore County: an audio recording appeared to capture him making racist and antisemitic comments about students, teachers and parents. News flash from late April 2024: it was a “deepfake,” an artificial intelligence (AI)-generated recording.
As the principal had maintained all along, he never made the comments and the voice in the recording was not his. Baltimore County Police vindicated the principal after detectives determined that the school’s former athletic director, who had been under investigation for theft of school funds and for firing a coach without approval, made the recording in retaliation against the principal. The athletic director was suspended, arrested and criminally charged.
In the intervening three months, however, as the audio recording spread via social media, the principal was subjected to defamation and threats of violence, discord and distrust took hold in the school community, and police stepped up security at the high school. The principal had to take an administrative leave of absence.
What is a “deepfake”? Deepfakes are artificially created audio files, videos and photos of people and events that look and sound real but depict things that never happened. As AI technology evolves, detecting deepfakes will become more challenging for experts, and deepfake detection technology will have to improve in turn.
Laws aimed at curbing deepfakes and punishing those who make them will have to be enacted, and some states have begun doing so. For example, Tennessee recently passed the ELVIS Act, which makes it a crime to use someone’s voice in a deepfake for commercial purposes. It also provides both a civil right of action against the perpetrator and an action against any AI entity that “distributes, transmits, or otherwise makes available an algorithm, software, tool, or other technology, service, or device” used in creating the deepfake. In Maryland, several bills were introduced in the legislature but did not reach the governor’s desk before the end of the legislative session.
The authenticity of evidence in legal proceedings is paramount. To establish the facts of a case, the parties may rely on photos, video or audio recordings. For these items to be admissible in a court of law, their authenticity must be established. Attorneys will have to take care to ensure that the evidence their clients present is genuine, and experts may need to be retained to verify the authenticity of exhibits.
Another area of the law in which AI is affecting cases is the drafting of legal papers. Some lawyers who have used AI to generate briefs and other court documents have been sanctioned for including fake case citations that the AI platform produced; those attorneys should have checked the citations to confirm that the cases were real. Supreme Court Chief Justice John Roberts Jr. discussed technology and the use of AI in his 2023 year-end report on the federal judiciary. From typewriters to computers to AI, technology has changed the way law is practiced, and he acknowledged that “AI can help … (but) any use of AI requires caution and humility.”
Call David Diggs for actual intelligence, not artificial intelligence. If you find yourself the victim of a deepfake, you will have questions about what redress is available to you. You should consult with an attorney who can help you make informed decisions. If you need further information regarding this subject, contact the Law Office of David V. Diggs LLC, located at 8684 Veterans Highway, Suite 302, in Millersville, by calling 410-244-1189 or by emailing david@diggslaw.com.