Florida teens arrested for creating ‘deepfake’ AI nude images of classmates


The arrests and felony charges may be the first related to the sharing of AI-generated explicit images.


Illustration by Haein Jeong / The Verge

Two Florida middle schoolers were arrested in December and charged with third-degree felonies for allegedly creating deepfake nudes of their classmates. A report by Wired cites police reports saying the two boys, aged 13 and 14, are accused of using an unnamed “artificial intelligence application” to generate explicit images of other students “between the ages of 12 and 13.”

They were charged under a 2022 Florida law that criminalizes the dissemination of deepfake sexually explicit images without the victim’s consent. Both the arrests and the charges appear to be the first of their kind in the nation related to the sharing of AI-generated nudes.

Local media reported on the incident after the students at Pinecrest Cove Academy in Miami, Florida, were suspended December 6th, and the case was reported to the Miami-Dade Police Department. According to Wired, they were arrested on December 22nd.

Minors creating AI-generated nudes and explicit images of other children has become an increasingly common problem in school districts across the country. But outside of the Florida incident, none of the cases we’d heard of had led to an arrest. There’s currently no federal law addressing nonconsensual deepfake nudes, leaving states to tackle the impact of generative AI on child sexual abuse material, nonconsensual deepfakes, and revenge porn on their own.

Last fall, President Joe Biden issued an executive order on AI that asked agencies for a report on banning the use of generative AI to produce child sexual abuse material. Congress has yet to pass a law on deepfake porn, but that could change soon: both the Senate and House introduced legislation this week, known as the DEFIANCE Act of 2024, and the effort appears to have bipartisan support.

Although nearly all states now have laws on the books that address revenge porn, only a handful of states have passed laws that address AI-generated sexually explicit imagery to varying degrees. Victims in states with no legal protections have also taken to litigation. For example, a New Jersey teen is suing a classmate for sharing fake AI nudes. 

The Los Angeles Times recently reported that the Beverly Hills Police Department is currently investigating a case where students allegedly shared images that “used real faces of students atop AI-generated nude bodies.” But because the state’s law against “unlawful possession of obscene matter knowing it depicts person under age of 18 years engaging in or simulating sexual conduct” does not explicitly mention AI-generated images, the article says it’s unclear whether a crime has been committed.

The local school district voted on Friday to expel five students involved in the scandal, the LA Times reports.

