States Rush to Combat AI-Generated Deepfake Nudes Targeting Minors


In response to the growing problem of boys using AI apps to create and share sexually explicit images of their female classmates, state legislators across the United States are introducing bills to protect minors from this new form of exploitation. Meanwhile, Silicon Valley continues to make billions from the very AI models these boys are exploiting.

The New York Times reports that the rise of AI-powered “nudification” apps has led to a disturbing trend in schools nationwide, where male students are using these tools to generate fake nude images of their female peers — essentially child pornography — and circulating them via social media and messaging apps. This alarming phenomenon has prompted lawmakers in at least two dozen states to take action, introducing legislation to combat the spread of AI-generated sexually explicit images, also known as deepfakes, depicting minors.


One such incident occurred at Issaquah High School near Seattle, where a male student circulated AI-generated nude images of girls who had attended a homecoming dance. Caroline Mullet, a ninth-grader at the school, brought the issue to the attention of her father, Washington State Senator Mark Mullet. Senator Mullet and state Representative Tina Orwall then proposed legislation to prohibit the sharing of AI-generated sexually explicit depictions of real minors in their state.

“I hate the idea that I should have to worry about this happening again to any of my female friends, my sisters or even myself,” Ms. Mullet told state lawmakers during a hearing on the bill in January.

Several states, including South Dakota and Louisiana, have already enacted laws criminalizing the possession, production, and distribution of AI-generated sexual abuse material depicting minors. These laws aim to address the unique challenges posed by deepfakes, as existing statutes covering child sexual abuse material or adult nonconsensual pornography may not adequately protect victims of AI-generated explicit images.

Federal bills to make it a crime to disclose AI-generated intimate images of identifiable adults or minors and to enable victims to bring civil cases against perpetrators have also been introduced. However, these bills do not explicitly give victims the right to sue the developers of AI nudification apps, a step that trial lawyers believe would help curb the mass production of sexually explicit deepfakes.

As states work to pass laws targeting exploitative AI images, the consequences for offenders can be severe. Under Louisiana’s new law, individuals who knowingly create, distribute, promote, or sell sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years. In Miami-Dade County, two middle school boys were arrested and charged with third-degree felonies for allegedly making and sharing fake nude AI images of two female classmates.

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.

Authored by Lucas Nolan via Breitbart April 22nd 2024