A 16-year-old Kentucky boy reportedly committed suicide shortly after he was blackmailed with AI-generated nude images, an increasingly common scheme known as “sextortion.”
Elijah Heacock of Glasgow, Kentucky, received a text containing an AI-generated nude photo depicting him, along with a demand that he pay $3,000 to prevent the image from being sent to family and friends, according to a report by KFDA.
On February 28, shortly after receiving the message, the teen died from a self-inflicted gunshot wound.
Elijah’s parents, John Burnett and Shannon Heacock, told CBS that they didn’t have a solid understanding of the circumstances that led to their son’s death until they found the messages on his phone.
Heacock said she now believes her son was a victim of a sextortion scheme.
“Sextortion is a form of child sexual exploitation where children are threatened or blackmailed, most often with the possibility of sharing with the public a nude or sexual image of them, by a person who demands additional sexual content, sexual activity or money from the child,” the National Center for Missing and Exploited Children (NCMEC) explains.
“This crime may happen when a child has shared an image with someone they thought they knew or trusted, but in many cases they are targeted by an individual they met online who obtained a sexual image from the child through deceit, coercion, or some other method,” NCMEC continues.
“In many cases, the blackmailers may have stolen or taken images of another person and they are communicating through a fake account,” the organization adds.
Elijah’s parents said they had never heard of sextortion until law enforcement began investigating their son’s death.
“The people that are after our children are well organized,” Burnett said. “They are well financed, and they are relentless. They don’t need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child.”
NCMEC says sextortion schemes have skyrocketed, with the organization receiving more than 500,000 reports of sextortion against minors in the last year alone.
Since 2021, at least 20 young people have committed suicide after falling victim to sextortion scams, according to the FBI.
Moreover, as AI technology has grown more sophisticated in recent years, creating fake nude images of targets has become easier, and scammers have realized they don’t even need to obtain actual nude photos of their victims to pull off a sextortion scheme.
NCMEC says that this year alone, more than 100,000 of the sextortion reports it received involved AI-generated images.
As Breitbart News previously reported, the Trump administration is fighting against sextortion schemes.
On May 19, President Donald Trump signed the Take It Down Act into law, making it a federal crime to post real or fake sexually explicit images of another person online without their consent.
“This legislation is a powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused through non-consensual intimate imagery,” First Lady Melania Trump said.
The Take It Down Act also requires social media companies and websites to remove sexually explicit material within 48 hours of a victim’s request.
Elijah’s parents said they hope the legislation will make a difference, as they do not want other families to go through what they have experienced.
“It’s kind of like a bullet in a war. It’s not going to win the war,” Burnett said. “No war is ever won by one bullet. You got to win battles. You got to win fights. And we’re in it.”