Shannon Heacock told her 16-year-old son, Elijah, to go to bed early one night in February. There was a district basketball playoff the next day in their hometown of Glasgow, Ky.
Heacock coached the high-school cheer team. Elijah had made props and was planning to help her set up. At 10:24 that night, he texted her about getting coffee at the next day’s event.
An hour after Heacock silenced her phone and went to sleep, her daughter woke her up. Elijah had been found bleeding in the laundry room, from what turned out to be a self-inflicted gunshot wound. He died the next morning.
In that hour, Elijah had exchanged more than 150 text messages on his iPhone with someone else: a criminal demanding $3,000. If Elijah didn't send the money, the person said, he would share a nude image of the teen with his friends and family. Heacock said she believes the image was AI-generated.
Adults, often outside the country, pose as teenage girls and befriend teen boys on social media, then push to move the conversation to Snapchat or ask for the boys' phone numbers so they can text. In the more private, one-on-one chat, the criminals send photos or videos of a nude teen girl or young woman, and encourage the boys to do the same. Apple's Messages app has become an appealing venue to build trust in such "sextortion" schemes, say law-enforcement officers and child-safety experts.