New research reveals how sextortion scammers are targeting teenage boys
In the excruciating hours after her 17-year-old son Jordan DeMay was found dead of an apparent suicide in March 2022, Jennifer Buta racked her brain for an explanation.
“This is not my happy child,” Buta remembers thinking, recalling the high school football player who used to delight in going shopping with his mom and taking long walks with her around Lake Superior, not far from their Michigan home. “I’m banging my head asking: What happened?”
It wasn’t long before Buta got her answer: Shortly before he died, DeMay had received an Instagram message from someone who appeared to be a teenage girl named Dani Robertts. The two began talking, and when “Dani” asked DeMay to send her a sexually explicit photo of himself, he complied. That’s when the conversation took a turn.
According to a Department of Justice indictment issued in May 2023, the scammer on the other end of what turned out to be a hacked account began threatening to share DeMay’s photos widely unless he paid $1,000. When DeMay sent the scammer $300, the threats continued. When DeMay told the scammer he was going to kill himself, the scammer wrote back, “Good. Do that fast.”
Just hours after receiving the first message, DeMay was dead.
Buta says she’d never heard the term “sextortion” until law enforcement officials introduced her to it as a means of explaining what had happened to her son. But since his story became public, Buta has heard countless similar stories described with that term. “It’s parents saying, This happened to my child. This happened to my child. This happened to my child,” Buta says. “It’s so much bigger than I think any of us realize.”
DeMay is one of at least 20 minors who the FBI says died by suicide after being targeted in financially motivated sextortion schemes since 2021. But these cases represent just a sliver of the broader surge in reports of sextortion targeting minors in recent years. Between August 2022 and August 2023 alone, an estimated 812 reports of financial sextortion poured into the National Center for Missing & Exploited Children (NCMEC) every week, on average.
Now, a sweeping new analysis conducted by NCMEC and the anti-child exploitation nonprofit Thorn is taking a closer look at those reports to find answers on how and where these crimes are taking place, what’s motivating the people behind them, and, perhaps most importantly, what more must be done to prevent additional kids from ending up like DeMay.
“When we lack the data to quantify the scale and the evolution of this issue, we aren’t always prepared to deploy resources as efficiently as is necessary,” says Melissa Stroebel, head of research and insights at Thorn and a coauthor on the paper.
The predator’s new playbook
To conduct their analysis, a team of researchers studied more than 15 million reports of child exploitation submitted to NCMEC between 2020 and 2023, scanning for those that appeared related to sextortion. From that pool, they closely reviewed thousands of individual reports.
They found that while sextortion is hardly a new phenomenon, it is changing shape. As recently as 2016, research showed sextortion schemes generally targeted young girls who were exploited by people they knew or had met online. In those cases, the perpetrators’ primary motive was to extract sexual content from their victims or to coerce them into meeting sexual or relationship-related demands. The new analysis finds that the latest wave of financial sextortion overwhelmingly targets teenage boys: of the NCMEC reports that listed age and gender, 90% involved boys between 14 and 17 years old. This aligns with demographic data reported by the FBI.
These schemes also tend to be carried out not by people these boys know, but by organized networks operating largely out of Nigeria and Côte d’Ivoire whose only real motive is money. This mercenary form of sextortion has dramatically accelerated both the scale and speed at which kids are being victimized, says Lauren Coffren, executive director of the exploited children division at NCMEC and one of the researchers on the project. “Grooming in other scenarios would take place over the course of days, weeks, or months, even, to be able to garner someone’s trust,” Coffren says. “Now, it’s happening within minutes and hours.”
Many reports suggested the scammers were working off of scripts, using verbatim language to threaten victims. “Blocking me won’t stop me from posting it viral,” read one such line that the researchers found in at least four separate reports. “You lose a lot of things – your honor – your dignity – your family life,” read another.
In the majority of cases, the scammers catfish their victims, using hacked or fake accounts to trick them into taking sexually explicit photos. But in a small percentage of cases, the researchers found, scammers exploited victims using faked photos of them. This pattern of abuse, Stroebel warns, may only increase in a world of generative AI, where phony images look more realistic and could further deter kids from seeking help for fear of judgment. “How is a parent going to respond if their child comes to them and says, ‘Mom, I didn’t do this,’ and with their own eyes, they’re looking at a picture that they’re having a hard time differentiating?” she says.
The role of platforms
According to the analysis, reports of financial sextortion to NCMEC spiked dramatically in 2022, coinciding with the increase in reports cited by the FBI. But that uptick, the researchers caution, may have as much to do with an actual increase in incidents as with the fact that some tech companies are getting better at spotting and reporting these schemes. The vast majority of NCMEC reports the researchers studied came from platforms, particularly Facebook and Instagram; Instagram is also the platform most frequently cited as the place where perpetrators first contact their victims.
Stroebel warns that Instagram’s prevalence in reporting data should not in and of itself be viewed as a condemnation of the platform. “At times, increased reporting is a good sign,” she says, “because it represents greater engagement and investment in the issue.”
In a statement to Fast Company, a Meta spokesperson called sextortion a “horrific crime” and echoed Stroebel’s assessment of its higher reporting numbers. The spokesperson said the company is working with NCMEC to train global law enforcement officials on how to respond to reports of suspected sextortion, and that it helped found a site that lets victims of child sexual exploitation get nude images removed from platforms using hashing technology. In April, Meta announced another slew of actions to prevent sextortion, including auto-blurring nude imagery in teenage users’ direct messages on Instagram.
The Thorn research also highlights shortcomings in some platforms’ reporting patterns. In reports that were submitted by the public, rather than by platforms, the researchers found that Snapchat, for instance, was mentioned almost as often as Instagram. And yet, during the study period, the company submitted about one quarter as many reports of suspected sextortion as Instagram, though the researchers noted Snapchat’s reporting processes did pick up after their research ended.
A Snap spokesperson said the comparison to Instagram is imperfect, given that Instagram is much larger, but noted that the company has instituted a number of new safeguards since last summer, including enhanced reporting tools specifically related to financial sextortion. “We have extra safeguards for teens to protect against unwanted contact, and don’t offer public friend lists, which we know can be used to extort people,” the spokesperson said. “We also want to help young people learn the signs of this type of crime, and recently launched in-app resources to raise awareness of how to spot and report it.”
Payment platforms, which victims use to send scammers money, also have a role to play in reporting suspected sextortion. But while Cash App was the payment app most frequently mentioned in public reports to NCMEC, the researchers said the company itself submitted no reports during the study period.
A spokesperson for Cash App said the company “takes allegations of sextortion very seriously” and has worked with NCMEC to uncover child exploitation trends. The spokesperson said, “Cash App intends to report suspicious activity to NCMEC this year, and we are actively working to operationalize our reporting systems.”
Of course, while the researchers want companies to be more proactive in reporting, by the time those reports are made, it is often too late for the children who have been victimized. On average over the last two years, the researchers found, it takes seven days for a platform to report a suspected sextortion incident to NCMEC. In that time, kids and families can face devastating impacts. In the small subset of reports that discussed the impact on victims, the researchers found that nearly 16% mentioned suicide or self-harm. In roughly another 20% of those reports, the illicit images were actually disseminated.
All of that means that platforms must be even more proactive in preventing sextortion, not just spotting it after the fact. Earlier this year, Meta, for one, changed its settings for teens so that their ability to accept messages from people they don’t follow is turned off by default. The company also rolled out warning screens for teens who are messaging with suspicious or scammy accounts.
Changing conversations
Lawmakers, meanwhile, are trying to force changes through legislation, including the STOP CSAM Act, which would, among other things, allow victims to sue tech platforms that promote or facilitate child exploitation. The Kids Online Safety Act (KOSA) would create a broad “duty of care” requiring platforms to protect kids from online harms. At a Senate hearing on child safety earlier this year, the CEOs of Meta, TikTok, Snap, and X testified before an audience full of victims’ families. Last week, Senate Majority Leader Chuck Schumer said he was “completely committed” to working with those families to get KOSA across the finish line.
But it’s not just platforms that need to change, Stroebel argues. One clear finding in the report is that scammers tend to threaten victims using the same concepts that we, as a society, use to discourage kids from taking nude images. We warn kids that their nude photos will be leaked, that they’ll lose job opportunities, or face other public embarrassment. When scammers then parrot back those same ideas to kids, Stroebel says, it only entrenches the idea that those negative consequences will follow, making kids even more vulnerable to coercion.
“We need to pivot those conversations slightly to make sure that our attempt to deter the behavior isn’t actually having the impact of deterring help-seeking,” Stroebel says. “Kids are making a call that they’re better off trying to handle this on their own than by seeking support.”
Last year, the Department of Justice indicted three suspects from Nigeria in Jordan DeMay’s case. In April, two of them pleaded guilty to conspiracy to sexually exploit teenagers. The third is now facing extradition from Nigeria. “There’s some justice and accountability for Jordan,” Buta said the day the plea deal was announced.
At the time of the indictment, the men who targeted her son were 19, 20, and 22 years old themselves, a fact that struck Buta particularly hard. “These young men are not that much older than my son, and their lives are changed, not in the same way that my son’s is, but this did not need to happen,” she says. “And as a mom, I felt for their mothers.”