Blog

12-23 Warning: How Criminals Are Hacking Your Children’s Social Media Using Just Three Seconds of Their Voice and Turning It Into a Terrifying AI Scam to Trick Parents

Criminals are hacking children’s social media accounts and cloning their voices using AI to trick their parents into sending them money, MailOnline can reveal.

Even the most basic scammers are using simple AI tools online to turn just three seconds of a child’s voice into deepfake audio files with an 85 to 90 per cent match, security experts have warned.

With more effort spent on cloning, hackers can even achieve a 95 per cent voice match, research from security firm McAfee shows, leaving parents at risk of being exploited by ruthless fraudsters.

The more accurate the clone, the better chance a criminal has of preying on the vulnerability of a friend or family member and tricking them into handing over their money.

One influencer, with nearly 400,000 followers on TikTok, revealed how her mother was woken up in the middle of the night by a call from an unknown number. When she answered, it was her daughter’s voice – which had been cloned – screaming for help.

It comes as fraud experts issued fresh warnings over the latest ‘hi mum’ texts, where scammers ‘prey on our goodwill with emotive stories’ and con parents into thinking their children are in trouble.

Read more here. 

12-22 Pornhub Will Pay More than $1.8M to Resolve GirlsDoPorn Sex Trafficking Probe: Feds

Pornhub’s parent company has agreed to pay more than $1.8 million to the government after admitting it profited from videos it hosted from the sex trafficking site GirlsDoPorn, federal prosecutors in Brooklyn announced Thursday.

The company, Aylo Holdings, entered into a deferred prosecution agreement Thursday to resolve a criminal money laundering investigation into whether Pornhub knew the money from GirlsDoPorn came from illegal activities.

In addition to the seven-figure fine, the company is also required to pay restitution to the website’s hundreds of sex trafficking victims. The details of those payments are still being worked out, but prosecutors said victims could each receive a minimum of $3,000, and possibly more.

Pornhub’s owners also agreed to the appointment of an independent monitor for three years.

Read more here. 

12-21 Top AI Image Generators are Getting Trained on Thousands of Illegal Pictures of Child Sex Abuse, Stanford Internet Observatory Says

Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report that urges companies to take action to address a harmful flaw in the technology they built.

Those same images have made it easier for AI systems to produce realistic and explicit imagery of fake children as well as transform social media photos of fully clothed real teens into nudes, much to the alarm of schools and law enforcement around the world.

Until recently, anti-abuse researchers thought the only way that some unchecked AI tools produced abusive imagery of children was by essentially combining what they’ve learned from two separate buckets of online images — adult pornography and benign photos of kids.

But the Stanford Internet Observatory found more than 3,200 images of suspected child sexual abuse in the giant AI database LAION, an index of online images and captions that’s been used to train leading AI image-makers such as Stable Diffusion. The watchdog group based at Stanford University worked with the Canadian Centre for Child Protection and other anti-abuse charities to identify the illegal material and report the original photo links to law enforcement.

The response was immediate. On the eve of the Wednesday release of the Stanford Internet Observatory’s report, LAION told The Associated Press it was temporarily removing its datasets.

LAION, which stands for the nonprofit Large-scale Artificial Intelligence Open Network, said in a statement that it “has a zero tolerance policy for illegal content and in an abundance of caution, we have taken down the LAION datasets to ensure they are safe before republishing them.”

While the images account for just a fraction of LAION’s index of some 5.8 billion images, the Stanford group says the material is likely influencing the ability of AI tools to generate harmful outputs and reinforcing the prior abuse of real victims who appear multiple times.

Read more here.

12-20 A SECOND Landmark U.S. Class Action Lawsuit Against P*rnhub and Its Parent Company on Behalf of Thousands of Child Trafficking Victims was Certified Today by a Federal Judge

A second class action alleging the company behind Pornhub and other adult video websites profits from child sexual abuse material and trafficking is moving forward after a decision by an Alabama federal judge.

The class certification paves the way for potentially thousands of victims who appeared in CSAM posted on Pornhub to see relief through the courts.

Read more here. 

12-16 GREAT NEWS! PASSED: Blackburn, Ossoff Bill To Update Missing And Exploited Children CyberTipline

The U.S. Senate unanimously passed the Revising Existing Procedures on Reporting via Technology (REPORT) Act, authored by U.S. Senators Marsha Blackburn (R-Tenn.) and Jon Ossoff (D-Ga.).

“In today’s technological age, children have become increasingly vulnerable to online sexual exploitation,” said Senator Blackburn. “There is an urgent need to address loopholes in reporting these crimes and to equip the National Center for Missing and Exploited Children and law enforcement with the resources they need to adequately respond. I thank Senator Ossoff for his bipartisan partnership in this effort, and I look forward to the REPORT Act’s swift passage out of the House and to the President’s desk.”   

“Our bipartisan bill will ensure tech companies are held accountable to report and remove child sex abuse material and to strengthen protection for kids online,” said Senator Ossoff. “At a time of such division in Congress, we brought Republicans and Democrats together to pass this urgent legislation to protect kids on the internet.”

  • The REPORT Act is endorsed by the National Center for Missing and Exploited Children (NCMEC), International Justice Mission (IJM), ECPAT-USA, the National Center on Sexual Exploitation (NCOSE), Fraternal Order of Police, ChildFund International, the End Online Sexual Exploitation and Abuse (OSEAC) Coalition, Wired Human, and Raven.

“The National Center for Missing and Exploited Children (NCMEC) applauds the hard work by Senator Blackburn, Senator Ossoff, and all the Senate co-sponsors to pass the REPORT Act,” said Michelle DeLaune, NCMEC President & CEO. “Last year NCMEC’s CyberTipline received over 32 million reports containing more than 88 million images/videos and other content concerning child sexual exploitation, and the volume of online child sexual exploitation continues to exponentially rise. We look forward to continuing our work with Senator Blackburn and Senator Ossoff to ensure the safety of children online, and we encourage House Leadership to join the fight and bring the REPORT Act to the floor for a vote.”

Read more here.

12-14 Your Daughter’s Face Could Be Hijacked For ‘Deepfake’ Porn

Fourteen-year-old Francesca’s life has changed forever.

An email sent by her high school’s principal to her family on Oct. 20, 2023, notified them that Francesca was one of more than 30 students whose images had been digitally altered to appear as synthetic sexually explicit media — sometimes referred to as “deepfake” pornography.

Read more here.
