Latest News

07-18 LEARN MORE. Generative AI CSAM is CSAM = Child Sexual Abuse Material #ProtectChildrenOnline

Generative AI (GAI) has taken the world by storm. While it is undeniable that this innovative technology has benefits, the National Center for Missing & Exploited Children (NCMEC) is deeply concerned about the dangers that GAI poses to the global fight for child safety. 

In 2023, NCMEC’s CyberTipline received 4,700 reports related to Child Sexual Abuse Material (CSAM) or sexually exploitative content that involved GAI technology (GAI CSAM). GAI CSAM portrays computer-generated children in graphic sexual acts and can be generated at will by users of certain GAI platforms. GAI can also be used to create deepfake sexually explicit images and videos from an innocent photograph of a real child.

As if the creation of this imagery weren’t terrible enough, NCMEC has also received reports in which bad actors have tried to use this illegal GAI content to extort a child or their family for financial gain. Furthermore, people who use the technology to create this material have defended it with arguments such as, “At least I didn’t hurt a real child” and “It’s not actually a child…”

Read more here. 
