UNDERSTANDING CSAM

CSAM stands for Child Sexual Abuse Material and includes any photos or videos that show minors (anyone under 18) engaging in explicit sexual acts. Regardless of context, a child cannot legally consent to sexual activity. United States federal statutes use the term “child pornography,” but CSAM is the more accurate term because it captures the abuse and trauma inflicted on the child. [1]

Unfortunately, there are perpetrators all over the world who use CSAM for their own sexual gratification. This is one of the reasons you should never post pictures of your child online. Perpetrators can search the dark web or the open web to find CSAM. Some will go out of their way to edit a child’s face onto an adult’s body engaged in sexual acts, and there are real cases where this has happened.

Between 2020 and 2021, there was an increase in CSAM because individuals were at home more during the COVID-19 pandemic. This gave perpetrators room and time to interact with others on forums, spreading information on how and where to find CSAM online. Reports of CSAM rose from 1 million in 2019 to 2 million in 2020. The Internet Watch Foundation (IWF) reported around 63,050 instances of CSAM involving children aged 7-10; in these cases, children may have been groomed or tricked into performing sexual acts on camera by a perpetrator. [2]

AI’S ROLE IN CSAM

The internet is a dangerous place for a child to be, and now we also have to worry about artificial intelligence. What is AI? AI, or artificial intelligence, is a computer system that can perform many tasks, such as solving problems and analyzing data, images, and videos. [2] AI programs such as ChatGPT have skyrocketed in popularity over the past few years. We have to remember that programs such as ChatGPT are tools to be used, not abused.

How does AI play a role in CSAM? Perpetrators can now create CSAM in a few seconds by typing a few phrases and uploading a picture of a child. AI can create deepfakes in seconds, and this is just the beginning. Generative AI deepfakes first appeared in 2017, when a Reddit user used AI to place female celebrities’ faces onto pornographic content without their consent. [2] CSAM once meant photographs and videos; now there is the added worry of AI-generated images and videos. CSAM exploits these children and puts them in vulnerable positions, where they can be targeted for grooming, stalking, and abuse.

DARK WEB AND CHILD EXPLOITATION

The dark web is the deepest layer of the internet. The internet has three layers. The top layer is the surface web, which is open to the public and can be reached by anyone online. The second layer is the deep web, which contains about 90% of the information on the internet; examples include legal documents, academic databases, and government resources. The last layer is the dark web. The dark web is encrypted, and browsers like Tor are used to surf it while masking whoever is on it, keeping them anonymous.

What can individuals find on the dark web? Information you don’t want to know about or see. Someone can traffic illegal guns, drugs, exotic animals, humans, and children. Scammers and hackers can buy or sell your information, such as credit card numbers and Social Security numbers.

Perpetrators can use the dark web to distribute CSAM and exploit children. Criminals have also generated income from CSAM. There have been reports of terrorists caught using CSAM to send messages and data and to finance their activities. This illustrates how CSAM is connected not only with individual perpetrators but also with organized crime. [1]

DETECTION TOOLS TO REMOVE CSAM

PLEASE NOTE: Digital Betrayal is not endorsed by or affiliated with Project Arachnid. The references to Project Arachnid on our platform are solely for educational purposes. For more information about Project Arachnid and its mission, please visit its official website.

The global spread of CSAM has affected millions of children. The Canadian Centre for Child Protection (C3P) created a software platform called Project Arachnid in 2017. Project Arachnid is a detection tool that crawls the web for CSAM; when suspect material is detected, it is processed and flagged for removal. [2]
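The sources above do not describe Project Arachnid’s internal workings, but a common building block for detection tools of this kind is hash matching: comparing the digital fingerprint of a crawled file against a database of fingerprints of previously verified abusive material. The short Python sketch below is a minimal, hypothetical illustration of that idea only; the KNOWN_HASHES value, the crawled_media folder, and the function names are assumptions for illustration, not Project Arachnid’s actual code.

import hashlib
from pathlib import Path

# Hypothetical set of hashes of previously verified abusive material,
# as would be supplied by a child-protection organization (placeholder value).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 fingerprint of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_directory(root: Path) -> list:
    """Return crawled files whose fingerprint matches the known list."""
    return [p for p in root.rglob("*") if p.is_file() and sha256_of(p) in KNOWN_HASHES]

if __name__ == "__main__":
    # "crawled_media" stands in for a folder of files gathered by a web crawler.
    for match in scan_directory(Path("crawled_media")):
        # In a real system, a match would be queued for human analyst review
        # and a takedown notice sent to the hosting provider.
        print("Match flagged for review:", match)

Real-world systems typically go further than this sketch, for example by using perceptual hashing that can recognize an image even after it has been resized or re-encoded, and by routing every match to human analysts before a takedown notice is sent.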

According to the Project Arachnid website, as of February 3, 2025, more than 172 billion images have been processed, more than 96 million suspect media items have been triggered for analyst review, and more than 40 million takedown notices have been issued to online service providers.

Detection tools like Project Arachnid help take CSAM off the web and disrupt its trade on the dark web.


Sources

[1] Child Sexual Abuse Material: Model Legislation & Global Review. (2023). International Centre for Missing & Exploited Children (ICMEC). https://cdn.icmec.org/wp-content/uploads/2023/10/CSAM-Model-Legislation_10th-Ed-Oct-2023.pdf

[2] Disrupting the worldwide spread of child sexual abuse material: Project Arachnid marks eight years. (2025, January 28). Canadian Centre for Child Protection. https://protectchildren.ca/en/press-and-media/news-releases/2025/project-arachnid-anniversary