A typical child pornography case begins with a CyberTipline report (a "CyberTip") filed with the National Center for Missing & Exploited Children (NCMEC) by an Electronic Service Provider (ESP), such as Google, Apple iCloud, Kik, or Flingster, stating that a user has uploaded or shared suspected Child Sexual Abuse Material (CSAM) on the service.
Detection of Suspected Child Sexual Abuse Material
While Internet Service Providers (ISPs) and Electronic Service Providers are not obligated by law to proactively search for CSAM or child pornography on their systems, although United States law does require them to report it once they become aware of it, many search anyway because of the damage to business reputation that comes from being seen as a haven for such material. Moreover, an ESP seen as allowing such material can face legal and public pressure severe enough to force it offline. In 2023, Leif K-Brooks, owner of the video chat service Omegle, shut the service down after 14 years of operation, even though it scanned its service and reported suspected CSAM; K-Brooks cited the abuse of Omegle by traffickers in CSAM as contributing to his decision.
Nearly all ESPs run background software on their services to automatically monitor for known CSAM. Some services also employ staff or engage volunteers who monitor the chat or distribution services they operate.
But owing to the prevalence of CSAM and the speed with which it can proliferate, automated background software is the primary detection tool: it compares the hash values of known CSAM against hash values computed from content that has been uploaded, shared, or distributed on the ESP's service.
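The hash-matching step can be sketched concretely. The fragment below is a minimal illustration in Python, not any provider's actual implementation: the names KNOWN_CSAM_HASHES and scan_upload are hypothetical, and it uses an ordinary cryptographic hash (SHA-256) for simplicity. Production systems generally rely on perceptual hashes such as Microsoft's PhotoDNA, whose algorithm is not public, so that resized or re-encoded copies of a known image still match.

```python
import hashlib
from pathlib import Path

# Placeholder hash list. In a real deployment this set would be populated
# with hash values of known CSAM distributed by NCMEC or an industry
# consortium; it is left empty here so the sketch runs standalone.
KNOWN_CSAM_HASHES: set[str] = set()


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_upload(path: Path) -> bool:
    """Return True if an uploaded file's hash matches a known-CSAM hash.

    A real service would not simply return a flag: a match would be
    queued for human review and reported to NCMEC's CyberTipline.
    """
    return sha256_of_file(path) in KNOWN_CSAM_HASHES
```

The essential design point is that the service never needs to retain the contraband images themselves; it stores only a list of hash values and compares incoming content against that list.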