Yahoo's Efforts to Combat Child Sexual Abuse Material (CSAM)

Yahoo prohibits child sexual abuse material (CSAM) under our Terms of Service and our Community Guidelines. Yahoo operates a robust digital safety program designed to detect and remove CSAM from our platforms. Yahoo’s efforts are spearheaded by a dedicated Trust & Safety team.

How Yahoo identifies CSAM

Yahoo uses a combination of automated scanning and human review to detect CSAM, as permitted by law:

  • Yahoo uses PhotoDNA and CSAI Match, technologies that match the digital signature of an uploaded image or video against large databases of known CSAM maintained by the National Center for Missing & Exploited Children (NCMEC);
  • Our team of expert human reviewers evaluates the images detected through automated scanning to ensure that all confirmed CSAM in an account is reported to NCMEC; and
  • Finally, we carefully review and act on abuse reports sent to us by users and by child-safety organizations such as NCMEC.
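The hash-matching step above can be illustrated with a simplified sketch. PhotoDNA and CSAI Match are proprietary perceptual-hashing systems that tolerate resizing and re-encoding; the example below substitutes an exact SHA-256 digest and a placeholder hash set purely to show the lookup pattern, and does not reflect Yahoo's actual implementation.

```python
import hashlib

# Hypothetical set of known hashes (placeholder values for illustration).
# Real systems use perceptual hashes, not exact cryptographic digests.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", used here only as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

In a production pipeline, a match of this kind would route the content to human review rather than trigger automatic action, consistent with the review step described above.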

What Yahoo does when it finds CSAM

Yahoo reports all CSAM to NCMEC, including subscriber information for the user who uploaded it, which helps NCMEC identify the alleged offender. NCMEC acts as a clearinghouse for US law enforcement and forwards reports on offenders located outside the US to partner agencies in the relevant country.

After Yahoo files reports with NCMEC, Yahoo's investigators identify particularly serious cases that warrant further investigation. By leveraging internal data and open-source information, Yahoo's investigators are often able to identify and locate offenders responsible for uploading CSAM to Yahoo's platforms. Yahoo then transmits this information to NCMEC in the form of a supplemental report. These supplemental reports have resulted in hundreds of child rescues and arrests, in some cases within 24 hours of filing.

Yahoo also responds to search warrants and other legal processes obtained by U.S. and international law enforcement in accordance with our Global principles for responding to government requests.

How many accounts Yahoo reports to NCMEC

In 2024, Yahoo reported 2,303 accounts to NCMEC for trafficking in CSAM on our platforms.

Yahoo submitted a total of 2,441 CyberTips to NCMEC. This total includes supplemental CyberTip reports escalated to NCMEC.

A CyberTip may include one or more pieces of content, depending on the account; content can include images, videos, or text soliciting CSAM. In total, Yahoo reported 114,629 pieces of content in 2024.


In 2023, Yahoo reported 2,431 accounts to NCMEC for involvement in CSAM-related activity on our platforms. Additionally, 183 supplemental reports were submitted in connection with these accounts.

In 2022, Yahoo reported 3,329 accounts to NCMEC for involvement in CSAM-related activity on our platforms. Additionally, 234 supplemental reports were submitted in connection with these accounts.

In 2021, Yahoo reported 5,498 accounts to NCMEC for involvement in CSAM-related activity on our platforms. Additionally, 341 supplemental reports were submitted in connection with these accounts.

In 2020, Yahoo reported 7,182 accounts to NCMEC for involvement in CSAM-related activity on our platforms. Additionally, 397 supplemental reports were submitted in connection with these accounts.

In 2019, Yahoo reported 5,359 accounts to NCMEC for involvement in CSAM-related activity on our platforms. Additionally, 458 supplemental reports were submitted in connection with these accounts.

Report CSAM and child sexual exploitation online - If you encounter CSAM or believe a child is being sexually exploited online, report it directly to NCMEC through the CyberTipline.