Challenges in the Online Child Safety Ecosystem

How to improve the system for reporting child sexual abuse material online.

U.S. Marshals work with NCMEC during Operation We Will Find You in 2023.

In the United States, federal law mandates that online platforms report any discovered child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization, through a reporting mechanism called the CyberTipline. NCMEC then sends these CyberTipline reports to law enforcement in the U.S. and abroad. In the U.S., most reports are sent to one of the dozens of Internet Crimes Against Children Task Forces throughout the country. These task forces may choose to close the reports, investigate the reports themselves, or send them out to local law enforcement agencies. 

It’s widely recognized among child safety experts that law enforcement officers are overwhelmed by the volume of reports from the CyberTipline and lack the time and resources to investigate all of the reports that merit attention. Our conversations suggest stakeholders across the ideological spectrum believe that this results in missed opportunities to investigate reports that would lead to the rescue of a child being abused.

In a newly released report, we highlight one reason why the CyberTipline system isn’t working as well as it should: Officers feel ill-equipped to accurately prioritize reports for investigation. Officers believe they expend precious time on reports that in hindsight they should have deprioritized, while overlooking reports that seem low priority but, if investigated, could reveal ongoing child abuse. Through interviews with 66 respondents at online platforms, NCMEC, law enforcement agencies, and civil society groups, we document the reasons why law enforcement struggles to accurately triage reports for investigation, and we make recommendations for potential solutions.

Confoundingly Similar Reports 

Imagine being an Internet Crimes Against Children Task Force detective. You arrive at the office to find eight new CyberTipline reports from that morning alone. In addition to managing a handful of high-priority ongoing investigations, you must assess these reports and decide which ones you’ll investigate, which can be handed off to local law enforcement, and which ones require no further action.

The first two reports look similarly low priority: reports from a social media platform of individuals posting a single CSAM image. Both images are decades old and familiar to you; the children depicted are now adults and have received support from law enforcement. The platform has provided names for the uploaders, though they’re likely fake. In your experience, such reports rarely lead to evidence of further illicit activity. Sometimes, however, they involve an individual actively abusing a child, detected because of a slipup that placed them on a platform’s radar. You stare at these reports for a few minutes, weighing whether either is worth investigating further.

Our research suggests that this common scenario arises for several reasons. First, many CyberTipline reports from platforms are low quality. They may lack basic information about the uploader or essential details such as the upload IP address. This can stem from platforms’ reluctance to allocate engineering resources toward ensuring reports are completed accurately, though the long tail of low-volume reporters—often small platforms—may not be fully aware of what constitutes useful information for a report. These challenges are compounded by frequent staff turnover within trust and safety teams and by the absence of formal, documented guidance from NCMEC (possibly due to legal constraints) or any other group.

The responsibility of prioritizing between these two similar-looking reports doesn’t have to fall solely on law enforcement. There are untapped opportunities for NCMEC to enrich these reports with additional data sets. For instance, identifying whether either of the uploaders is associated with an IP address known for sharing CSAM on peer-to-peer networks would help law enforcement decide if either report is worth prioritizing.
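To make the idea concrete, here is a minimal Python sketch of what such enrichment could look like, assuming a hypothetical watchlist of IP addresses previously observed sharing CSAM on peer-to-peer networks. The data structures, field names, and watchlist are invented for illustration and do not describe NCMEC’s actual systems.

```python
# Illustrative sketch only: cross-referencing CyberTipline report IPs
# against a hypothetical watchlist of IPs previously observed sharing
# CSAM on peer-to-peer networks. All names and data are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Report:
    report_id: str
    upload_ip: Optional[str]  # often missing in low-quality reports

def enrich_with_p2p_history(reports, known_p2p_ips):
    """Return (report_id, seen_on_p2p) pairs to aid triage."""
    return [
        (r.report_id, r.upload_ip is not None and r.upload_ip in known_p2p_ips)
        for r in reports
    ]

# The second report's IP has prior P2P activity, so it may merit a
# closer look even though it otherwise appears low priority.
reports = [Report("rpt-001", "203.0.113.7"), Report("rpt-002", "198.51.100.42")]
print(enrich_with_p2p_history(reports, {"198.51.100.42"}))
# [('rpt-001', False), ('rpt-002', True)]
```

As discussed next, there are additional opportunities for NCMEC to further automate processes to facilitate law enforcement triage.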

Multiple Reports for the Same Offender

The next report has been escalated to you for containing new CSAM. It’s a horrific video depicting a very young child being abused. You meticulously review the video, searching for any clues that could reveal the uploader’s identity. As you scan the report you notice that the username—bobthecat4—is nearly identical to one from a report from a different platform that you handled a few weeks ago—bobthecat3. Further evidence suggests that this is the same user. You realize that you’ve inadvertently spent 30 minutes on a redundant task, since you’d already forwarded the previous report to a local affiliate.

NCMEC has recently started to link reports using similar identifiers, but not all identifiers undergo this process. We heard from respondents that technical improvements to the CyberTipline are slow, particularly when compared to the speed of similar updates in industry. There are many reasons for this: As a nonprofit, NCMEC cannot match the salaries that industry offers engineers, leading to frequent poaching of NCMEC staff by industry trust and safety teams. Further, even when NCMEC notes that two reports are linked on a CyberTipline interface that is freely available to law enforcement, the diversity in case management software used by law enforcement agencies to view CyberTipline reports—both in the U.S. and internationally—means that the linkage might not be visible.
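To illustrate one way such linkage could work, here is a minimal Python sketch that flags near-identical usernames using Levenshtein edit distance. Real entity resolution would weigh many more signals, and the threshold below is an arbitrary assumption.

```python
# Illustrative sketch only: flagging reports whose usernames are within
# a small edit distance of each other (e.g., "bobthecat3" vs. "bobthecat4").

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def likely_same_user(name_a: str, name_b: str, threshold: int = 1) -> bool:
    # Arbitrary threshold: at most one character of difference.
    return edit_distance(name_a.lower(), name_b.lower()) <= threshold

print(likely_same_user("bobthecat4", "bobthecat3"))  # True
print(likely_same_user("bobthecat4", "alice_99"))    # False
```

In addition to grouping important related reports, there’s the challenge of correctly identifying and marking low-priority reports.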

The Meme Problem 

The next three reports all involve a user sharing, in a misguided attempt at humor, a meme depicting a nude child in a sexually suggestive pose. It’s a well-known meme in your office; you get hundreds of reports containing this exact same meme every year, and the child depicted is not in need of rescue. Sharing this content is illegal, but the sharers did not intend to cause harm, and given your workload these reports are absolutely not a priority. It takes you 15 minutes to close them out—time that could have been spent investigating a high-priority case.

An unknown but significant portion of CyberTipline reports are memes that meet the legal definition of CSAM. Although there are scenarios where distributing a meme could warrant investigation—such as an adult sending it to a minor—in practice law enforcement often deems these reports low priority. When platforms accurately use the “Potential Meme” checkbox on the CyberTipline form, NCMEC classifies these reports as “informational,” allowing U.S. law enforcement to easily disregard them. In many reports containing memes, however, the checkbox is left unmarked. This may be because slight alterations to images defeat automatic meme recognition, or because a platform has not invested in the engineering needed to ensure the checkbox is set when reports are created.
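One reason slight alterations defeat recognition is that exact cryptographic hashes change completely when even one pixel changes; perceptual hashes, by contrast, tolerate small edits. Here is a minimal Python sketch using an average hash (“aHash”), a deliberately simple stand-in for production algorithms such as PhotoDNA; the matching threshold is an arbitrary assumption.

```python
# Illustrative sketch only: a simple perceptual "average hash". Small
# image edits change only a few bits, so near-duplicates of a known
# meme can be caught by comparing bits rather than exact file hashes.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale; one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    return (a ^ b).bit_count()

# Hypothetical usage: a small Hamming distance suggests the upload is a
# lightly altered copy of the known meme, so the "Potential Meme"
# checkbox could be set automatically.
# if hamming(average_hash("known_meme.png"), average_hash("upload.png")) <= 5:
#     report.potential_meme = True
```

Not all challenges in the process, however, come with straightforward solutions.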

Fourth Amendment Issues

For the next report, you can’t immediately open the file that the platform reported. The file was detected through a hash match to known CSAM, but according to the report, no one at the platform viewed the file (likely to avoid exposing content moderators to unnecessary trauma). Some court rulings interpreting the Fourth Amendment have held that you, as a law enforcement officer, cannot view the file without a search warrant. Other courts have ruled that no warrant is necessary if the unopened file is a hash match to known CSAM. The courts in your area haven’t yet addressed the issue, so to be on the safe side, you won’t forgo a warrant and rely on the hash match alone. If you did, you would risk undermining any subsequent prosecution, because the file might be excluded from evidence as the fruit of an unlawful warrantless search.
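As an aside, the hash match described above may be an exact cryptographic match (the file is byte-identical to a previously verified file) or a perceptual match like the one sketched earlier. For contrast, here is a minimal Python sketch of the exact variant, with a hypothetical set of known hash values:

```python
# Illustrative sketch only: exact hash matching against a hypothetical
# set of hash values of previously verified CSAM files. A match means
# the file is byte-identical to a verified file, even though no human
# at the platform viewed this particular upload.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known(path: str, known_hashes: set[str]) -> bool:
    return sha256_of(path) in known_hashes
```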

Before sending you the report, NCMEC likewise did not open the file since nobody at the platform had reviewed it. That’s due to a court ruling that NCMEC counts as a government entity or agent, meaning it’s bound by the same Fourth Amendment rules as you are. So far, courts have declined to hold that platforms are also government agents, because the platforms’ searches for CSAM (such as the one that detected this file) are voluntary. If, however, you (or NCMEC) were to directly tell a platform what to search for and what information to report, that might turn the platform into a government agent, which, again, would risk exclusion of evidence found through its (currently voluntary) searches. You decide to invest the time in getting the search warrant, only to discover that the file was the same meme depicted in the earlier reports. It’s frustrating that you can’t tell the platform how to send you more actionable reports than this, but you respect that it’s crucially important for the platform to maintain the independence of its CSAM search program.

Sextortion Reports

Sometimes, instead of images or videos, the files are chat transcripts. The next report includes a chat transcript from a platform that supports direct messaging. From reading just a few lines of the chat, you immediately know you are looking at a case of financial sextortion: an offender tricked a child into sending a nude image and is now threatening to send the image to the child’s friends unless the child sends money. The victim is a child in your state, and you send this report to a local police department that will reach out to the child and their family to provide support. This is important, as sextortion victims sometimes die by suicide. Based on recent trends, the offender is almost certainly in West Africa, and there isn’t much you can do about that.

Sextortion cases constitute a growing portion of CyberTipline reports and are often orchestrated by organized crime groups. These groups typically use single-use email addresses and phone numbers but reuse sentences across their chats. There is untapped potential for both platforms and NCMEC to flag within CyberTipline reports when reports are linked by identical messaging; doing so would help ensure linked reports are sent to the same law enforcement agency, avoiding duplicated effort. Additionally, the CyberTipline form lacks a structured field for chat transcripts, hindering opportunities for automatic translation to or from other languages (a significant issue, since the vast majority of reports are routed internationally). Some platforms upload chat logs as file attachments, which may not be reviewed by platform staff. Without this review, law enforcement may need a search warrant just to read the text.
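Here is a minimal Python sketch of how such message-reuse linkage could work, comparing transcripts by overlapping word shingles and Jaccard similarity. The threshold and sample text are invented, and a production system would normalize text and scale with techniques such as MinHash.

```python
# Illustrative sketch only: linking reports whose chat transcripts share
# reused scripted sentences, via word-shingle sets and Jaccard similarity.

def shingles(text: str, k: int = 5) -> set:
    """All k-word sequences in the text, as a set."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def likely_linked(chat_a: str, chat_b: str, threshold: float = 0.3) -> bool:
    # Arbitrary threshold: heavy shingle overlap suggests a shared script.
    return jaccard(shingles(chat_a), shingles(chat_b)) >= threshold

script = "send the money now or i will share your photos with everyone you know"
print(likely_linked("hello. " + script, "listen carefully. " + script))  # True
```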

Recommendations

The CyberTipline provides a single clearinghouse for reports of online child sexual exploitation, and the system leads to children being rescued and offenders being arrested. Across the board, respondents said they were glad it exists. Our interviews suggest, however, that a number of changes would improve the process.

Our main recommendation is a coordinated effort to significantly enhance NCMEC’s technical and analytical capabilities. Achieving this will require collaboration among platforms, NCMEC, law enforcement, and, crucially, the U.S. Congress. Congress should allocate additional funding to NCMEC (without reducing funding from the task forces) to recruit engineers at competitive salaries that incentivize retention. NCMEC in turn should prioritize investment in the technical infrastructure of the CyberTipline.

Second, online platforms should invest resources in completing CyberTipline form fields accurately and consistently. Staffing child safety teams with investigators who can supplement reports will also increase the likelihood that law enforcement investigates them. Relatedly, because guidance from NCMEC might turn platforms into government agents if it is too forceful (since NCMEC is itself considered a government entity or agent), a group other than NCMEC should publish a guide detailing the essential fields of the CyberTipline form that platforms should fill out to increase the chances that law enforcement investigates their reports.

Third, the U.S. Supreme Court should address the split in judicial opinion regarding whether the private search doctrine necessitates manual inspection by platform staff in order for NCMEC and law enforcement to open a reported file without a warrant, or whether instead a hash match to known CSAM is sufficient to satisfy the private search doctrine. 

Fourth, we recommend legal changes that extend the time platforms must preserve reported content. On April 29, Congress passed the REPORT Act, which incorporates this recommendation by extending the preservation period to one year (among its other provisions); it now heads to the President’s desk for signing.

Shelby Grossman, Riana Pfefferkorn, and Sara Shah. Published courtesy of Lawfare.
