Senate Judiciary Chair Targets Tech Giants Over Child Safety Reporting Failures
Senate Judiciary Committee Chair Chuck Grassley (R-IA) has opened a formal congressional inquiry into eight prominent technology companies, alleging that they have systematically failed to supply sufficient information to a cyber tipline designed to detect and combat the distribution of child sexual abuse material (CSAM) on their platforms.
Grassley's inquiry was triggered by reports from the National Center for Missing & Exploited Children (NCMEC), which alleged that the eight companies fall significantly short in both the quality and completeness of their CSAM reporting, including data related to generative AI.
The Eight Companies Under Scrutiny
The companies named in the inquiry are Meta, Amazon AI Services, TikTok, Snapchat, Discord, X.AI, Grindr, and Roblox. According to a press release issued by Grassley, these eight firms collectively submitted more than 17 million reports of suspected online child exploitation in 2025. However, they allegedly failed to provide NCMEC with critical location data and other identifying information about users and suspects.
NCMEC also accused the companies of failing to share CSAM found in AI training datasets and of not reporting what the organization describes as sadistic online exploitation targeting children.
These eight companies are central to the fight against CSAM: NCMEC confirmed that 81% of all reports received through its CyberTipline in 2025 came from them alone.
NCMEC's Stark Warning
NCMEC issued a pointed statement provided directly to Grassley, describing nearly three decades of effort to push platforms to improve their detection and reporting practices:
"For almost thirty years, NCMEC has worked tirelessly to combat online child sexual exploitation by attempting to persuade [platforms] to detect, report and remove child sexual exploitation on their platforms and improve the quality and substance of their CyberTipline reports."
The organization went further, noting that many large technology firms "regularly tout the number of reports they submit to the CyberTipline, but fail to disclose that millions of reports lack basic information." NCMEC warned that this pattern "leaves children unprotected online, subjects survivors to revictimization, enables sexual offenders to remain freely online and wastes valuable and limited law enforcement resources."
Company-Specific Findings
Grassley's press release included detailed statistics for each company's 2025 reporting activity. Key findings include:
- Meta submitted nearly 11 million reports of suspected online child exploitation to NCMEC's CyberTipline in 2025, but many of these allegedly contained consistency and quality issues that rendered them unusable for law enforcement investigators.
- Amazon AI Services submitted more than 1.1 million tips in 2025, yet allegedly none of them could be acted upon because Amazon failed to include location or suspect information.
- TikTok submitted 3.6 million reports, but allegedly reported incidents that were consistently unrelated to child exploitation. According to the press release, TikTok informed NCMEC that they "are working on other high-priority items and could not commit to a timeframe to correct this reporting issue."
What Grassley Is Demanding
Grassley said he was alarmed by the information NCMEC shared with him and is now demanding that all eight technology companies formally respond to NCMEC's allegations. He is also asking each firm to provide a detailed plan outlining how it intends to improve its handling of cyber tips through the remainder of the year.
How the Companies Responded
Several of the named firms issued statements in response to the inquiry.
Roblox's chief safety officer said the company is currently reviewing Grassley's letter and expressed commitment to "a productive dialogue with the Senator's office and NCMEC regarding our shared goal of keeping children safe online."
A Meta spokesperson stated that "child exploitation is a horrific crime and we work tirelessly to protect children from it, and to help bring the criminals involved to justice," adding that the company is "committed to constant improvement" and has already made some changes that NCMEC has acknowledged.
A Discord spokesperson cited a "longstanding, collaborative relationship" with NCMEC and said the company remains in regular communication with the organization to ensure it fulfills its reporting obligations.
Snap said it takes Grassley's concerns seriously and has "taken steps to strengthen our reporting processes, improve data quality, and help ensure law enforcement receives actionable information."
A Grindr spokesperson welcomed the opportunity to detail the company's child protection measures, noting that "Grindr is exclusively for adults aged 18 or over," that the company maintains a substantial moderation team to identify and ban accounts that discuss topics related to minors, and that it employs AI and machine learning technology for proactive identification. The spokesperson added that the company "take[s] preventing CSAM with the utmost seriousness."
Representatives from Amazon AI Services, TikTok, and X.AI had not responded to requests for comment at the time of publication.
Broader Implications
The inquiry highlights an ongoing tension between the scale at which technology platforms operate and the quality of information they provide to authorities working to protect children. While the raw volume of reports submitted by these companies is substantial, NCMEC's assessment suggests that quantity alone is not sufficient — and that incomplete reporting may actually hinder, rather than help, law enforcement efforts. Grassley's intervention signals growing congressional pressure on the technology industry to take child safety reporting obligations more seriously.
Source: The Record