Meta executives moved forward with plans to implement end-to-end encryption across Facebook Messenger and Instagram Direct despite internal warnings that the change could significantly reduce the company’s ability to detect and report child exploitation cases, according to court documents filed in New Mexico.
The internal communications, revealed in a lawsuit brought by New Mexico Attorney General Raúl Torrez, show senior policy leaders expressing alarm ahead of CEO Mark Zuckerberg’s 2019 announcement promoting encrypted messaging. In one exchange, Meta’s Head of Content Policy, Monika Bickert, warned, “We are about to do a bad thing as a company,” criticizing what she described as overstated claims that safety standards could be maintained under encryption.
End-to-end encryption ensures that only the sender and recipient can read messages, a privacy feature widely used in apps such as WhatsApp, Apple’s iMessage, and Google Messages. However, child safety advocates argue that integrating encryption into social media platforms like Facebook and Instagram increases risks by limiting proactive monitoring for child abuse, sextortion, terrorism threats, and school violence.
According to internal briefing documents cited in the case, Meta estimated that if Messenger had been encrypted in 2018, reports of child nudity and sexual exploitation imagery to the National Center for Missing and Exploited Children would have dropped by 65%, from 18.4 million to 6.4 million. Additional projections suggested the company would have been unable to proactively provide data to law enforcement in hundreds of child exploitation and sextortion investigations.
The lawsuit alleges Meta misrepresented the safety implications of its encryption rollout and failed to adequately protect minors from online predators. The trial marks the first jury case of its kind against Meta related to child safety and human trafficking risks.
Meta has stated that concerns raised in 2019 led to the development of enhanced safety tools before encrypted messaging was fully launched in 2023. The company says users can still report abusive content, and new protections restrict adults from messaging minors they do not know.
As global scrutiny intensifies over social media and youth mental health, the outcome of the New Mexico case could have significant implications for online privacy, platform accountability, and child protection laws.