Instagram head pressed on lengthy delay in launching teen safety features like a nudity filter, court filing reveals

Lawyers in a lawsuit focused on whether social media apps like Instagram are addictive and harmful wanted to know why it took Meta so long to roll out basic safety tools, such as a nudity filter for private messages sent to teens. In April 2024, Meta introduced a feature that automatically blurs explicit images in Instagram DMs, something the company reportedly understood to be an issue nearly six years earlier.

In a newly unsealed deposition from a federal lawsuit, Instagram head Adam Mosseri was asked about an August 2018 email chain with Meta VP and Chief Information Security Officer Guy Rosen, in which Mosseri noted that “horrible” things could happen via Instagram private messages, also known as DMs. Those horrible things could include dick pics, the plaintiff’s lawyer said, and Mosseri agreed.

However, the Meta executive pushed back on the line of questioning, which suggested the company should have informed parents that its messaging system wasn’t monitored beyond removing CSAM (child sexual abuse material).
