Meta's Efforts to Restrict Harmful Content
Overview
Meta, the parent company of Instagram and Facebook, is restricting harmful content on teen accounts, including material related to self-harm, graphic violence, and eating disorders. The company is automatically limiting such content and sending notifications encouraging teens to update their privacy settings. The move comes amid heightened scrutiny and a lawsuit from more than 40 states accusing Meta of contributing to young users' mental health problems.
Ask a question
How does this news reflect the ongoing debate over the responsibility of tech companies for the well-being of their users?
How might the restrictions on harmful content impact user engagement and platform usage?
What measures can Meta take to address the concerns raised by states and the public regarding the impact on young users?
[Chart: Article frequency and coverage, Oct 2023 – Dec 2023]