Meta is facing major allegations of negligence on child safety. CEO Mark Zuckerberg and his team are accused of misleading the public while targeting young users and ignoring the risks to children. This alleged negligence has led to lawsuits filed by the attorneys general of dozens of states.
Why did Meta fail on child safety?
In April 2019, Meta executive David Ginsberg presented CEO Mark Zuckerberg with a research proposal to reduce loneliness and compulsive use on Instagram and Facebook. Ginsberg noted that the company was under scrutiny, particularly around problematic use and addiction and their impact on young people. However, the project went unfunded, with staff shortages cited as the reason. Adam Mosseri, the head of Instagram, also declined to fund it.
Years after Ginsberg’s proposal was rejected, Meta is facing lawsuits filed by the attorneys general of 45 states and the District of Columbia that accuse the company of negligence on child safety. The lawsuits allege that Meta unfairly trapped teens and children on Instagram and Facebook and misled the public about the dangers. Raúl Torrez, New Mexico’s attorney general, said Mark Zuckerberg was at the center of decisions targeting young users.
On Meta’s platforms, children and young people are exposed to sexual harassment, bullying, body shaming and compulsive use. U.S. Surgeon General Dr. Vivek H. Murthy has called for warning labels on social networks, stating that these platforms pose a public health risk to young people. His warning could build momentum for Congress to pass the Kids Online Safety Act.
Meta has long struggled to attract and retain young users. After Snapchat’s growth threatened Instagram in 2016, features like Instagram Stories were introduced at Mark Zuckerberg’s behest. Efforts to reduce the risks to young users, however, have often fallen short. In 2017, for example, when Instagram co-founder and then-CEO Kevin Systrom requested more staff to mitigate harm to young users, Zuckerberg turned him down, saying that Facebook faced bigger problems.
Internal documents reveal that Meta knew four million children under the age of 13 were using Instagram. The company’s registration process, which allowed users to lie about their age, allegedly violated the federal Children’s Online Privacy Protection Act (COPPA). The documents, disclosed by whistleblower Frances Haugen in 2021, showed that the company put profit ahead of safety. While Meta claims to have taken precautions for young users, in practice these measures fall short.
Instagram’s beauty filters have sparked internal debate over their negative impact on young people’s mental health. The face-changing camera effects, introduced in 2017, were intended to attract young users. But the arrival of surgery-like filters in 2019 raised concerns that they could worsen body image disorders among young women.
Although Meta temporarily banned such filters, it considered lifting the ban under competitive pressure. Mark Zuckerberg has said he finds it paternalistic to block the use of such filters. The company eventually announced a ban on filters that promote cosmetic surgery, but these measures have not fully relieved the pressure on young users’ mental health.
Meta’s neglect of child safety, and the risks to which it exposes the young users it targets, tarnish the company’s reputation. Meta should abandon these practices that jeopardize the safety of our children and provide a safer online experience for young users. What do you think? Please don’t forget to share your thoughts with us in the comments section below.