
Mark Zuckerberg’s apologies won’t keep kids safe online | Editorial

Despite rare bipartisan support to do more to protect children, the federal government has repeatedly failed to enact meaningful change or hold tech companies accountable.

Mark Zuckerberg, chief executive officer of Meta, has been saying he’s sorry since 2007 for his company's many failures, writes the Editorial Board, but the only visible change has been a corporate rebranding. (Photo: Kent Nishimura / Bloomberg)

How many children must die before social media companies do something about their role in the sharp rise in suicides, eating disorders, depression, and bullying that has impacted so many young people?

How many more hearings will Congress hold before it acts?

Executives from Meta, the company behind Facebook and Instagram, have testified 33 times since 2017 on issues ranging from election interference to social media’s role in the insurrection on Jan. 6, 2021. Yet nothing has been done.

How many halfhearted apologies will Facebook founder Mark Zuckerberg offer before he actually does something about how his company harms children and damages democracy?

» READ MORE: Social media companies must curtail the spread of misinformation | Editorial

Zuckerberg’s apology to parents received most of the attention at a congressional hearing last week about the effects of social media on the lives of young people. But Zuckerberg’s mea culpa is meaningless until the billionaire takes responsibility for the damage his company has wrought.

The creator of Facebook has been saying he’s sorry since 2007 for the site’s failures to crack down on fake news, hate speech, and lax privacy controls. Through it all, the only visible change was the renaming of the site’s parent company from Facebook to Meta after the original name became so tarnished.

Zuckerberg has known for years that children using Facebook and Instagram have been frequent targets of sexual harassment. An internal company presentation in 2021 estimated that 100,000 minors a day received pornographic photos and other sexually abusive content.

Facebook’s own internal research found that young girls, in particular, blamed Instagram for increased rates of anxiety, depression, and suicidal thoughts. Those conclusions echoed the findings of other studies over the last 10 years.

In 2021, a former Facebook executive turned whistleblower said the company repeatedly puts profits above safety. A second whistleblower testified last year that the company has known its social media platforms harm children.

The heads of TikTok, X, Snap, and Discord also testified before Congress last week. Missing were the heads of YouTube and Apple, which have also been blamed for harming children.

Snap CEO Evan Spiegel also apologized to parents whose children died from fentanyl overdoses after buying the drugs through the platform. (As with Zuckerberg, it was not Spiegel’s first public apology.)

Despite rare bipartisan support to do more to protect children, the federal government has repeatedly failed to enact meaningful change, even after the surgeon general warned last year that social media presented a risk to the mental health of teens.

Dozens of past bills have foundered after lawmakers could not agree on the details amid intense lobbying by the tech industry. Even the House committee that investigated the Jan. 6 insurrection found extensive evidence of the role social media played in stoking extremism but did not include it in its final report.

Several red and blue states have introduced dozens of bills designed to rein in social media companies. But most measures have been blocked by courts. After Montana banned TikTok, a federal judge ruled the measure likely violated the First Amendment.

Federal regulation makes more sense than a patchwork of state bills. The European Union enacted a landmark measure last year that forced social media companies to combat misinformation. But more must be done to protect children.

» READ MORE: TikTok got me through the pandemic. Then its algorithm turned on me. | Opinion

As lawmakers in Washington fiddle, dozens of states — including Pennsylvania and New Jersey — are suing Meta, accusing Facebook and Instagram of fueling a youth mental health crisis.

School districts across the country, including in Pennsylvania, have also sued Instagram, Snapchat, TikTok, and YouTube for allegedly harming children.

Dozens of parents are suing Snapchat for enabling the sale of illegal drugs that led to the deaths of their children. A judge allowed the case to move forward, but it remains to be seen how far it will get.

That’s because social media companies have long hidden behind a provision of the Communications Decency Act known as Section 230, which shields platforms from legal liability for what third parties say or do.

Lawmakers on the left and right have called for the repeal of Section 230. Such a move would upend the internet, and the U.S. Supreme Court recently dodged the issue.

But doing nothing is not a solution. At some point, social media platforms must be held accountable.