
Break Up Facebook But Don't Expect Youth Mental Health To Rebound

Haugen's "love" for Facebook causes her to miss the bigger picture -- she wants the same solution as Zuckerberg

Embattled social media network Facebook has faced a deluge of criticism recently, which tied in neatly with the roughly six-hour blackout of the company's websites on Monday, October 4. Users across Twitter celebrated their inability to access Facebook, and some media outlets even reported that Facebook's "master code" had been deleted, prompting further celebration. It is unclear how anyone thought it possible to delete a social media platform's "master code," and sure enough, Facebook and its subsidiaries WhatsApp and Instagram returned by day's end.


The outage cost Facebook approximately $164,000 in revenue per minute. The stock declined almost 6% before ending the day at -4.87%, wiping out roughly $40 billion in market cap and handing CEO Mark Zuckerberg a personal loss of $6 billion. Not fantastic, but the other news that dropped for Facebook on Sunday night, before the outage, may prove longer-lasting and harder for the company to spin than a short blackout.


Frances Haugen revealed her identity on 60 Minutes as the Facebook whistleblower behind a series of documents she shared with regulators and the Wall Street Journal detailing Facebook's awareness of its impact on youth mental health. The former Facebook product manager, who worked on "civic integrity" issues at the company, released internal research essentially proving Facebook knew its social media was toxic for teen girls.

Leaked research from Facebook

But I also don't want to overstate the findings from Facebook's leaked research. Much of it appears to rely on fairly small sample sizes, and the results were not wholly negative; plenty of respondents reported that Instagram and Facebook made them feel better about themselves. The research seems as shoddy as most corporate-funded self-research, only this time the media pounced on the fact that Facebook, for the first time, at least admitted some awareness of its negative impact on teens. Facebook spokesman Andy Stone responded to the leaks:

"Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true."


Other details emerged from the leaks, all of which you can read at the Wall Street Journal's Facebook Files hub, including stories about privileged accounts evading community guidelines, Facebook's algorithms sowing discord to increase engagement, and drug cartels and human traffickers openly using the platform. However, Instagram's mental health effect on teenage girls has drawn the most coverage and was the impetus for the Senate subcommittee hearing on Tuesday, October 5.


Ms. Haugen appears to be somewhat self-serving or, at the very least, ignorant of the bigger issue. She isn't a whistleblower afraid for her life; she is simply well-off, chose to work at several big tech companies, including Alphabet, and decided to release documents proving Facebook is aware of some of its harms. Who cares? The world already knew Facebook was bad for teenagers' mental health. While still inconclusive, we have pretty damn good studies on the effect from third-party researchers, like the following:

A meta-analytic review of the relationship between social media use and body image disturbance

Social networking site use and self-esteem: A meta-analytic review

Is social network site usage related to depression? A meta-analysis of Facebook–depression relations


Why does it matter that Facebook also knew of the damage? It doesn't. The company could be entirely ignorant of the problem, and the problem would be no less real. The leak is a public relations ploy to get Facebook to change its business model, which is clear from Haugen's messaging:


"I don't hate Facebook," she wrote. "I love Facebook. I want to save it."


Nobody who "loves" Facebook is in any position to fix the social ailments caused by its existence. The PR ploy will likely fail, as all other media relations schemes against Facebook have failed; Facebook is the best in the business at simply ignoring awful media. Besides, legislation on the issue is unlikely. It is absurd to expect bipartisan legislation when Democrats want more censorship of disinformation and Republicans want to spread disinformation. This case has received attention primarily for side-stepping the usual political battle and making it about the children. Still, it most likely will not result in any policy change.


Regardless of the outcome, what is the solution to Haugen's issues with Facebook? Haugen testified before the Senate that Facebook's algorithms should not be protected by Section 230, the rule that famously shields social media companies from liability for the content their users post. Her specific argument was that Facebook should not be allowed to use the "amplification algorithms" and "engagement-based ranking" she deemed the culprits behind the harmful content driving poor self-esteem among teens.


Essentially, Haugen's policy recommendation is the same as Facebook's: she wants a new regulatory office specific to social media. "Right now, the only people in the world trained to analyze these experiences are people who grew up inside of Facebook," Haugen said. You can see the same proposal from Facebook's VP of Global Affairs back in May 2021, in which he calls for a "new digital regulator" for the space, similar to the Federal Communications Commission. Facebook has been asking for more oversight for years now, which makes the fairly pedestrian leaked records look almost like part of normal daily operations.


When have you ever heard a company speak so softly against a whistleblower as this: "We don't agree with her characterization of the many issues she testified about," wrote Pietsch, Director of Communications at Facebook. "Despite all this, we agree on one thing; it's time to begin to create standard rules for the internet. It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act."

A Data Protection Agency may be necessary and helpful in some ways (I don't think anyone defends Facebook's current practices). Still, it is not the solution to the lack of accountability Facebook faces for its actions. That should be clear from the simple fact that Facebook wants the agency as well. Matt Stoller, a crusader against monopolization, has a good article on the potential dangers of such a "solution":

Yet the concentration of power in the hands of a small group is the fundamental political and economic problem with Facebook. We have never allowed one man to set rules for communication networks that structure the information ecosystem of billions of people. But that is the situation we're in. We have to radically decentralize this power. But a regulatory overlay in some ways would worsen the problem, because it would explicitly fuse political control with market power over speech and it would legitimize the dominant monopoly position of Facebook. -Stoller

I am somewhat sympathetic to bigness and centralization, but this argument is correct. Opening a new agency that exposes social media to even more political influence, then staffing it with the very people who created the problem in the first place, does not seem like a great idea. Seriously, the following line is either self-serving, ignorant of power dynamics, or both:

"Right now, the only people in the world trained to analyze these experiences are people who grew up inside of Facebook or other social media companies," said Haugen. "There needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this," she said.


Stoller makes a good case for breaking up Facebook, but even that probably falls short of a real impact on youth mental health. It would go much further than what Haugen advocates, as it establishes boundaries on the power at play and helps limit the reach of a giant like Facebook. Facebook should have to compete with Instagram and WhatsApp, not simply purchase every competitor. The added competition from anti-monopolization policies would likely incentivize social media platforms to differentiate themselves from "bad" social media companies through privacy and child safety features. Those anti-monopolization practices should heal some of society's ailments by limiting the power of big tech.

Unfortunately, for many social media sites, the mental health issues and body dysmorphia seem baked into the prevalent engagement tools; in other words, the system of likes, dislikes, and even the mere act of using the platforms is the cause of the issue at hand.


Maybe Facebook could get rid of likes or engagement metrics entirely, which could help teens focus less on trivial bullshit. I tend to agree with the Facebook in Haugen's head: "Facebook wants you to believe that the problems we're talking about are unsolvable. They want you to believe in false choices," Haugen said. Instagram does appear to be causing some real problems for society. Growing pains or not, it does not seem healthy for kids to grow up constantly socially engaged. We have pretty good research on the deleterious effects of social media on teenagers that goes far beyond shallow algorithm shaming.

Breaking up Facebook is a great idea that helps solve many other problems, like the outsized power the company holds in America. Setting up a Data Protection Agency staffed with former Facebook employees to specifically police algorithms is insipid legislation that Facebook itself wants, and it paves a road to further political polarization. The difficulties teenagers face today are different from those any prior generation has met. The constant socialization, lack of private life, and thus constant popularity contest are hardwired into Facebook, Twitter, and Instagram. It will take a drastic overhaul of the current social media landscape to lower the rapidly rising rates of anxiety and depression among US teens.