Facebook is a public health hazard.
What Frances Haugen’s testimony tells us about what should come next.
On Tuesday, former Facebook data scientist Frances Haugen testified before a Senate subcommittee, putting words to what we all knew but couldn’t quite articulate: Facebook is the modern-day Philip Morris, the tobacco giant that knowingly and intentionally advertised and sold cigarettes to children. Indeed, harrowing accounts portray Facebook’s algorithm leading struggling teens down dangerous rabbit holes, promoting eating disorders and self-injury.
In simple, understandable terms, Haugen broke down the ways that Facebook’s algorithm intentionally feeds people content that prioritizes “meaningful social interactions,” which is fancy corporate talk for the most inflammatory, divisive, dangerous material from people inside your network. She broke down how Facebook targets young people — and how its own research demonstrated that their engagement with platforms like Instagram both drives addiction and damages their mental health.
For those who were with us at The Incision’s inception, you’ll remember that our very first post cut into the ways social media has driven disinformation and so much of modern society’s discontents. Haugen’s testimony brought back to public view why and how Facebook needs to be regulated. Here’s what it’ll take to get there.
We need a new crop of legislators — and a new agency.
One of the ongoing subplots of the effort to regulate Facebook and Big Tech is the fact that the elected officials responsible for regulation lack a basic understanding of Big Tech — or any tech at all.
Just last week, the chairperson of the subcommittee to which Haugen testified asked a Facebook executive, “Will you commit to ending ‘finsta?’” For the uninitiated, finsta is internet slang for a fake Instagram account meant only for close friends. Beyond the gauche usage of a term rarely spoken by septuagenarians, the question betrayed a clear lack of understanding of what the problem really is. It recalls a moment in 2018 when Republican Senator Orrin Hatch asked Facebook CEO Mark Zuckerberg, “So, how do you sustain a business model in which users don’t pay for your service?” Zuckerberg deadpanned, “Senator, we run ads.”
More generally, as the challenges facing 21st-century America accelerate, Congress simply needs more people who came of age in this century. Our system of politics, which tends toward ossification, concentrates power in a status quo that is necessarily older and further out of touch.
That said, they’re learning. The questions asked by Democratic Sen. Richard Blumenthal, the subcommittee’s chair, and Republican Sen. Marsha Blackburn, its ranking member, were thoughtful and articulate, and they elicited important information from Haugen. It’s clear they’ve hired a crack team of staffers … who actually use Instagram.
The challenges with Big Tech aren’t going anywhere. Facebook has opened a Pandora’s box. And given how ill-equipped our legislators are to deal with it, we need a new arm of government, akin to the Consumer Financial Protection Bureau or the Food and Drug Administration. The Federal Communications Commission, which should ostensibly have oversight power over Big Tech, was founded way back in 1934 to regulate radio. Needless to say, telecommunications have evolved since then. If regulation were left to the FCC, it would need a massive retool in scope, size, and capacity to oversee some of the world’s biggest and most powerful corporations effectively.
We need to differentiate between regulating Facebook’s antisocial behavior and its monopolistic practices — and solve both.
Facebook is, by far, the world’s largest social media company, comprising Facebook itself, as well as Messenger, Instagram, and WhatsApp. Facebook didn’t build the latter two; it simply bought them to stem the competition they posed.
And those are just the competitors it has purchased. Facebook has also attempted to copy others, like Snapchat and TikTok, out of existence. Indeed, it is working on a clone of the newly popular audio-based social network Clubhouse.
This has prompted a growing call to “break up” Facebook, championed by former Facebook co-founder Chris Hughes in an op-ed in the New York Times.
Coming out of Haugen’s testimony, some have argued that breaking up Facebook would simply limit our ability to regulate them effectively, creating multiple problematic social media companies out of one.
But this misses the point. We need to both regulate Facebook and break it up. One addresses the most blatant issue with Facebook’s behavior, and the other addresses the circumstances that created it. It is precisely because Facebook has used its market dominance to acquire, copy, and kill off its competitors that it hasn’t faced any market pressure to change. There are so few alternatives, after all. Perhaps advertisers would choose to advertise on a friendlier platform if others with the same reach even existed. The problem is they don’t.
There’s another aspect to this. When Facebook went dark for six hours on Monday, thousands of small businesses that rely on its tools did too. Facebook’s size and scale, coupled with its monopolistic behavior, have left companies that depend on social media to do business with few alternatives. This week’s outage showed just how crippling that can be.
Government regulation won’t be enough. We also need a consumer-led movement.
There’s a ton that the federal government can do to hold Facebook accountable for its behaviors. But it won’t happen until we change the dynamics of its business model.
Again, the history of cigarette regulation is helpful here. The tobacco industry has faced regulation at almost every level of government, from limitations on where it can sell or advertise its products, to requirements that it put public health warnings on its packaging, to sin taxes that raise the price of its products. It has also been forced to pay out millions of dollars in damages in class action lawsuits.
But while those regulatory and legal actions helped curb the industry’s power, the most lasting gut punch has come from consumers across the country quitting smoking or never picking it up in the first place. Smoking rates have plummeted because of the public recognition that smoking is dangerous — and that Philip Morris and its counterparts are selling poison.
If we are to truly curb the power of Facebook, it’s going to come from all of us withholding the eyeballs it so efficiently monetizes until it commits to being better. It’s going to take holding advertisers accountable for choosing to advertise with Facebook, and calling on them to put their advertising dollars elsewhere so long as Facebook does not change. This kind of consumer-led movement often precedes government action, creating an implicit permission structure for it.
Of course, this isn’t as straightforward as it is with a cigarette company, whose products offer no benefit, only harm, and which profits directly from their consumption. Thousands of companies rely on Facebook every day. And much of the content shared on Facebook truly does bring us together. But that’s exactly why this approach could be so powerful: Facebook has an alternative. Rather than remaining a troll cesspool that makes people feel bad about themselves and society, it can choose otherwise and remain extremely profitable.
It’s not that we need no Facebook — we just need a better Facebook. And to get there, perhaps you’ll give this post a like and a share?