Facebook didn’t just give people a platform to share their opinions—it reprogrammed them to believe that their opinions matter more than they actually do. Before social media, most people understood that their personal thoughts on politics, science, or culture weren’t necessarily worth broadcasting to the world. But Facebook changed that by gamifying validation and distorting the way people perceive expertise, discourse, and their own intellectual weight.
1. The Like Button and the Instant Gratification Loop
Facebook’s biggest trick was introducing the Like button—a simple feature that turned opinions into currency. Suddenly, a half-baked thought about politics or a conspiracy theory about vaccines wasn’t just a passing remark—it was something that could earn engagement, approval, and social status. People started associating being right with being popular, even when they weren’t actually saying anything insightful.
The algorithm fed this by rewarding outrage and certainty. A nuanced take? Ignored. A loud, divisive, simplistic opinion? Boosted. Over time, people learned that strong opinions—no matter how uninformed—got more attention than thoughtful ones.
2. The Death of Expertise
Before social media, there was a natural understanding that some people knew more than others about certain topics. Doctors knew more about medicine. Journalists knew more about current events. Academics knew more about history.
Facebook blurred those lines. By putting a Harvard epidemiologist’s analysis of vaccines next to a high school dropout’s conspiracy rant, it made every opinion seem equally valid. Add in the way people could gather in echo chambers (Facebook Groups, algorithmic feeds), and suddenly, expertise wasn’t just devalued—it was actively resented.
When people got validation from their peers for their bad takes, they doubled down. Facebook encouraged this by showing people more of what they agreed with and less of what challenged them, so they felt increasingly confident in their misguided beliefs.
3. Turning Every Opinion into an Identity
Facebook didn’t just train people to overstate the value of their opinions—it made their opinions part of their identity. Instead of seeing ideas as things to be debated or adjusted over time, people began treating them as who they are.
- If you believed climate change wasn’t real, that wasn’t just your opinion—it was your team’s stance, and any challenge to it felt like a personal attack.
- If you were convinced of a political conspiracy, you weren’t just curious—you were part of a movement of “truth-seekers.”
- If you posted an anti-vaccine rant and got hundreds of likes, that wasn’t just engagement—it was proof that you were onto something.
This is how Facebook turned casual opinions into rigid, unshakable beliefs. People no longer changed their minds when presented with evidence because their opinions became tied to their social standing.
4. The Algorithm as an Opinion Accelerator
Facebook didn’t just show people what they liked—it amplified it. If you expressed a mild opinion, the algorithm would show you more extreme versions of it to keep you engaged.
- Clicked on one post about immigration? Here’s a flood of content about “illegals” ruining the country.
- Watched one video questioning vaccines? Here’s a pipeline straight into full-blown anti-vax conspiracy theories.
- Liked a post about free speech? Here’s a dozen posts claiming conservatives are the most oppressed group in America.
This feedback loop pushed people toward more extreme, confident, and self-important versions of themselves.
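The drift described above can be sketched as a toy simulation. Everything here is an illustrative assumption, not Facebook's actual ranking code: if predicted engagement rises with how extreme a post is, and the feed always surfaces the most engaging items within a user's current range of interests, then repeated exposure ratchets the user toward the extreme end.

```python
# Toy model of an engagement-maximizing feed. All names, scores, and
# thresholds are hypothetical, chosen only to illustrate the feedback loop.

# Candidate posts, scored by "extremity" from 0 (mild) to 10 (extreme).
pool = list(range(11))

def predicted_engagement(extremity: int) -> float:
    """Assumption: more extreme content earns more clicks and comments."""
    return 1 + 9 * extremity / 10

def rank_feed(candidates, top_k=2):
    """Surface the top_k items with the highest predicted engagement."""
    return sorted(candidates, key=predicted_engagement, reverse=True)[:top_k]

taste = 2          # user starts mildly opinionated
history = [taste]
for _ in range(5):
    # Only posts "near" the user's current taste are plausible candidates,
    # but ranking still favors the extreme end of that window.
    candidates = [x for x in pool if abs(x - taste) <= 3]
    shown = rank_feed(candidates)
    # Assumption: exposure shifts the user's taste toward what was shown.
    taste = sum(shown) // len(shown)
    history.append(taste)

print(history)  # taste only ratchets upward, never back toward mild
```

Under these toy assumptions the user's position climbs from 2 to near the top of the extremity scale in a handful of feed refreshes, without the ranker ever doing anything other than maximizing engagement one step at a time.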
The End Result: A World Where Everyone Thinks They’re a Thought Leader
Facebook took normal people, inflated their sense of importance, and made them feel like public intellectuals without requiring them to know anything. It trained them to value attention over accuracy, to mistake engagement for expertise, and to see every disagreement as a war.
The result? A society where the loudest voices often belong to the least-informed people—because Facebook made them believe they were smarter, more insightful, and more important than they actually are.