Wake Up Before They Build the Next Censorship Frontier

“The War on Free Speech Isn’t About Safety—It’s About Control.”

By Corey · Published 5 months ago · 5 min read

There’s a growing feeling in the air: something in our culture has shifted, and not by accident. If you’ve ever caught yourself thinking, “Why does everything feel staged? Who benefits from all this division?” — you’re not alone. For people who are only just waking up (or who still think they’re comfortably asleep), this is a straightforward warning: a powerful mix of tech, media, money, and politics is shaping what you see, how you feel, and what counts as “acceptable” thought. If that sounds dramatic — good. Wake the fuck up. The cost of passivity is much higher than your comfort.

Not a conspiracy — a system of incentives

This isn’t a single cabal whispering orders in a dark room. It’s worse in one way and better in another: it’s a set of converging incentives. Big tech wants attention. Corporations want brand safety and predictable markets. Governments want stability and compliant citizens. Media organizations want clicks and sponsors. Activist ecosystems want influence and funding. When all these incentives line up, the result looks coordinated — because outcomes repeat: the same framings, the same taboos, the same narratives.

Why does that matter? Because when multiple powerful players independently prefer the same narratives, dissenting voices get sidelined. It doesn't always take dramatic censorship: algorithms bury content, moderation decisions are driven by risk management, and cultural pressure privately silences people who don't toe the line.

How algorithms and outrage profit off us

The modern attention economy isn’t neutral. Algorithms learn what keeps us staring, clicking, and sharing. Outrage and moral certitude consistently win. That means identity-based narratives — those clean, tribal stories that explain who’s “good” and who’s “bad” — travel faster than nuance. Memes and snappy takes replace context. The net effect is a social environment where complex issues are flattened, and tribal pressure punishes curiosity.

Platforms aren’t neutral referees here. They’re businesses designed to maximize engagement. That makes them incredibly susceptible to feedback loops that privilege the loudest, angriest, most shareable content. Welcome to modern cultural engineering — not necessarily planned top-down, but massively effective all the same.

Why “they” don’t need to be evil to be dangerous

If you’re waiting for a villain with a monogrammed cape, you’ll be disappointed. The risk comes from systems that optimize for narrow goals: engagement, ad revenue, brand safety, or political stability. Each decision made to protect one of those goals shifts the public square. Remove a few voices labeled “misinformation,” tighten a policy to appease advertisers, tweak the algorithm to reduce noise — and suddenly large swathes of discourse feel off-limits. That metastasizes into self-censorship: people stop speaking to avoid being cancelled, shadowbanned, or losing career opportunities.

And when the system starts moving in one direction — say, toward favoring certain identity framings — the backlash fuels radicalization on the other side. That’s the spiral no one wants: more censorship leads to louder reactionary identity politics, which in turn justifies further control. It’s a cycle that ends with fewer voices, more extremes, and less trust.

When “Protecting Kids” Means Controlling Adults

Look at what’s unfolding globally right now.

In the UK, the Online Safety Act was sold as a shield against harmful content and a way to “protect children.” Buried in the details are sweeping powers for regulators to force platforms to verify age — effectively requiring ID checks for ordinary people just to access parts of the internet.

In Australia, lawmakers are preparing similar digital ID requirements for social media accounts, again under the banner of child protection. On paper it sounds noble. In practice, it means that anonymity — one of the last protections ordinary citizens have from surveillance, retaliation, or doxxing — gets dismantled.

In Canada, Bill C-11 and related measures give government agencies power to influence what content is surfaced to citizens, effectively making Ottawa a gatekeeper of discoverability online.

In the European Union, the Digital Services Act already forces platforms to hand over data and comply with state directives on “harmful” or “illegal” speech — categories that shift depending on who’s in charge.

Let’s be clear: governments are not banning or restricting social media to “save the children.” They are creating a precedent for total identity-linked internet use. Once you attach your passport or driver’s license to your online life, every comment, every search, and every “like” becomes traceable. If they don’t like what you say? They won’t just silence your post — they can silence you.

What you can do — practical, non-nihilistic steps

If you care about keeping a free, genuinely pluralistic public square, do more than complain. Here are concrete moves anyone can make:

Diversify your media diet. Subscribe to outlets across the spectrum, follow long-form journalists, and go to primary sources (court documents, academic papers, original interviews). Algorithms don’t get to curate your reality alone.

Check incentives. When you see a viral narrative, ask who benefits if it’s true, and who benefits if it’s false. Follow the money and the institutional incentives before you take the headline as gospel.

Support independent journalism. Pay for reporting that isn’t ad-driven. Subscription-funded outlets and nonprofit investigative teams do the deep work that algorithms destroy.

Use privacy and decentralization tools. Small tech moves — better privacy settings, alternative social platforms, and decentralized tools — reduce your dependence on a few corporate gatekeepers.

Build local epistemic communities. Meet with people in real life. Create spaces where disagreement is allowed and curiosity is rewarded. Algorithmic echo chambers don’t thrive when you have accountable, face-to-face networks.

Demand transparency and appeal mechanisms. Push platforms and policymakers for clear rules, consistent enforcement, and independent audits. If rules exist, they should be public and reviewable — not secretive.

Speak, but speak well

Anger is honest. So is urgency. But to win minds — especially those who are easily offended or still comfortably asleep — your message needs to be both fierce and clear. Avoid lazy scapegoating. Focus on systems, incentives, and practical fixes. When you point to the machinery and offer solutions, you move people from righteous indignation to action.

Final thought

We aren’t going back to a golden age where every idea could be aired and judged calmly — if that age ever existed. What we can do is slow the trend toward consolidated control and manufactured consent. That requires attention, skepticism, and effort. If you’re reading this, consider it a call: don’t outsource your thinking. Don’t let strangers on a feed tell you which truths are allowed. Wake the fuck up. The future of free thought depends on it.

Tags: humanity, social media

About the Creator

Corey

Curious thinker, observer of the human condition.
