Have You Seen That Video on TikTok?
- jpaget54
- 23 hours ago
- 3 min read

“Social media is the single most powerful mind-altering technology ever created.” – Jaron Lanier
We hear it everywhere: “Have you seen that video on TikTok?” The platform no longer just hosts conversations; it shapes them. A clip goes viral, a phrase spreads, and suddenly it isn’t just content, it’s grammar. TikTok has become a language we speak.
And within that language, hate has found fluency. The far right understand the rhythm of attention. Their misinformation is packaged as certainty: life feels uncertain, things are going wrong, and those people are to blame. Short, sharp, and endlessly repeatable, it slips easily into our speech.
Picture a teenager on the bus, headphones in, eyes fixed on the screen. In a five-minute scroll they move from a dance trend to a football clip to a video telling them asylum seekers are to blame for housing shortages. By the time they get home, they repeat it as if it were common knowledge. What starts as a video becomes a reference, then a worldview.

We have seen how words, repeated often enough, can prepare the ground for violence.
In Rwanda in 1994, the radio station RTLM was not just background noise. It was a steady drumbeat of propaganda. Neighbours were described as “cockroaches”, and the lies were repeated daily until they felt like truth. By the time the killing began, words had already stripped people of their humanity.
In Myanmar, decades later, Facebook played a different role. Posts and messages spread false rumours about the Rohingya: that they were dangerous, that they did not belong. These lies moved so quickly and so widely that they no longer felt like opinion; they felt like fact. Violence followed, justified by words people had already absorbed through their screens.
One case shows us how propaganda corrodes humanity over time. The other shows how speed and scale can turn rumour into fact overnight. Together, they remind us: media, when weaponised, does not remain abstract. It prepares minds. It shifts what feels normal. It turns prejudice into everyday speech and violence into the next logical step.
The UK Is Not Immune
We cannot pretend this only happens elsewhere. In recent riots across Britain, misinformation spread on TikTok and Telegram played a direct role in mobilising crowds. Rumours about asylum seekers circulated faster than police statements could correct them. By the time the truth emerged, the damage was done: communities left shaken, neighbours divided, trust eroded.
Online hate is not “just words”. It creates the atmosphere in which violence feels justified. And we are already seeing the consequences in our own streets.

The Psychology of the Scroll
These platforms are not neutral. The endless scroll, the short burst of video, the ping of a notification: none of it is accidental.
It is built on the same principle as casinos: intermittent rewards. You never know which swipe will deliver something shocking, funny, or enraging. And outrage spreads fastest of all; neuroscience suggests anger lights up the same reward pathways in the brain as pleasure.
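The mechanics are simple enough to simulate. The toy model below is purely illustrative (the patience threshold, probabilities, and quitting rule are assumptions, not any platform's real code): it compares a predictable reward schedule with an intermittent one that pays out at the same average rate.

```python
import random

PATIENCE = 5         # consecutive unrewarded swipes before the user gives up
MAX_SWIPES = 10_000  # safety cap on session length

def variable_session(reward_prob: float) -> int:
    """Intermittent schedule: each swipe pays off at random, like a slot machine."""
    swipes, dry = 0, 0
    while dry < PATIENCE and swipes < MAX_SWIPES:
        swipes += 1
        if random.random() < reward_prob:
            dry = 0   # an unpredictable hit resets the urge to leave
        else:
            dry += 1
    return swipes

def fixed_session(every_n: int) -> int:
    """Predictable schedule: a guaranteed payoff on every Nth swipe."""
    swipes, dry = 0, 0
    while dry < PATIENCE and swipes < MAX_SWIPES:
        swipes += 1
        if swipes % every_n == 0:
            dry = 0
        else:
            dry += 1
    return swipes

random.seed(1)
trials = 10_000
# Both schedules pay out once per ten swipes on average.
variable = [variable_session(0.1) for _ in range(trials)]
fixed = [fixed_session(10) for _ in range(trials)]
print(f"variable schedule: mean {sum(variable) / trials:.1f} swipes, longest {max(variable)}")
print(f"fixed schedule:    mean {sum(fixed) / trials:.1f} swipes, longest {max(fixed)}")
```

Even in this crude model, the unpredictable schedule holds attention longer on average and occasionally produces marathon sessions, while the predictable one is abandoned at the same point every time.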
But if an app is designed like a casino, what does that make us when we play it? Customers? Or players in a system designed to keep us hooked?
As the philosopher Marshall McLuhan warned, we do not simply use tools; tools use us. When apps reward anger, division becomes profitable. When algorithms thrive on conflict, peace becomes invisible.
What Platforms Can Do
Social media companies cannot present themselves as passive hosts while profiting from division. Like any other industry, they should be accountable for the harm built into their design.
Some steps already exist:
Fact-checking at the point of contact – Meta works with independent fact-checkers, and TikTok has trialled labels on misleading election content.
Algorithmic transparency – YouTube has cut recommendations of extremist material by over 70% since 2019 by altering its systems.
De-amplifying harm – Facebook limits the reach of accounts that repeatedly spread misinformation (a simplified sketch of this kind of down-ranking appears below).
Partnerships with experts – WhatsApp works with fact-checkers in countries like Brazil and India to break the chain of viral lies.
Safer spaces by design – LinkedIn’s proactive moderation and clear standards create calmer spaces than platforms like X, where hate has risen sharply as rules were loosened.
The contrast is clear: online climates are not accidents; they are designed. If some platforms can reduce harm, then all can. And when they fail, they must be held accountable, just as any other industry would be if its products caused predictable damage.
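To make “de-amplifying” concrete, here is a minimal sketch of the idea. The account names, scores, and penalty factor are invented for illustration; no platform publishes its real ranking code. The point is structural: a ranking function can down-weight repeat offenders, so the most “engaging” post is not automatically the most visible.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    engagement: float    # predicted engagement score from a ranking model
    author_strikes: int  # times independent fact-checkers have flagged the author

def deamplified_score(post: Post, penalty: float = 0.5, cap: int = 3) -> float:
    """Down-rank, rather than remove, posts from repeat spreaders of misinformation.

    Each fact-check strike halves the post's effective reach, up to a cap,
    so flagged accounts surface less often without being deleted outright.
    """
    return post.engagement * penalty ** min(post.author_strikes, cap)

feed = [
    Post("neighbourhood_news", engagement=0.80, author_strikes=0),
    Post("outrage_farmer", engagement=0.95, author_strikes=3),
]
for post in sorted(feed, key=deamplified_score, reverse=True):
    print(f"{post.author}: ranked at {deamplified_score(post):.3f}")
```

In this sketch the more engaging but repeatedly flagged account ends up ranked below the honest one: exactly the trade-off between raw engagement and harm reduction that platforms control by design.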

The Choice Ahead
Technology does not create hate, but it magnifies it. It accelerates it. It makes poison sound like common sense.
The feed is powerful, but not inevitable. It is a system of design, and design can be changed.
The question is not whether platforms shape us. They already do. The question is: what kind of world do we allow them to design, and what kind of language do we agree to speak?
Because silence is also a choice.