The 2016 election and the political turmoil that followed marked a turning point in public understanding of influence operations. Whatever one makes of the Steele dossier itself, the investigations and reporting of that period revealed a key insight: foreign actors were not merely trying to support a single ideological camp. They were often trying to inflame all camps at once.

The goal wasn’t persuasion so much as destabilization: amplify anger, deepen mistrust, and make democratic societies feel ungovernable. One of the clearest lessons from that era is that influence campaigns thrive not by inventing divisions from scratch, but by exploiting divisions that already exist.
## The Power of Microtargeting
Modern platforms allow advertisers and bad actors to target audiences with astonishing precision.
Not “the public”, but narrower slices of it:
- young men angry about cultural change
- communities fearful about safety
- activists outraged by injustice
- people primed for conspiracy narratives
- voters who feel ignored
This is where social-media influence becomes uniquely dangerous. Unlike a TV broadcast, microtargeted content is often invisible to everyone except the recipient. Two people can live in the same city, scroll the same platform, and experience completely different political realities.
| Step | Example | Effect |
|---|---|---|
| Data Collection | You watch a video about the housing crisis | The platform learns you care about housing |
| Audience Segmentation | You are tagged as a “concerned citizen” | You start receiving politically charged posts |
| Content Delivery | An anger-inducing meme or story appears in your feed | Outrage drives engagement |
| Algorithmic Amplification | The most engaging posts are boosted | Even fringe extremist posts gain visibility |
| Societal Impact | Division and distrust spread | People feel everyone else is against them |
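To make the first two rows of the table concrete, here is a deliberately simplified sketch of how observed behaviour might be turned into targetable segments. Every name, score, and threshold below is invented for illustration; real platform systems are far more elaborate, but the underlying logic (signals in, segments out) is the same.

```python
# Hypothetical sketch: behavioural signals -> targetable audience segments.

from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    # Behavioural signals the platform has observed (e.g. topics of videos
    # watched), mapped to a simple 0-1 interest score.
    interest_scores: dict = field(default_factory=dict)

def assign_segments(profile: UserProfile, threshold: float = 0.6) -> list:
    """Place a user into segments an advertiser (or a bad actor) can target."""
    segments = []
    if profile.interest_scores.get("housing", 0.0) >= threshold:
        segments.append("concerned_citizen_housing")
    if profile.interest_scores.get("public_safety", 0.0) >= threshold:
        segments.append("safety_fearful")
    return segments

# One video about the housing crisis is enough to tip the score.
user = UserProfile("u123", {"housing": 0.7})
print(assign_segments(user))  # ['concerned_citizen_housing']
```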
## Amplifying Extremes, Not Representing Majorities
One of the most corrosive effects of algorithmic targeting is that it rewards the loudest edges. Most people involved in social movements, whether pro-Palestinian activism, conservative populism, or progressive causes, are not extremists. But influence actors don’t need majorities. They need amplification.
A small number of incendiary voices, boosted through engagement algorithms or coordinated networks, can distort the perception of an entire cause.
In the context of the Israel–Gaza war, for example, researchers have warned that online ecosystems can blur lines between legitimate protest and extremist propaganda, with fringe content sometimes elevated far beyond its actual support.
At the same time, far-right online spaces have been fertile ground for foreign narratives that frame Western democracies as corrupt, decadent, or collapsing, themes that align conveniently with authoritarian state messaging. Research shows these communities amplify extreme content and identity-driven narratives (see Far‑Right Online Communities and Social Media Influence).
The tactic is not to make everyone “pro” anything.
It’s to make everyone distrust everyone else.
## The Infrastructure of Influence: Platforms, Data, and Terms of Service
Influence campaigns are not only about content. They’re about infrastructure.
Platforms collect immense behavioural data:
- what you watch
- what you linger on
- what enrages you
- what makes you share
That data powers recommendation engines designed to maximize engagement. But engagement is not truth; engagement is often outrage.
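As a toy illustration of that point, consider a ranker whose only objective is engagement. Everything below (the posts, the reaction counts, the weights) is invented for this sketch, but the dynamic it produces is the one described above: content that provokes strong reactions outranks content that merely informs.

```python
# Minimal sketch of engagement-based ranking. Weights are hypothetical;
# the objective being optimized is engagement, not accuracy.

posts = [
    {"id": "calm_explainer", "likes": 120, "shares": 15, "angry_reactions": 4},
    {"id": "outrage_meme",   "likes": 60,  "shares": 90, "angry_reactions": 300},
]

def engagement_score(post: dict) -> float:
    # Shares and strong emotional reactions are good predictors of further
    # engagement, so an engagement-maximizing ranker weights them heavily.
    return post["likes"] * 1.0 + post["shares"] * 5.0 + post["angry_reactions"] * 3.0

ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # ['outrage_meme', 'calm_explainer']
```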
And Terms of Service agreements frequently grant platforms sweeping rights to process, analyze, and reuse user-generated content. Even when content is private or encrypted, metadata and behavioural signals remain powerful.
The result is an ecosystem where:
- emotions are measurable
- audiences are targetable
- outrage is profitable
- division is scalable
Foreign actors don’t need to “hack” society when society’s attention systems can be nudged from within.
## Why Canada Isn’t Immune
Canadians often view foreign interference as an American problem. But Canada’s social fabric is just as vulnerable to wedge politics, with its own fault lines: immigration, Indigenous reconciliation, housing, identity, war, and trust in institutions.
And Canada’s digital life runs through the same platforms, the same algorithms, and the same opaque advertising systems.
The question is no longer whether influence operations exist.
The question is whether democracies are willing to regulate the systems that make them so effective.
## A Simple Rule for the Scroll Era
The most dangerous influence content rarely announces itself. It doesn’t arrive wearing a foreign flag. It arrives as a post that feels tailor-made to make you furious.
If a piece of content makes you feel immediate rage, certainty, or contempt, you should pause.
That emotional spike is often the point.
Because in the age of targeting, division is not an accident of the internet.
It’s a business model. And sometimes, a weapon.
