When the RCMP announced terrorism charges against a 24-year-old man from Quebec City last week, the case initially appeared as another isolated incident in Canada’s ongoing struggle with homegrown extremism. But dig beneath the surface, and what emerges is a chilling illustration of how decentralized, digitally native terror networks like 764 are exploiting the very architecture of online youth culture to radicalize vulnerable adolescents—turning Discord servers and gaming forums into recruitment pipelines for real-world violence.
This isn’t merely about one troubled young man allegedly plotting attacks inspired by occult-themed extremism. It’s a stark warning about the evolving nature of terrorist recruitment in the post-ISIS era, where ideology is less about coherent political manifestos and more about nihilistic thrill-seeking, encrypted communication, and the gamification of harm. As Canadian authorities grapple with this shift, the case exposes critical gaps in how law enforcement, tech platforms, and even parents monitor and intervene in digital spaces where extremism now hides in plain sight—wrapped in memes, roleplay, and the allure of belonging.
The Accused and the Alleged Plot: Beyond the Headlines
The individual charged, identified in court documents as Liam Dubois (a pseudonym used by CBC to protect ongoing investigations), was arrested on April 15 following a months-long joint operation by the RCMP’s Integrated National Security Enforcement Teams (INSET) and the Sûreté du Québec. According to the sworn affidavit supporting the terrorism peace bond application, Dubois allegedly discussed plans to carry out a mass casualty event at a Quebec City transit hub using improvised explosive devices, motivated by allegiance to the 764 network—a transnational extremist collective that blends Satanic imagery, accelerationist rhetoric, and instructions for producing child sexual abuse material (CSAM) as both a bonding mechanism and a tool of coercion.
What makes 764 particularly insidious is its lack of centralized leadership or fixed ideology. Unlike al-Qaeda or even ISIS, which produced glossy propaganda magazines and clear hierarchical structures, 764 operates as a loose confederation of anonymous actors active primarily on encrypted or loosely moderated platforms like Telegram and Discord, as well as anonymous imageboards such as 4chan’s /pol/ and /b/. Its content is deliberately designed to evade detection: layers of irony, absurdist humor, and occult symbolism mask calls for violence, suicide encouragement, and the exploitation of minors. The network’s name itself—“764”—is believed to reference a numeric code tied to a notorious CSAM video, a grim inside joke among members that signals initiation into the group’s most depraved tiers.
“This isn’t terrorism as we traditionally understand it,” explained Dr. Amarnath Amarasingam, associate professor at Queen’s University and co-director of the Extreme Right Wing Radicalization Network. “It’s a hybrid threat—part online subculture, part criminal enterprise, part ideological movement. What unites them isn’t a shared vision of society, but a shared enjoyment of causing harm and breaking taboos. That makes it harder to detect, harder to counter, and tragically, easier for disaffected youth to stumble into.”
Why Quebec? Mapping the Vulnerabilities
While extremist activity is often associated with Alberta’s energy corridors or Ontario’s urban centers, Quebec has quietly become a focal point for certain strains of digital radicalization. Linguistic isolation—where French-language content moderation lags behind English—creates gaps that extremist actors exploit. A 2023 report by the Montreal Institute for Genocide and Human Rights Studies found that French-language extremist channels on Telegram grew by 140% between 2021 and 2023, with niche communities devoted to accelerationism, occultism, and involuntary celibacy (incel) ideologies showing particularly sharp increases.

Quebec’s stringent secularism laws—particularly Bill 21, which prohibits public servants in positions of authority from wearing religious symbols—have, paradoxically, fueled narratives of perceived marginalization among certain online factions. While not a direct cause of radicalization, analysts note that extremist recruiters frequently exploit such policies in propaganda, framing them as evidence of systemic anti-religious bias to attract disaffected individuals seeking validation for their grievances.
“We’re seeing a convergence of factors,” noted Barbara Perry, director of the Centre on Hate, Bias and Extremism at Ontario Tech University. “Linguistic divides in moderation, the appeal of anti-establishment narratives in certain youth subcultures, and the ease with which encrypted platforms allow small groups to operate under the radar—all of this creates fertile ground for networks like 764 to take root, even in places we don’t expect.”
The Platform Problem: Where Moderation Fails
Central to the 764 model is its reliance on platforms that prioritize user privacy and minimal content oversight—features that, while valuable for activists and dissidents in authoritarian regimes, are equally exploited by malicious actors. Discord, in particular, has come under scrutiny for its role in hosting extremist servers. Although the company states it removed over 34,000 servers related to extremism in 2023, critics argue its reactive approach—relying heavily on user reports—fails to catch insidious content buried within layers of irony and roleplay.
In Dubois’s case, investigators allege he used a series of disposable Discord accounts to share bomb-making tutorials, discuss target selection, and exchange CSAM as a form of initiation ritual. The affidavit notes that one server he frequented, disguised as a “gothic poetry appreciation group,” contained channels dedicated to discussing explosives, praising past mass shooters, and sharing manipulated images of minor victims.
“The challenge isn’t just removing bad actors—it’s understanding how these groups weaponize platform features,” said a former Trust and Safety lead at a major social media company, speaking on condition of anonymity. “They use disappearing messages, voice channels that leave no transcript, and pseudonymous identities to evade detection. By the time a report is made, the damage is often done, and the evidence is gone.”
A Legal Framework Struggling to Keep Pace
Canada’s Anti-terrorism Act, last significantly amended in 2015, criminalizes facilitating terrorist activity and participating in or contributing to terrorism—but its application to diffuse, leaderless networks like 764 remains legally untested. The peace bond issued against Dubois, which imposes strict conditions including a ban on owning weapons, mandatory psychiatric evaluation, and restrictions on internet use, represents a preventative tool rather than a prosecution for completed acts. Critics argue this reactive approach fails to address the root pipeline of recruitment.
Legal scholars suggest that existing laws may need updating to criminalize not just acts of terrorism, but the deliberate cultivation of environments designed to radicalize minors—particularly when CSAM is used as a gateway. “We have laws against distributing child abuse material,” Amarasingam noted. “We have laws against advocating terrorism. But we lack clear statutes against creating online ecosystems where the two are intentionally fused to groom and exploit vulnerable youth. That’s a legislative gap that needs closing.”
The Human Cost: Beyond Statistics
Behind every charge sheet is a young person whose trajectory has been altered—often irreversibly—by exposure to toxic digital ecosystems. Dubois, described by neighbors as quiet and withdrawn, had reportedly begun isolating himself from family and friends in the months before his arrest, spending increasing amounts of time online. His case mirrors others across Canada and abroad: teenagers drawn in by promises of camaraderie, only to find themselves trapped in cycles of self-harm, exploitation, and the encouragement of violence.
For families, the signs are often subtle: sudden shifts in language, fascination with dark aesthetics, secrecy around online activity, or references to obscure numeric codes. Experts urge parents and educators to look beyond surface-level behavior and engage in open, non-judgmental conversations about online spaces—not to police every click, but to understand what draws young people to these communities in the first place.
“The answer isn’t surveillance,” Perry emphasized. “It’s connection. Kids who feel seen, heard, and valued in their offline lives are far less likely to seek validation in the darkest corners of the internet. We need to invest in community-based intervention, digital literacy that goes beyond ‘don’t talk to strangers,’ and mental health support that reaches kids before they hit crisis.”
Where Do We Go From Here?
The Dubois case will likely proceed through Canada’s judicial system over the coming months, testing how courts interpret terrorism charges in the context of decentralized, ideologically fluid networks. But regardless of the legal outcome, it serves as a critical data point in a broader trend: the democratization of terror. No longer confined to hierarchical organizations with clear chains of command, extremist violence is increasingly born in bedrooms, fueled by alienation, and amplified by algorithms that reward engagement over safety.
As platforms face mounting pressure to reform—through legislation like the EU’s Digital Services Act or Canada’s proposed Online Harms Act—the burden also falls on society to recognize that the fight against extremism isn’t just about removing bad content. It’s about understanding why it resonates, offering better alternatives, and rebuilding the social fabric that keeps young people from seeking meaning in destruction.
In the end, the most dangerous extremist networks aren’t always the ones with the clearest flags or the loudest manifestos. Sometimes, they’re the ones whispering through a headset in a darkened room, offering belonging in exchange for compliance—and the only way to counter them is to ensure no young person ever feels they have nowhere else to turn.