It didn’t take long for American authorities to recognize an unusual factor powering the rise of the Islamic State several years ago. While its ideology and violence may have seemed medieval, its recruitment was very 21st century, leveraging social media to glorify its work and assail its enemies.
When The Washington Post interviewed a number of defectors from the group in 2015, we found a common thread.
“(A)ll but one said their decisions to leave for Syria could be traced to videos they saw online, or encounters on social media, that ignited a jihadist impulse,” The Post’s Greg Miller and Souad Mekhennet wrote. “The only outlier said that he had been prodded by a friend to come to Syria and was promptly imprisoned for refusing to fight.”
The group made explicit appeals to young Muslim men who felt disaffected or disempowered. After declaring the formation of the caliphate that it hoped would establish a permanent Islamic State, ISIS leader Abu Bakr al-Baghdadi offered an appeal to the Muslim world. “Lift your heads up high,” he said in a speech. “You now have a state and a caliphate that restores your honor, your might, your rights and your sovereignty.”
The International Center for Counter-Terrorism described how the Islamic State presented itself to potential recruits. “In its black-and-white ideology it depicts its fighters as heroic defenders of the Muslim world — the ummah (community of Muslims) — against Western colonization and the domination of Arab and other Muslim lands by pro-Western Muslim rulers who are portrayed as puppets of the West,” a June 2015 paper read.
The United States put a concerted — and not always successful — focus on disrupting the group’s online outreach, encouraging tech companies to block access to Islamic State material. Authorities recognized the internet as a central vector for spreading the militant group’s ideology and tactics, in part because that content spread internationally and fueled Islamic State-inspired terrorist attacks in countries around the world.
“The media people are more important than the soldiers,” a defector told The Post in 2015. “Their monthly income is higher. They have better cars. They have the power to encourage those inside to fight and the power to bring more recruits to the Islamic State.”
On Friday afternoon local time, a heavily armed gunman walked into a mosque in Christchurch, New Zealand, and began shooting Muslim worshipers. A second mosque subsequently came under attack, contributing to a total death toll of at least 49. A Twitter account showed images of weapons covered with various inscriptions that appear to match weapons shown in a live stream of one of the mosque attacks. The same account linked to a lengthy document offering an intentionally confusing set of beliefs.
At the center of that document is a philosophy that J.J. MacNab of George Washington University’s Program on Extremism summarized succinctly: “8chan racist.”
If you’re not familiar, 8chan is a message-board site that facilitates unmoderated, anonymous, temporary conversation threads. If you’re familiar with 4chan, it’s similar — but less constrained. There’s a distinct language deployed by a distinct but sprawling community, and as Bellingcat’s Robert Evans writes, the document the shooter left behind was meant in part to appeal to that community.
“The entire manifesto is dotted, liberally, with references to memes and Internet in-jokes that only the extremely online would get,” Evans said, later adding that “(i)n addition to (sowing) discord and creating confusion, the Christchurch shooter’s repeated references to memes and in-jokes were him playing to this very specific crowd” on 8chan. (The document writer appears to have announced the attack on the site beforehand.)
The most prominent through-line, though, was racism — in case the targeted murder of scores of Muslims wasn’t sufficient evidence of the shooter’s grotesque philosophy. The document, part of which is formatted as an FAQ, includes a mention of President Donald Trump. Did the shooter support the American president?
“As a symbol of renewed white identity and common purpose? Sure,” it reads. “As a policy maker and leader? Dear god no.”
The author draws an intentionally blurry line on that and most issues. Was he — assuming the author is a he — inspired by Trump? There’s no indication that he was. Did he see in Trump’s rhetoric echoes of his own views on the protection of whites? That appears to be his claim.
A focus on the purported dangers of immigration or the purported threat posed by Muslims or Jewish people is obviously a central aspect of this far-right ideology. The New Zealand shooter is only one example: The massacre of worshipers at a synagogue in Pittsburgh in October was rooted in the same line of thinking and allegedly committed by someone engaged in another largely unfiltered online community called Gab.
The document’s joking references to that 8chan community are part of the appeal of such communities. Researcher Whitney Phillips released a report on the spread of extremist ideologies last year. She described how the community relies on the descriptor of “trolling” to blur the line between jokes and hate.
“The fact that the term is used to describe everything from Nazi violence to G-rated silliness makes it a perfect rhetorical vessel for media manipulation; the polysemy of trolling provides violent bigots, antagonists, and manipulators a cloaking device and built-in defense of plausible deniability,” Phillips wrote. “Both are encapsulated by the oft-lobbed response, ‘I was just trolling,’ which for many absolves online actors of any personal responsibility for the things they choose to say and do to others online.”
Racist ideas, often packaged as memes or jokes, are part of the “edgy” and “fun” culture that sites like 8chan and 4chan celebrate, making them ideal Petri dishes for hate groups — “a safe space for self-selecting misogynists and racists whose bigotries were an identity first, source of lulz second,” as Phillips puts it. “Lulz” is an in-group term for ironic jokes, derived from “LOL.”
White supremacist groups explicitly target recruits using the memes and fake-joking language of internet culture. The HuffPost obtained what it called a style guide used by one of the more prominent white nationalist sites, the Daily Stormer, which outlined how it targeted recruits to its ideology.
“Most people are not comfortable with material that comes off as vitriolic, raging, non-ironic hatred,” the document read. “The unindoctrinated should not be able to tell if we are joking or not.”
The section is called “Lulz.”
Who is targeted? Largely disaffected young men — young white men — who are fed rhetoric that empowers them by casting others as a threat to them and their cultures.
“We are dealing with angry, disaffected men, mostly White, who find purpose & community with these extremist groups who give them a hero’s narrative through violent ideology of White supremacy,” New York Times contributor Wajahat Ali wrote on Twitter after the New Zealand attacks. “It’s like White ISIS.”
It is. It’s the same playbook, using disempowerment and disaffection to leverage racist animosity. The Islamic State encourages “its fighters (to be) heroic defenders of the Muslim world”; 8chan racists encourage each other to defend white culture. Social media is used deliberately as a means for engagement. People around the world are looped into an informal culture and discussion. The message is received, and people engage in violent acts across the globe.
And afterward, they’re celebrated in the communities where their hate originated. Bellingcat’s Evans documents some of the response to the New Zealand attack on 8chan. In “page after page of posts,” he said, contributors “celebrated this mass murder by one of their own.”
You now have a site and meme that restores your honor, your might, your rights and your sovereignty.