K. Ryan Proctor and Richard E. Niemeyer’s recent book Mechanistic Criminology (Routledge, 2019) develops mechanism-based schemas for several criminological theories: social learning, social control, and general strain. The book advances a neopositivist approach to theory-building, theory-testing, and causal inference. Whilst I drew on cultural criminology’s understanding of terrorist subcultures and social learning mechanisms in my PhD, this book provides new insights that will inform my post-PhD research program.
In my PhD’s literature review, I considered several different models of violent extremism and terrorism. Now, James Khalil, John Horgan, and Martine Zeuthen have proposed a new model:
Progress in understanding and responding to terrorism and violent extremism has continued to stall in part because we often fail to adequately conceptualize the problem. Perhaps most notably, much of our terminology (for instance, “radicalization”) and many variants of our existing models and analogies (including conveyor belts, staircases and pyramids) conflate sympathy for this violence with involvement in its creation. As its name suggests, the Attitudes-Behaviors Corrective (ABC) model seeks to overcome this issue by placing this key disconnect between attitudes and behaviors at its core. In this paper, we first present the key elements of our model, which include a graphic representation of this disconnect and a classification system of the drivers of violent extremism. The former enables us to track the trajectories of individuals in relation to both their attitudes and behaviors, while the latter helps ensure that we consider all potential explanations for these movements. We then adapt these elements to focus on exit from violence, applying the dual concepts of disengagement and deradicalization. Finally, we conclude with a section that aims to provide the research community and those tasked with preventing and countering violent extremism with practical benefits from the ABC model.
The article’s emphasis on separating attitudes from behaviours will be quite important for deradicalisation and disengagement initiatives.
The RAND Corporation think tank has released a new report on Russia’s gray zone tactics targeting international opinion:
Recent events in Crimea and the Donbass in eastern Ukraine have upended relations between Russia and the West, specifically the North Atlantic Treaty Organization (NATO) and the European Union (EU). Although Russia’s actions in Ukraine were, for the most part, acts of outright aggression, Russia has been aiming to destabilize both its “near abroad” — the former Soviet states except for the Baltics — and wider Europe through the use of ambiguous “gray zone” tactics. These tactics include everything from propaganda and disinformation to election interference and the incitement of violence.
To better understand where there are vulnerabilities to Russian gray zone tactics in Europe and how to effectively counter them, the RAND Corporation ran a series of war games. These games comprised a Russian (Red) team, which was tasked with expanding its influence and undermining NATO unity, competing against a European (Green) team and a U.S. (Blue) team, which were aiming to defend their allies from Red’s gray zone activities without provoking an outright war. In these games, the authors of this report observed patterns of behavior from the three teams that are broadly consistent with what has been observed in the real world. This report presents key insights from these games and from the research effort that informed them.
This is an interesting contemporary use of wargaming methodologies.
The New York Times Magazine recently profiled video editor Josh Owens, who offered a revisionist take on Infowars.com’s conspiracy theory celebrity Alex Jones. Slate‘s Aymann Ismail was more sceptical of Owens’ subjugation claims. When I worked for the former Disinformation website, its publisher sought (unsuccessfully) to negotiate a book deal with Jones, earlier in Jones’ career. This poses an intriguing counterfactual: what if the ‘meme warfare’ subculture surrounding Jones and conspiracy theories had been more liberal than alt-right in nature? How would that have affected the 2016 United States election outcome?
Australian MP Andrew Hastie – Chair of the Parliamentary Joint Committee on Intelligence and Security – has called for Australia to develop political warfare capabilities, in a new paper published by the Henry Jackson Society and the Konrad Adenauer Stiftung think tanks. Hastie contends that such capabilities are necessary for Australia to counter great and rising power competition (also a focus of ‘fourth generation’ scholarship on strategic culture) and its potential threats to Australian liberal democracy. In particular, Hastie focuses on the impact that sophisticated disinformation, misinformation, and hybrid warfare strategies can have. Hastie’s stance illustrates a military-informed, realist approach for Australia – and one that will prompt further academic, policymaker, and public debate.
The New Yorker‘s Joshua Yaffa has profiled Russian television producer Konstantin Ernst who, as Channel One’s general director, is an important media and propaganda figure for the Putin Administration:
“Baldly false stories, in the right doses, are not disastrous for Channel One; in fact, they are an integral part of the Putin system’s postmodern approach to propaganda. In the Soviet era, the state pushed a coherent, if occasionally clumsy, narrative to convince the public of the official version of events. But private media ownership and widespread Internet access have made this impossible. Today, state outlets tell viewers what they are already inclined to believe, rather than try to convince them of what they can plainly see is untrue. At the same time, they release a cacophony of theories with the aim of nudging viewers toward believing nothing at all, or of making them so overwhelmed that they simply throw up their hands. Trying to ascertain the truth becomes a matter of guessing who benefits from a given narrative.”
On my reading list is historian Robert Service’s recent book Kremlin Winter: Russia and the Second Coming of Vladimir Putin (New York: Pan Macmillan, 2019). It’s important to situate the current debates on disinformation, misinformation, hybrid war, and political war in the context of Russia’s renewed power projection, and a Russian strategic culture that uses asymmetric tactics. Service’s analysis of current history will help with my on-going research agenda.
The Guardian Australia’s Transparency Project has published research on how clandestine actors use the social network Facebook to run disinformation campaigns (podcast). Facebook has also monetised Islamophobic pages. The Australian Labor Party suggests that Facebook and other social media giants could be broken up as an antitrust measure to halt or prevent misinformation.
Richard Stengel was the Obama Administration’s Under Secretary for Public Diplomacy and Public Affairs. Stengel’s new book Information Wars: How We Lost The Global Battle Against Disinformation and What We Can Do About It (New York: Grove, 2019) is on my reading list for future research projects, in order to understand the United States Government’s response to disinformation and information warfare.
The RAND Corporation has recently released a directory of web tools to combat online disinformation:
The rise of the internet and the advent of social media have fundamentally changed the information ecosystem, giving the public direct access to more information than ever before. But it’s often nearly impossible to distinguish accurate information from low-quality or false content. This means that disinformation—false or intentionally misleading information that aims to achieve an economic or political goal—can become rampant, spreading further and faster online than it ever could in another format.
As part of its Countering Truth Decay initiative, and with support from the Hewlett Foundation, RAND is responding to this urgent problem. Our researchers identified and characterized the universe of online tools developed by nonprofits and civil society organizations to target online disinformation. These tools were created to help information consumers, researchers, and journalists navigate today’s challenging information environment.
I’ll make several observations on this useful collection of resources. Fighting disinformation has moved from the cultic milieu of the former Disinformation subculture search engine to policy think tanks. The election of United States President Donald Trump and the United Kingdom’s Brexit vote – both in 2016 – have created a secondary market in tools to counter online disinformation. Also evident from RAND’s list is the emphasis on gamification as a cognitive strategy to engage the public.