The Guardian Australia’s Transparency Project has published research on how clandestine actors use the social network Facebook to run disinformation campaigns (podcast). Facebook has also been able to monetise Islamophobic pages. The Australian Labor Party suggests that Facebook and other social media giants could be broken up as an antitrust measure to halt or prevent misinformation.
In his new book Mindf*ck: Inside Cambridge Analytica’s Plot to Break The World (London: Profile Trade, 2019), its former director of research, Christopher Wylie, poses an urgent dilemma for researchers:
Fresh out of university, I had taken a job at a London firm called SCL Group, which was supplying the UK Ministry of Defence and NATO armies with expertise in information operations. As western militaries were grappling with how to tackle radicalisation online, the firm wanted me to help build a team of data scientists to create new tools to identify and combat extremism online. It was fascinating, challenging and exciting all at once. We were about to break new ground for the cyber defences of Britain, America and their allies and confront bubbling insurgencies of radical extremism with data, algorithms and targeted narratives online. But through a chain of events that unfolded in 2014, a billionaire acquired our project in order to build his own radicalised insurgency in America. Cambridge Analytica, a company few had ever heard of, a company that weaponised research in psychological profiling, managed to turn the world upside down. (Loc 77-83, Kindle edition, emphasis added.)
At the outset of his tell-all memoir, Wylie identifies a significant problem for researchers in industry-led projects: a change in funder can mean a new direction for an established research program. This is especially problematic with ‘dual use’ research that is created for defensive purposes yet also has offensive applications. This is the Cambridge Analytica dilemma.
Many university research offices deal with this dilemma through a range of means. Ethics committees and institutional review boards oversee and ‘greenlight’ individual projects. Discipline and field-based norms act as powerful group-based socialisation mechanisms for new researchers. Export controls can prevent the proliferation of sensitive intellectual property and technologies. Other mechanisms also exist, such as compartmentalisation and access-restricted budgets on a ‘need to know’ basis.
These institutional safeguards may all collectively fail when your funder is Renaissance Technologies’ hedge fund maven Robert Mercer. Cambridge Analytica and Mercer used Wylie’s research on combatting violent extremism to instead weaponise political subcultures and mobilise voter blocs in the 2016 United States campaign that elected President Donald Trump. In doing so, they highlighted how private funders can change the direction, the scope, and the norms of use for industry-led research.
The Cambridge Analytica dilemma points to a further, urgent challenge. The United States-based John F. Kennedy Special Warfare Center originally conceptualised psychological operations (PSYOP) – now called military information support operations (MISO) – for counterinsurgency and unconventional warfare. What happens when this PSYOP or MISO expertise migrates online and is mobilised by oligarchical wealth to covertly influence, shape or experiment with mass behaviour? How does this change the relationship between the elite few and the populace many? What countermeasures exist? What new safeguards may be needed?
We’ll find out more in the United Kingdom in December, and in the United States in 2020. I’ll be watching developments closely.