
Cambridge Analytica – The True Implications for the Future of Democracy

Published 30 Apr 2018
Michael Nguyen

As Cambridge Analytica is placed under the spotlight, concern and attention have focused largely on issues of privacy. However, the true core of the issue is not that a third party held the private data of millions of individuals, but that it created a psychological warfare weapon potentially capable of threatening democracy.

As Facebook grapples with the collateral damage of recent scandals, public attention has also turned towards Cambridge Analytica, a London-based data analytics firm whose participation in the US presidential election and Brexit vote of 2016 has finally galvanised concern about the disruptive effects of technology on democracy. Whistle-blower Chris Wylie described the firm as a new "psychological warfare weapon", revealing how, by collating the data of Facebook users, Cambridge Analytica was able to target voters with specific advertising designed to influence their political leanings through a strategy called psychographic marketing.

Targeting voters through the use of data is nothing new. Indeed, Obama earned a remarkable reputation for his revolutionary use of social media during his electoral campaigns. By using a data analytics machine called "Ada", the same system subsequently used by Hillary Clinton in 2016, Obama was able to identify specific groups of voters, whom volunteers then targeted through traditional "knock and call" tactics.

The distinguishing factor between this strategy and Cambridge Analytica's lies in the technological framework. The Clinton and Obama campaigns focused on relatively traditional electoral variables such as voter demographics, but harnessed technology to collate this information on a much greater scale. By contrast, the psychographic targeting used by Cambridge Analytica involves mapping out an individual's personality by processing their online behaviour through machine-learning algorithms. Each psychographic profile measures the "Big Five" personality traits: agreeableness, neuroticism, openness, extroversion and conscientiousness. With these measurements, the company is then able to accurately predict the decisions and behaviours of each profile. The process itself is described as remarkably simple; it is acquiring the vast amounts of data necessary to build these psychological profiles that is the critical element.

Accordingly, this is where Facebook's importance becomes apparent. It is easy to forget that Facebook's product is not the content on its website, none of which it produces. Its product is its users: their data and their information, all of which can be sold to third parties for commercial marketing enterprises or, as in the case of Cambridge Analytica, political warfare. Data has aptly been called the new oil. That is why Cambridge Analytica paid Alexander Kogan $800,000 for the data that his seemingly innocuous app had collected.

With these profiles, Cambridge Analytica then simply used the same platform that Facebook offers commercial enterprises: targeted advertising for each individual user. However, instead of advertisements for books, shoes or other material items, Cambridge Analytica's ads were designed to exploit these "Big Five" personality traits and ultimately influence and change the political opinions of each individual user. Playing on the innermost fears, concerns and anxieties of an individual, Cambridge Analytica and its clients were able to capitalise accordingly, targeting these pressure points with political messages that would, in theory, incite changes in political orientation. Unsurprisingly, this apparent ability to manipulate an individual's thought process has given rise to the idea that the Orwellian model is finally possible.

Yet ironically, the data appears to suggest the contrary. Though there is evidence that Cambridge Analytica was able to accurately predict the behaviour of individuals, manipulating and influencing that behaviour towards a particular orientation is another challenge in and of itself. Whilst such techniques have been seen to work in commercial strategies, their effectiveness in changing political orientation is dubious at best. Recent studies have found that the effect of attempts to persuade voters of a contrasting political opinion through any form of advertising is "near zero", and that political orientation is instead almost entirely influenced by the interpersonal environment of friends and family.

However, just because a "psychological warfare weapon" is not yet viable does not assure that it will remain so. Data is continuously improving in depth and utility. As such, it would be foolish to dismiss the possibility that it may soon be able to map out and manipulate the human psyche.

Nevertheless, whilst this particular threat has yet to manifest, society's habit of dismissing these technological issues represents a danger in and of itself. Despite concerns over privacy and disinformation, and even Cambridge Analytica's activities, being repeatedly exposed over the last few years, democracies have consistently been complacent in their response. The responsibility and culpability for these disruptive threats lies not with technology companies like Facebook or Cambridge Analytica, but with states and society itself. Their unwillingness to keep pace with the transformative changes society is undergoing dictates that, should this continue, the psychological warfare tool developed by Cambridge Analytica will undoubtedly prove only the tip of a technological iceberg ready to sink democracy.

Michael Nguyen is an intern with the Australian Institute of International Affairs NSW. He is in his fifth year of a double degree in Arts (Political Science) and Law at the University of New South Wales. He has previously worked with the Brien Holden Vision Institute, as a lobbyist for the Labor Environment Action Network, and for the Red Cross, both as President of its UNSW Society and as a logistics coordinator at its Sydney office. He currently works as a research assistant for UNSW and serves as an Army Officer reservist. His other interests include cyber-warfare, defence policy and international development.