
Cambridge Analytica used our secrets for profit – the same data could be used for public good


By William Watkin, Professor of Contemporary Philosophy and Literature, Brunel University London

Ever since it was revealed that Cambridge Analytica had taken data from 87m users via a Facebook app <a href="https://www.theguardian.com/uk-news/2018/apr/13/revealed-aleksandr-kogan-collected-facebook-users-direct-messages">that exploited</a> the social media site’s privacy settings, it has been suggested that anything from Donald Trump’s election in the US to the European Union referendum result in the UK could have been the result of the persuasive power of targeted advertisements based on voter preferences.

But Aleksandr Kogan, the University of Cambridge researcher whose data-collecting app formed the basis for Cambridge Analytica’s subsequent work for various political groups, appeared to pour cold water on this idea when <a href="https://www.theguardian.com/technology/2018/jun/19/aleksandr-kogan-facebook-cambridge-analytica-senate-testimony">speaking to a US Senate committee</a>. “The data is entirely ineffective,” he said. “If the goal of Cambridge Analytica was to show personalised advertisements on Facebook, then what they did was stupid.”

Even <a href="https://www.channel4.com/news/exposed-undercover-secrets-of-donald-trump-data-firm-cambridge-analytica">if the boasts</a> by former Cambridge Analytica CEO Alexander Nix and the statements of <a href="https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump">whistleblower Christopher Wylie</a> about the company’s influence are overblown, as Kogan claims, the firm nevertheless hit on something with its approach of harvesting data in order to influence voter behaviour. Before that approach becomes commonplace, we should survey the whole moral panic around the scandal and see what lessons can be learnt.

Use and abuse of data

The first issue is our misunderstanding of consent. Kogan’s data-scrape may have been unethical, but he didn’t steal the data from those who used the app – they gave it willingly. When you use a social media platform, you are, by definition, publishing your private life. What’s more, you effectively sell your private life on an open market by giving your consent for it to be monetised by that platform.

Following <a href="https://www.cnbc.com/2018/04/06/facebook-sheryl-sandberg-users-would-have-to-pay-to-opt-out-targeted-ads.html">admissions</a> by Facebook chief operating officer Sheryl Sandberg, we now know that “online privacy” settings exist only as a means to allow Facebook users to believe they have a consumer’s right to privacy, when in fact they are not the consumer, but the product itself. If privatisation is a process of transferring ownership from the public to the private realm, this means privacy itself has been privatised. You publish your data, making it public, so that private companies can capitalise on what this data says about you by selling you things.

This leads to a paradoxical situation I call neoprivacy, following neoliberalism’s similar disregard for and exploitation of the private individual. In a neoprivate world, privacy exists to be exploited financially. The neoprivate individual values their personal life so much that they publish it, yet is so neglectful of their privacy that, well, they publish it.

Cambridge Analytica’s stroke of genius was to combine two different kinds of dataset – let’s call them deep and broad. The deep psychometric tests of a small sample (from Kogan’s app) were combined with the broad online behaviour of a massive sample. With this, they claimed they could <a href="https://theconversation.com/psychographics-the-behavioural-analysis-that-helped-cambridge-analytica-know-voters-minds-93675">predict people’s behaviour simply from their actions on Facebook</a>.

The firm sold this to political campaigns and lobbyists as their “<a href="https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump">secret weapon</a>”. This model shows a real understanding of social media by grounding it in people’s actions on Facebook – what they click on, read, and like – rather than their expressed statements. It’s <a href="http://www.mediamasters.fm/william-watkin">what you do that matters</a>, not what you say.
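Because the rest of the argument turns on this deep-plus-broad mechanic, a minimal sketch may help make it concrete. This is not Cambridge Analytica’s actual method or code, which has never been published; it simply illustrates the general idea under stated assumptions: a small “deep” sample with both Facebook likes and a psychometric trait score, a large “broad” sample with likes only, and entirely hypothetical names and data throughout.

```python
# Illustrative sketch only: a generic "deep + broad" prediction pipeline.
# All data, names and parameters here are hypothetical.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Deep sample: ~1,000 people who took a psychometric test AND shared
# their likes (rows = people, columns = pages, 1 = liked that page).
n_deep, n_pages = 1_000, 500
likes_deep = rng.integers(0, 2, size=(n_deep, n_pages))
openness = likes_deep @ rng.normal(0, 0.1, n_pages) + rng.normal(0, 1, n_deep)

# Broad sample: far more people, with likes only and no test scores.
likes_broad = rng.integers(0, 2, size=(100_000, n_pages))

# Train on the deep sample: learn to infer the trait from likes alone.
model = Ridge(alpha=1.0).fit(likes_deep, openness)

# Sanity-check predictive power before trusting the extrapolation.
print("CV R^2:", cross_val_score(model, likes_deep, openness, cv=5).mean())

# Extrapolate: score the broad sample, then segment it for targeting.
predicted = model.predict(likes_broad)
high_openness = predicted > np.quantile(predicted, 0.9)
print(f"{high_openness.sum()} users flagged for 'high openness' messaging")
```

The essential move is the extrapolation step: a model trained on a small, expensively measured group is applied to everyone whose behavioural trace looks similar – which is why a few hundred thousand test-takers could be leveraged into claims about tens of millions of voters.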

Putting information up for only commercial interests is a wasted opportunity. pixinoo/Shutterstock

Democratic data dividend

I think this data-driven approach offers a democratic opportunity. Typically, deep expert research generates the evidence that informs policies. But such expert-led governance appears increasingly disassociated from ordinary lives, with voters preferring crowd-pleasing factoids when it comes to major decisions. Indeed, suspicion of experts may even be a contributing factor in the <a href="https://www.bigissue.com/opinion/william-watkin-truth-post-truth/">rise of what could be called demagogcracy and fake news</a>.

In contrast, broad data is generated by people based on what they choose to do, not what an expert has asked them, or prompted them, to say. Neoprivate individuals feel a sense of ownership and investment when they share something on Facebook or Instagram. If anything online needs to be harvested, it is this sense of communal, social engagement. Yet our primal need for social engagement is both stymied by expert policy wonks with no grip on the grassroots, and monetised by the big platforms with no interest in civic society.

Evidence-based governance, instigated under former prime minister Tony Blair, was supposed to be a panacea for the uncertainties of political decision-making. It has failed. In contrast, the activity-based influence of broad data is a political model that has been shown, in the hands of Trump, to be frighteningly effective. If we are to fix democracies, then future leaders should engage with both – albeit more transparently than Cambridge Analytica did.

One final lesson: if we live in a neoprivate world, why couldn’t we monetise our own lives just as the big tech companies have? If Facebook knows enough about me to advise me on what sort of shelf brackets I need, why couldn’t this same level of insight be applied to the more important, more technical and complex political decisions that need to be made by citizens, for their benefit?

If Cambridge Analytica can develop algorithms that are good predictors of our behaviour, shouldn’t that information be used to influence policy? Why shouldn’t politicians harvest it for the greater good rather than personal gain? Many <a href="http://criticallegalthinking.com/2017/05/10/michel-foucault-biopolitics-biopower/">biopolitical</a> theorists define our current age as that of <a href="https://theconversation.com/inside-view-prison-crisis-will-continue-until-we-hear-inmates-stories-83735">power through regulatory surveillance</a>; it is time that neoliberal democracies transitioned to power through participatory enhancement.

Two worlds remain an absolute mystery: Facebook’s algorithms and why we vote the way we do. Place both those secrets in the public realm rather than in the hands of the highest bidder, and maybe democracy can develop its own app and fix itself. Now that’s what I call neoliberalism.

<a href="https://theconversation.com/profiles/william-david-watkin-260077">William David Watkin</a>, Professor of Contemporary Philosophy and Literature, <a href="http://theconversation.com/institutions/brunel-university-london-1685">Brunel University London</a>

This article was originally published on <a href="http://theconversation.com">The Conversation</a>. Read the <a href="https://theconversation.com/cambridge-analytica-used-our-secrets-for-profit-the-same-data-could-be-used-for-public-good-98745">original article</a>.

Reported by:

Press Office, Media Relations
+44 (0)1895 268965
press-office@brunel.ac.uk