Learning to swim in murky seas: Exploitation of the electorate by social media
Learn psychological self-defence and start resisting the dark arts of political communication.
Bias does not just affect decision-makers — it is at the root of many electoral behaviours. More to the point, the tendency of human beings to be biased can be exploited in political campaigns. So whilst my previous article was focused on criticising politicians, this article focuses on what the general public — you and I — must do to stay informed and compensate for our own forms of bias.
Looking at the “Brexit” referendum on leaving the EU, social media played a significant role in shaping the trend towards Leave. The idea that all media and communications can shape what people think goes without saying. However, one of the most interesting aspects of social media is the new way it shapes how information is combined and presented to the electorate. This means that understanding how social media works not only helps explain the surprising electoral dynamics we have witnessed in the UK and US, but also offers wider lessons for those interested in human psychology.
Increasingly we are beginning to understand that social media creates particular types of informational bubbles. This has important implications for understanding elections. Before explaining the term informational bubble further, I will briefly explain the basics of social media advertising.
At its simplest level, the unique new feature of social media is the way that it creates news feeds that are automatically tailored to each user’s preferences. ‘Preference’ in this context can mean a number of things. Some social media platforms require the user to manually enter their interests, but preference can also be determined in other ways.
Imagine that you go on Facebook and regularly visit football-related pages, join football interest groups, and message your friends or update your status with comments about football. Facebook will record this interest in football and start presenting you with more adverts that have a football theme.
Similarly, if you spend a lot of time on football-related websites, then information about your browsing that is stored on your computer (in small files called ‘cookies’) can be used to target you with more information about football-related products (such as season tickets, football-related gambling and sportswear).
Likewise, if you look at consumer products online, the advertising on your newsfeed will shortly present you with products similar to those you may have already bought or reviewed online. Furthermore, “liking” messages that others have posted can be used to infer your interests and preferences, and the newer “react” button gives Facebook an insight into the nature of your reaction (such as love, anger, surprise or laughter). Everything you do online leaves a trail of information that can be used to target you with advertising.
So what is an “automated informational bubble”?
A “bubble” in this context is all the information on your newsfeed that is automatically selected for you by the social media technology. This will influence what news articles you will see, the type of issues that you are exposed to, and will largely filter out the information you tend not to read. As a result you are surrounded by a bubble of information that corresponds to the limited range presented to you by social media algorithms.
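The filtering described above can be pictured with a short sketch. This is a toy illustration only, not any real platform’s algorithm: it assumes a simple model in which each interaction counts as one “signal” for a topic, and candidate posts are ranked by how well their topic matches the user’s accumulated signals. The function names (`infer_interests`, `rank_feed`) and the data are invented for the example.

```python
# Toy model of an "informational bubble": interactions build an interest
# profile, and the feed is ranked so familiar topics float to the top.
from collections import Counter

def infer_interests(interactions):
    """Count topic signals from likes, shares and page visits."""
    return Counter(topic for _action, topic in interactions)

def rank_feed(posts, interests):
    """Order candidate posts by interest score, highest first.
    Topics the user never engages with score zero and sink."""
    return sorted(posts, key=lambda post: interests[post["topic"]], reverse=True)

# A user who mostly engages with football content...
interactions = [("like", "football"), ("share", "football"),
                ("visit", "football"), ("like", "politics")]
interests = infer_interests(interactions)

posts = [{"title": "Budget analysis", "topic": "economics"},
         {"title": "Match report", "topic": "football"},
         {"title": "Election briefing", "topic": "politics"}]

feed = rank_feed(posts, interests)
# ...is shown the football story first and the economics story last.
```

The key point the sketch makes is that nothing here is malicious: the bubble emerges mechanically, simply because the ranking rewards whatever you already engage with.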
Of course you could always make a deliberate effort to read more widely in order to get a more diverse view of news stories. You could try to change your automated algorithm by joining groups and commenting on articles that you disagree with, or you could decide to manually look for corroborating information from multiple sources, such as by checking particularly serious claims against multiple newspapers.
The way that social media works has transformed how political campaigning systematically targets individuals, as it aims to influence their opinions about political issues. In the Brexit referendum the company at the heart of the leave campaign’s social media strategy was AggregateIQ, which is very strongly linked to another company at the centre of the US election, Cambridge Analytica. Closer examination of the links between the two companies and their owners reveals a murky network of media companies owned by US billionaire Robert Mercer. In turn, these groups have links to Donald Trump, and to Steve Bannon of the right-wing news website Breitbart, which raises the prospect that elections in the UK may be influenced by non-UK groups with specific right-wing political agendas.
To gain an insight into the power these companies have, consider the following quote by the founder of AggregateIQ, Zack Massingham. As part of a longer article, he explains the basic approach of their social media campaigning as follows: “You always want to try and reduce everything down to the simplest form of the argument and then repeat those simple lines again and again and again and that becomes your brand.”
The effect described in this quote is a tried-and-tested influencing method that psychologists call the mere-exposure effect. Simply exposing people to something causes them subconsciously to develop a preference for it. There is no reason this would not work for messages such as:
“We want our country back”
“No border, No control”
“Who really runs Westminster?”
Targeting social media users with the same messages over and over creates a preference for the abstract ideas embodied in such messages, regardless of whether the messages could be shown to be true or false. Obviously, people are sufficiently sophisticated that their opinions are based upon more than the mere-exposure effect. However, influences from psychological effects and other features of social media are likely to be cumulative, building up gradually in small steps.
Types of trust
The way that people trust information from social media is also fundamentally different from how they trust a newspaper. Social media is based upon people creating a network of friends and family with whom they share information. People are generally far more likely to trust friends and family than third parties, so once information circulates in a social network it has the power to influence not only the targeted individual but everyone in their friend list, especially if the information is liked or shared. When we perceive a source as credible (e.g. a friend) we are more likely to trust the information they share (even if it originates from a third-party source, or if it could be wrong).
It is easy to see how false information can spread rapidly, because people may more easily accept information perceived to originate from their friends than from a source whose intent is to deceive and manipulate beliefs. The simple messages of a media campaign, coupled with the amplifying effects of trust in friends and family, create a powerful medium that can disseminate political messages rapidly. However, one of the interesting things that has emerged about social media and trust is that many people will simply like or share a story based on its title and photo, without even reading the story itself. So stories can go viral: being read, liked and shared millions of times, spreading rapidly via social media networks, without people even needing to read the article, let alone think about and evaluate it.
Social media played a big role in swaying the outcome of Brexit. It provided a platform for banal arguments playing on fears of immigration, offering simple messages rather than detailed analysis and economic argument. As Andrew Mullen from Northumbria University discusses, the Leave and Remain campaigns employed similar strategies, but differed in their ability to connect with voters via social media.
In his words, “The critical difference, […], was that the leave campaign was much more successful at targeting than the Remain campaign. Although the result was close, that is the main reason why the leave campaign was victorious.” Looking to the future, with social media being able to sway elections and manipulate vast segments of society, what can be done to prevent falling prey to such advertising? There are several things that you can do.
- Read more widely and read news from a range of newspapers that provide well-researched and balanced political commentary.
- Think before you share and like posts — verify the information in a post. If in doubt do not share or like.
- Research the provider and source of the news. There are a number of known “fake news” sites; if these appear in your feed, you should be very sceptical of the information in the article.
- Use a browser extension to identify which media campaigns are targeting you. For the national election you can install the Who Targets Me? extension in your Chrome browser to get an indication of which media companies are targeting your profile.
We live in an exceptional age in history, with powers of communication never seen before. It is this that throws us socially and politically into the deep end of a pool of murky waters. Considering what may be at stake, we had better learn how to swim. Hopefully the tips and discussion above will help you stand back from your own informational bubbles and equip you to be better and more critically informed during elections.
More in this series
No Pause for Thought? Brexit, Bias and Political Manipulation
The psychology of Brexit and contemporary politics, in a series of articles by Volker Patent. In this first article, we…
Privileged and Overconfident but Full Steam Ahead!
How negotiations can be affected by privilege and overconfidence, in a third article on the psychology of Brexit and…
The devil is in the (complex) detail
A look at the implications of negotiating Brexit, in the fourth article from Volker Patent's series on the psychology…
Brexit and the Art of Negotiation
How to reduce your biases during complex negotiations, in the fifth article from Volker Patent's series on the…
Volker Patent is a Lecturer in Psychology in the Social Sciences faculty at The Open University. This article was originally published in May 2017 on OpenLearn. You should subscribe to our newsletter for more free courses, articles, games and videos.