Algorithms: the villains and heroes of the ‘post-truth’ era

In this video, Dr David Beer looks at how algorithms affect economics, policy, and society.
So algorithms sound like a technical thing, but they have powerful implications for how we live. Algorithms are embedded in the infrastructures that we live in, and they’re the decision-making bits of code. So as data is gathered about us, it’s the algorithms that then decide what is done with those data, or how those data are used, or the way those data fold back into our lives in lots of different ways. This runs from financial trading, where algorithmic systems decide what trades to make: what to buy and what to sell; through to the kind of systems that we use in our everyday lives: what music should I listen to on Spotify? What TV show or film should I watch? These are the sorts of things that are decided by algorithmic systems. They make recommendations to us. They recommend the things we should consume. They make decisions about what’s visible. So the two most powerful algorithms in our lives are probably, first, the PageRank algorithm on Google, which decides the order of the results for a search term you might search for – so that algorithm decides the things that you encounter, the things you’re exposed to, what you know of the world.

So the question to ask yourself is: “What do I know of the world that wasn’t presented to me through the PageRank algorithm on Google or a similar search engine?” And similarly, you might think about what sort of things you read or watch, what games you play, who you know on social media, and how many of those things are a product of the decision-making of algorithms that make suggestions and recommendations to you all the time in these various different ways. The second most powerful algorithm is probably the EdgeRank algorithm on Facebook, and that decides what news you see – what things you encounter in that social media environment.

So if you put those things together, you start to see that our lives and what we know of the world are a product of algorithmic systems. Similarly, if you go to something like borders – border decision-making processes – people can be red-flagged as dangerous individuals based upon the decisions algorithms are making with their data. So these algorithmic systems are embedded deeply in how we live.

Now that’s important, because the way that we used to understand the social world was that it was the product of decisions made by human beings. But now we’ve got a social world where the decisions are not just made by humans but by machines, and those then have profound outcomes for how we’re treated, what we know of the world, and what happens to us in the world we’re a part of.

So algorithms and algorithmic decision-making are probably the biggest question we face in understanding the social world today, ranging from things like the insurance premiums we get and the mortgages we get, through to our social media networks and the culture we consume, and on to the economic and financial trading systems around us. All of those things are now a product not just of humans making decisions, but also of what I’ve called the social power of algorithms.

Algorithms and algorithmic systems are all around us. In this video, Dr David Beer talks us through why they’re important, and why we need to be aware of their influence on our lives.

If we hear the word “algorithms”, there’s a chance we might be put off. It sounds like quite a technical thing: the preserve of computer scientists and mathematicians. Algorithms are decision-making rules written into computer code, and because we live in an environment that’s dominated by code and software, algorithms are a really important presence in our lives. They make decisions about us, and for us, all the time.
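To make that concrete, here is a deliberately crude sketch, written in Python, of what a decision-making rule in code can look like. The function and the categories are invented for illustration; no real service decides things with rules this simple.

    def recommend(watch_history):
        # A toy decision rule: look at what someone has watched and decide
        # what to show them next. Real recommendation systems weigh far more
        # data, but the principle is the same: data in, decision out.
        if "documentary" in watch_history:
            return "another documentary"
        if "comedy" in watch_history:
            return "a comedy special"
        return "whatever is trending"

    print(recommend(["comedy", "drama"]))   # -> "a comedy special"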

These decisions can make life easier for us, from offering recommendations on media to showing you relevant search results. Nobody has the time to look through the 6,290,000,000 results that came up on Google when I just searched for ‘cat’. An algorithm is used to decide which pages show up in the first 10 results (the ones on the first page) and then which show up on the 39 other results pages Google offers me on this occasion. Since I can only ever see 400 results, even if I click through every one of those pages, how this decision-making process works is very important.
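The ordering itself comes from ranking algorithms such as PageRank, the algorithm Dr Beer mentions in the video. The short Python sketch below shows only the core idea, that a page matters if other well-connected pages link to it. The toy pages and numbers are made up, and Google’s real ranking combines hundreds of further signals, so treat this as an illustration rather than a description of the actual system.

    # A minimal PageRank-style calculation on a four-page toy web.
    links = {
        "A": ["B", "C"],   # page A links to B and C
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }

    damping = 0.85                                   # chance of following a link
    ranks = {page: 1 / len(links) for page in links}

    for _ in range(50):                              # iterate until scores settle
        new_ranks = {}
        for page in links:
            incoming = sum(ranks[src] / len(outs)
                           for src, outs in links.items() if page in outs)
            new_ranks[page] = (1 - damping) / len(links) + damping * incoming
        ranks = new_ranks

    # The pages that end up on "page one" are simply the highest-scoring ones.
    print(sorted(ranks.items(), key=lambda item: item[1], reverse=True))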

As well as looking for the word in question (in the case of search) or recent posts (in the case of social media), algorithms use a range of other factors to decide which content to display. Your search history, for example, is used by Google to predict which content you want to see this time.
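Here is a purely hypothetical sketch of that idea in Python. The scores, topics, and boost value are invented, and this is not how Google actually weighs its signals; the point is simply that two people typing the same query can end up with different orderings.

    # Hypothetical personalisation: boost results that match the user's history.
    results = [
        {"title": "Egypt protests: latest news", "topic": "news", "score": 0.88},
        {"title": "Egypt holiday guide", "topic": "travel", "score": 0.86},
        {"title": "Ancient Egypt for kids", "topic": "history", "score": 0.70},
    ]

    history_topics = {"travel", "history"}           # inferred from past searches

    def personalised(result):
        boost = 0.2 if result["topic"] in history_topics else 0.0
        return result["score"] + boost

    for r in sorted(results, key=personalised, reverse=True):
        print(r["title"])
    # With this history, the news story drops to the bottom of the ranking.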

In his 2011 TED Talk Beware online “filter bubbles”, Eli Pariser describes how different friends got very different results when searching for the word “Egypt”, with one of them not seeing any results relating to the protests happening at the time. Pariser also discusses Facebook, and the way its algorithms use the content you interact with to decide what to show you. This creates the ‘filter bubble’ effect, where people only see the kind of opinions they agree with because that’s the content they click on, creating in Pariser’s words “a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see”.
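That feedback loop can be caricatured in a few lines of Python. The sketch below is an invented toy rather than Facebook’s actual feed logic: stories are shown in proportion to learned weights, and every click pushes the weights further towards what the user already engages with.

    import random

    random.seed(42)
    weights = {"view_A": 1.0, "view_B": 1.0}         # start with no preference

    def next_story():
        # Show a story with probability proportional to its current weight.
        topics = list(weights)
        return random.choices(topics, weights=[weights[t] for t in topics])[0]

    shown = []
    for _ in range(200):
        story = next_story()
        shown.append(story)
        if story == "view_A":                        # the user only clicks view_A,
            weights["view_A"] += 0.5                 # and each click feeds back in

    share = shown.count("view_A") / len(shown)
    print(f"view_A share of the feed: {share:.0%}")  # drifts well above 50%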

These algorithms are not neutral or objective. They bring with them human biases, are often tied up with marketing and profit, and their decisions can create feedback loops where assumptions and stereotypes become part of the system and inequality is perpetuated. Cathy O’Neil terms some algorithms “weapons of math destruction” because of the damage they can cause to society and democracy when they reinforce inequality. In her book, O’Neil looks at how Facebook’s algorithms can influence whether people vote and which political messages they see.

Algorithms have a huge impact not only on the internet content we see, but on society, democracy, and culture. They make decisions for us and about us, with positive or negative consequences.

When written with an awareness of civic consequences and used with transparency as to how these decisions are reached, algorithms can allow us to find and retrieve information, opening up the web for everyone. However, as we’ve seen, algorithms can also lead us away from discovery, creating filter bubbles that give certain views of the world, and can allow bias and discrimination to continue unquestioned.

Further reading:

  • Hannah Fry (2018). Hello World: How to Be Human in the Age of the Machine. London: Transworld.
  • Cathy O’Neil (2017). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Penguin.
This article is from the free online course Digital Wellbeing, created by FutureLearn.
