
Hearing from the data collectors

In this video, we hear from representatives from three companies involved in data collection and surveying of the public.
0:03
How important is it that we convey the certainty or uncertainty in a piece of data? It’s absolutely critical, particularly if that data is under public scrutiny. So for voting intention figures, the UK polling companies have just agreed a standard way of expressing the level of uncertainty in what we’re publishing. And that level of uncertainty is based on our record at recent general election opinion polls. But I would say it’s absolutely critical. If we want people to take notice of the findings, we have to remind them what the margins of error are, and what caveats they need to think about.
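The "margins of error" mentioned here come from standard sampling theory. As a rough illustration only (it assumes a simple random sample, which real opinion polls only approximate, so published margins are usually wider), the half-width of a 95% confidence interval for a sample proportion can be sketched as:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate half-width of a 95% confidence interval for a
    sample proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical voting-intention poll of 1,000 respondents, worst case p = 0.5:
moe = margin_of_error(p=0.5, n=1000)
print(f"±{moe * 100:.1f} percentage points")  # roughly ±3 points
```

This is why a party polling at 40% in a 1,000-person survey could plausibly be anywhere from about 37% to 43%, even before the non-sampling caveats the speakers raise.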
0:45
And that could be as simple as: these results were taken last week, before X happened. We need to get people who put out information to also be used to trying to explain it well and to prove why their evidence is trustworthy. So we think it’s really important that trust is earned where it should be earned. But it’s not just placing the emphasis on the public to decide. I think if you’re looking at what to do with apparently contradictory data, one of the questions is, are we looking at attitudinal data, which is often all over the place because people are not coherent in their views, or behavioural data?
1:28
And when it’s behavioural data, you can often check it against real world data, in any case, and that’s what you would often go to. So if we had a very large number of Sun readers, we can go and check against the latest information on newspaper readership. And that would help us understand whether we’ve got it wrong, or whether it’s actually within the margins of error. When we’re working with journalists, for example, we do insist, actually, that we sign off the copy that they write before it goes live online or in the newspaper.
2:02
So that is, I think, an important check. It’s our responsibility, written into the market research code of conduct, actually, that we work with our clients to make sure they don’t misrepresent data. Where the problems come in can be in social media, because there are thousands and thousands of people who like interpreting numbers in their own special way. And once it’s released, frankly, we’ve lost control and it becomes a more difficult exercise. The biggest change the internet brings, I think, is that now anyone can put out information, and that means information can spread more quickly.
2:39
And it also makes it harder for us to hold people accountable for the information they are using, because you can have a meme that’s spreading online and you don’t know who’s put it out. And it’s harder to ask someone to correct it once it has started spreading online. So that’s one of the biggest challenges that we find. My advice for people looking at data they come across: my first thing would be always try and trace it back to where it’s come from originally.
3:03
So we’ve seen some claims that, if you actually trace them back, are based on a survey that was done 20 years ago, and the place where you first come across a claim is not necessarily evidence. My first thing would be to trace it back as far as you can, and then look at what it is actually measuring: who’s been asked, what has been asked, what are the surveys, is it just a fluke survey, or are there multiple surveys that back up what that particular survey is showing. I think the big thing would be to be sceptical, though perhaps not over-sceptical.
3:47
But if you think it’s done by a reputable organisation, that might make you feel, OK, I can look at this. Then your questions are, when was it done, what questions did they ask, and to what extent is that giving a complete picture or not of this particular topic. And what the public do have at hand for most of the published information, if they’re interested, they can go online and spend a lot of time looking at all of the questions and all of the numbers. So there’s plenty of opportunity for people to be good data citizens and get even more involved in it, and go and have a look themselves.
4:21
Ipsos MORI’s data suggests that there aren’t that many things that we get right. We tend to overestimate a lot of things. So I don’t know of many examples of where we underestimate phenomena, and their surveys seem to suggest that there are huge gaps in our perceptions of the prevalence of many things in the UK, which obviously does affect our opinions on policies and our opinions on general issues of public debate. We’ve got one that we do every year called The Perils of Perception. And it’s quite a simple idea. You ask people what they think is the reality in their country, and then we compare that with the actual reality.
5:02
And it may not surprise you to learn that we’re wrong about most things. And that’s something which has really, really become, for market research, anyway, become viral because each time we do it - we do it in countries around the world - there’s always stories for people to pick up on. And there’s almost like a fun element to it but actually also quite a serious component to it because, basically, around the world, we have very little sense, actually, about what’s really happening in the country around us that we think we know so well. One of the biggest things that we always do is ask people for their evidence.
5:36
So if we’re seeing survey results being reported and we haven’t actually seen the underlying methodology, we will always ask them, how have you done your survey? And I think the more people that can ask organisations that, then the more pressure they might feel to back up what they’ve said.

In this video, our representatives from Ipsos Mori, YouGov and Full Fact discuss how they ensure their clients, and the press, present their data accurately.

They also discuss the perils of reporting survey results in the age of social media and share their advice on what the public can do to fact check any results they come across.

During this video, Simon Atkinson from Ipsos Mori and Amy Sippett from Full Fact discuss an annual piece of research called The Perils of Perception 2017 which looks at commonly held misconceptions across 38 countries.

Alongside the rate of teenage pregnancies and the percentage of the population who are foreign-born, what other widely held beliefs do you think aren’t supported by the data?
This article is from the free online course Making Sense of Data in the Media, created by FutureLearn.
