How important is it that we convey the certainty or uncertainty in a piece of data? It's absolutely critical, particularly if that data is under public scrutiny. For voting intention figures, the UK polling companies have just agreed a standard way of expressing the level of uncertainty in what we're publishing, and that level of uncertainty is based on our record at general election opinion polls in recent years. It's absolutely critical: if we want people to take notice of the findings, we have to remind them what the margins of error are and what caveats they need to think about.
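
To give a concrete sense of what a margin of error looks like, here is a minimal sketch in Python, assuming a simple random sample and the standard 95% normal-approximation formula; real polling companies apply more elaborate weighting and adjustments than this.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of n respondents (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Example: a party polling at 40% in a sample of 1,000 respondents
p, n = 0.40, 1000
moe = margin_of_error(p, n)
print(f"Estimate: {p:.0%} +/- {moe:.1%}")  # roughly 40% +/- 3.0%
```

With samples of around 1,000 people, this works out at roughly plus or minus three percentage points, which is the figure usually quoted alongside voting intention polls.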

And that could be as simple as saying these results were taken last week, before X happened. We need people who put out information to get used to explaining it well and to showing why their evidence is trustworthy. So we think it's really important that trust is earned where it should be earned, rather than just placing the emphasis on the public to decide. If you're looking at what to do with apparently contradictory data, one of the questions is: are we looking at attitudinal data, which is often all over the place because people are not coherent in their views, or behavioural data?

When it's behavioural data, you can often check it against real-world data, and that's what you would usually go to. So if a survey gave us a very large number of Sun readers, we can go and check against the latest information on newspaper readership, and that would help us understand whether we've got it wrong or whether it's actually within the margin of error. When we're working with journalists, for example, we do insist that we sign off the copy they write before it goes live online or in the newspaper.

That is, I think, an important check. It's our responsibility, written into the market research code of conduct, to work with our clients to make sure they don't misrepresent data. Where the problems come in is on social media, because there are thousands and thousands of people who like interpreting numbers in their own special way, and once the data is released, frankly, we've lost control and it becomes a more difficult exercise. The biggest change the internet brings, I think, is that now anyone can put out information, and that information can spread more quickly.

It also makes it harder for us to hold people accountable for the information they are using, because you can have a meme spreading online and not know who put it out. You can't ask someone to correct it once it has started spreading, so that's one of the biggest challenges we find. My advice for people looking at data they come across: the first thing would always be to try and trace it back to where it came from originally.

We've seen claims that, if you actually trace them back, turn out to be based on a survey done 20 years ago, and that isn't obvious when you first come across the claim, so it's not necessarily good evidence. My first step would be to trace it back as far as you can, then look at what it is actually measuring: who has been asked, what has been asked, is it just a one-off fluke survey, or are there multiple surveys that back up what that particular survey is showing? I think the big thing would be to be sceptical, though perhaps not overly sceptical.

If you think it's done by a reputable organisation, that might make you feel, OK, I can look at this. Then your questions are: when was it done, what questions did they ask, and to what extent does it give a complete picture of this particular topic? For most published research, the public can, if they're interested, go online and spend time looking at all of the questions and all of the numbers. So there's plenty of opportunity for people to be good data citizens, get even more involved and go and have a look for themselves.

Ipsos MORI's data suggests that there aren't many things we get right. We tend to overestimate a lot of things; I don't know of many examples where we underestimate phenomena. Their surveys suggest there are huge gaps in our perceptions of the prevalence of many things in the UK, which obviously affects our opinions on policies and on general issues of public debate. We've got a study we do every year called The Perils of Perception, and it's quite a simple idea: we ask people what they think the reality is in their country, and then we compare that with the actual reality.

It may not surprise you to learn that we're wrong about most things. That has really gone viral, for market research anyway, because each time we do it, in countries around the world, there are always stories for people to pick up on. There's almost a fun element to it, but also quite a serious component, because around the world we have very little sense of what's really happening in the countries we think we know so well. One of the biggest things we always do is ask people for their evidence.

If we see survey results being reported and we haven't seen the underlying methodology, we will always ask: how have you done your survey? And the more people who ask organisations that question, the more pressure those organisations might feel to back up what they've said.

Hearing from the data collectors

In this video, our representatives from Ipsos MORI, YouGov and Full Fact discuss how they ensure their clients, and the press, present their data accurately.

They also discuss the perils of reporting survey results in the age of social media and share their advice on what the public can do to fact-check any results they come across.

During this video, Simon Atkinson from Ipsos MORI and Amy Sippett from Full Fact discuss an annual piece of research called The Perils of Perception 2017, which looks at commonly held misconceptions across 38 countries.

Alongside the rate of teenage pregnancies and the percentage of the population who are foreign-born, what other widely held beliefs do you think aren’t supported by the data?

This video is from the free online course:

Making Sense of Data in the Media

The University of Sheffield