
Displaying data and communicating data effectively

It is easy to spot examples of poor displays of data, their trends, and their meanings.

“I know you can’t make out the meanings here, but I wanted to show you everything.”
~ A conference presenter in 2003 when data projectors were new.

Gradesheet showing many numbers in columns and rows with no analysis

We instantly know when a display is not helpful. Consider the image above: what's wrong with it?

  • There is no indication of what the rows or columns mean.
  • The numbers range from 60 to 100 – is this the entire possible range?
  • There are blank cells – is this because data is missing?
  • It doesn't communicate any findings.

Fixing such a graphic, or avoiding it altogether, is not always easy. How might you go about tackling this kind of problem?

  • Is there a member of your team who has skills in data presentation and can serve as a leader in organising data results for review?
  • Is there a resource elsewhere in your organisation that you can call upon? For example, you might have an Office of Institutional Research, Academic Research Centre or Information Technology Office that can supply the necessary expertise.
  • Do you have a budget to support an evaluation consultant who could either assemble or guide the creation of a data warehouse designed for retrievability and consumption of results?

The more common circumstance is that members of the assessment team are self-taught data display artists. Today's data analysis tools have made the work slightly easier, but there is still a learning curve in deciding what to display for maximum impact.

So, what do you need to think about when communicating data effectively?

You don’t need to show everything

Home in on the pertinent data. While all the data may be important, the audience is unlikely to have the time to devote to appreciating it in all its glory. It is the job of the assessment team to analyse, interpret and display the data that is pertinent to the findings.

Don’t assume stakeholders will recognise the connection between data and findings

The Assertion–Evidence Model provides a framework for addressing this challenge. Briefly, the display states an assertion, followed by the evidence to back it up, to increase audience comprehension (Garner & Alley, 2013). This model is employed by Penn State and other universities internationally to train engineers and scientists to communicate their research findings effectively. A key characteristic of Assertion–Evidence slides is a heavy reliance on graphics, keeping words to a minimum.

Don’t display a solo statistic, no matter how outstanding it may appear

Statistics are only meaningful when drawing comparisons. A ‘success rate of 92%’ may sound impressive on its own, but its meaning in context can be quite different – compare it with a success rate of 95% two years ago or a national success rate of 88%. In the first instance, it is clear that success rates in the organisation have fallen since two years ago, but in the second instance the organisation is clearly outperforming the national average.
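As a sketch of this idea, the hypothetical figures above (92%, 95%, 88% – illustrative numbers from the paragraph, not real data) could be plotted side by side with matplotlib so the comparison is visible at a glance:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Illustrative figures from the text: the current rate only
# becomes meaningful next to its comparators.
labels = ["Two years ago", "This year", "National average"]
rates = [95, 92, 88]

fig, ax = plt.subplots(figsize=(5, 3))
bars = ax.bar(labels, rates, color=["#999999", "#1f77b4", "#999999"])
ax.set_ylim(80, 100)  # zoomed axis – label it clearly so it isn't misleading
ax.set_ylabel("Success rate (%)")
ax.set_title("A 92% success rate means little without context")
for bar, rate in zip(bars, rates):
    ax.text(bar.get_x() + bar.get_width() / 2, rate + 0.3,
            f"{rate}%", ha="center")
fig.tight_layout()
fig.savefig("success_rates.png")
```

Highlighting only the current year's bar keeps the audience's eye on the statistic being asserted, while the grey comparators supply the context.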

Don’t shy away from complex statistics

The learning curve is steep for visualisation of complex data. The analyst must have a good sense of both the data and the readiness of the audience to absorb the information. Edward Tufte's slopegraph approach, for example, has been applied to data ranging from research results to bus schedules.
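A minimal slopegraph can be sketched in matplotlib; the programme names and completion rates below are invented purely for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical completion rates for three programmes at two points in time.
data = {"Programme A": (72, 81), "Programme B": (65, 63), "Programme C": (58, 70)}

fig, ax = plt.subplots(figsize=(4, 5))
for name, (before, after) in data.items():
    ax.plot([0, 1], [before, after], marker="o", color="#1f77b4")
    ax.text(-0.05, before, f"{name}  {before}", ha="right", va="center")
    ax.text(1.05, after, f"{after}  {name}", ha="left", va="center")

# A slopegraph strips away everything except the paired values and the
# line connecting them, so rises and falls are read at a glance.
ax.set_xticks([0, 1])
ax.set_xticklabels(["2021", "2023"])
ax.set_xlim(-0.6, 1.6)
for side in ("top", "right", "left"):
    ax.spines[side].set_visible(False)
ax.get_yaxis().set_visible(False)
fig.tight_layout()
fig.savefig("slopegraph.png")
```

The near-empty axes are deliberate: in a slopegraph the labels and slopes carry all the information, which is what makes the form so compact.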

Simple statistics can benefit from more complex solutions

The statistics may be very straightforward, for example, a tally of how many students live in neighbouring towns and counties could be displayed in a simple table. But if the information is important, the goal should be to make it memorable.
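As a sketch, even a plain tally like the one described above becomes easier to absorb when it is sorted and drawn as a bar chart rather than left as a table (the town names and counts here are invented for illustration):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical tally of students per neighbouring town.
tally = {"Kirkland": 310, "Redmond": 180, "Bothell": 240, "Woodinville": 95}

# Sorting turns an arbitrary table order into an immediate ranking.
towns, counts = zip(*sorted(tally.items(), key=lambda kv: kv[1]))

fig, ax = plt.subplots(figsize=(5, 3))
ax.barh(towns, counts, color="#1f77b4")
ax.set_xlabel("Number of students")
ax.set_title("Where our students live")
fig.tight_layout()
fig.savefig("student_towns.png")
```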

For an institution describing its student body, a graphic like the map in the image below is both memorable and easy to understand. Lake Washington Institute of Technology presents such graphics as interactive data dashboards (produced using Tableau) on their website devoted to research and data.

Following our discussions on the previous steps, you should pay particular attention to the note below the map, which indicates the measures taken by Lake Washington to ensure the privacy of their students’ personal information.

Heat map showing areas of high population density in certain parts of Lake Washington
© Lake Washington Institute of Technology. Heat map showing areas of high student population density.

Discussion

We’d like to hear about your experiences of presenting data to an audience.

If you have ever presented a dataset before, how did you communicate it?

Can you describe a time when you’ve watched a presentation about data that left you feeling confused?

What did you learn about the process and what could have been improved?

References
Garner, J. K., & Alley, M. P. (2013). ‘How the design of presentation slides affects audience comprehension: A case for the assertion-evidence approach’, International Journal of Engineering Education, vol. 29, no. 6, pp. 1564-1579.


This article is from the free online course:

Using Data to Improve Student Outcomes

American Association of Colleges for Teacher Education (AACTE)
