
This content is taken from the University of Southampton & MOOCAP's online course, Digital Accessibility: Enabling Participation in the Information Society.

[0:06] PROFESSOR MIKE WALD: Subtitles are the name given to text that is word-for-word what people are saying in a video or a film. Historically, they were usually foreign language subtitles, which were a translation. So if, for example, it was an English movie and you were French, you had French subtitles, and that was a cheaper way than redubbing the whole movie in French. Those were foreign language subtitles. But if you are deaf, not only can you not hear what people are saying, you can’t hear any of the sounds. So if a doorbell rings, or a kettle whistles, you don’t know that’s happening.

[1:05] So subtitles for the deaf and hard of hearing were in the same language as the film, but as well as putting the text of what was being said on the screen, they also described the sounds that were happening. They might also identify who was talking, because if two people or a group of people are talking on screen and you only have the text of what is being said, it’s quite difficult to work out who said what, and you could have some total misunderstandings. In North America, a different term is used for subtitles for the deaf and hard of hearing: captions.

[1:56] So there are different terms for these things, but there is a misunderstanding about who they are useful for. Actually, more people who aren’t deaf or hard of hearing use captions, because if the telephone rings, if you have to turn the sound down for whatever reason, or in airports where there isn’t any sound, the captions are really useful. Also, on a computer, if a video has captions, you can search the text of the captions, whereas you can’t search through the video itself. It’s also much quicker to read than to listen, so you can get information much faster.

[2:48] Also, if a video is being streamed, you need good bandwidth and a good connection, for example on your mobile device. Whereas if you have the transcript, you can read it at any time. You can print it out. You can scan it. So there are lots of benefits to having captions, and not only for deaf and hard of hearing people.

Subtitles and Captioning

In this video, Professor Mike Wald explains the terminology used for subtitles and captioning and how it differs around the world.

Having heard about the ways captions can be used in different situations, can you think of any other ways captions might help people who have a disability other than a hearing impairment?

(You may also want to consider closed captions, which are not seen unless they are turned on by the user, as well as open captions, which are always visible where available.)
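On the web, this closed/open distinction maps onto how captions are delivered. A minimal sketch, assuming hypothetical `lecture.mp4` and `captions.vtt` files, of closed captions using the standard HTML5 `<track>` element, which the viewer can toggle on or off in the player (open captions, by contrast, are burned into the video frames themselves and cannot be turned off):

```html
<!-- Closed captions: the caption text lives in a separate WebVTT file
     and the user switches it on or off in the player controls.
     File names here are placeholders. -->
<video controls src="lecture.mp4">
  <track kind="captions" srclang="en" label="English" src="captions.vtt">
</video>
```

A caption file can also carry the non-speech information the video discusses, such as sound descriptions and speaker identification. In WebVTT, sounds are conventionally described in square brackets, and the `<v>` voice tag names the speaker:

```
WEBVTT

00:00:06.000 --> 00:00:08.000
[doorbell rings]

00:00:08.500 --> 00:00:12.000
<v Professor Wald>Subtitles are word-for-word text of what is said.
```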

© This work was created by the University of Southampton and is licensed under the CC-BY 4.0 International Licence. Erasmus+ MOOCs for Accessibility Partnership.
