
Subtitles and Captioning

Professor Mike Wald discusses subtitles and captioning, explaining the origin of some of the terminology and why there are differences.
PROFESSOR MIKE WALD: Subtitles are the name that’s given to text that is word-for-word what people are saying on a video or a film. And usually, historically, they were foreign language subtitles, which were a translation. So if, for example, it was an English movie and you were French, you had French subtitles. And that was a cheaper way than redubbing the whole movie in French. And so those were foreign language subtitles. But if you are deaf, you not only can’t hear, you know, what people are saying, you actually can’t hear any of the sounds. So if a doorbell rings, if you have noises of kettles, you don’t know that’s happening.
So subtitles for the deaf and hard of hearing were in the same language as the film, but not only did you actually put the text on the screen of what was being said, you actually explained sounds that were happening. You might also explain who was talking because you might not realise, if you have two people or a group of people talking on the screen, and you just have the text of what people are saying, it’s quite difficult to work out who said what and you could have some total misunderstandings. In North America, they give a different term for subtitles for the deaf and hard of hearing. They call it captions.
So there are different terms for these things, but there is a misunderstanding about who these things are useful for. And actually, more people who aren’t deaf or hard of hearing use captions because, if the telephone rings, if you have to turn the sound down for whatever reason, in airports, where there isn’t any sound, actually, the captions are really useful. Also, on the computer, if you have captions with a video, you can actually search the text of the captions, where you can’t search through the video. It’s also much quicker to read than it is to listen. So actually, you can get information much faster.
Also, if you have a video, which is being streamed, you need quite a strong bandwidth, a good bandwidth and a good connection, for example, on your mobile device. Whereas, if you actually have the transcript, you can actually read that anytime. You can print it out. You can scan it. So there are lots of benefits of having captions, not only for deaf and hard of hearing people.
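The searchability benefit mentioned in the transcript can be sketched in code. The example below is a minimal, hypothetical Python function (not part of the course materials) that scans a WebVTT-style caption transcript for a keyword and reports the timestamps where it appears; the sample cues are invented for illustration.

```python
def search_captions(vtt_text, keyword):
    """Return (timestamp, line) pairs for caption lines containing keyword."""
    results = []
    timestamp = None
    for line in vtt_text.splitlines():
        line = line.strip()
        if "-->" in line:
            # A cue timing line, e.g. "00:00:01.000 --> 00:00:04.000";
            # remember its start time for the text lines that follow.
            timestamp = line.split("-->")[0].strip()
        elif line and timestamp and keyword.lower() in line.lower():
            results.append((timestamp, line))
    return results

# Invented sample transcript for illustration.
sample = """WEBVTT

00:00:01.000 --> 00:00:04.000
[doorbell rings]

00:00:04.500 --> 00:00:08.000
MIKE: Captions describe sounds as well as speech."""

print(search_captions(sample, "doorbell"))
# → [('00:00:01.000', '[doorbell rings]')]
```

This is exactly the kind of lookup that is impossible against the audio track alone: the caption text gives you a searchable index back into the video's timeline.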
In this video, you will hear Professor Mike Wald explain the terminology used for subtitles and captioning and how it differs around the world.
Having also heard about the ways captions can be used in different situations, can you think of any other ways captions may help people with a disability other than a hearing impairment?
(You may also want to consider closed captions, which are not visible unless they are turned on by the user, as well as open captions, which are always visible when available.)
© This work was created by the University of Southampton and is licensed under the CC-BY 4.0 International Licence. Erasmus+ MOOCs for Accessibility Partnership.
This article is from the free online course Digital Accessibility: Enabling Participation in the Information Society.

Created by
FutureLearn - Learning For Life
