Built-in accessibility features on mobile devices
Mobile phones and devices are an essential part of people’s lives today, empowering them to communicate with each other, to read, to work and to play.
Today’s mainstream smartphones and mobile devices ship with several accessibility features built into the operating system, allowing almost everybody to access apps, make calls or send messages.
In this step we will give you a short overview of the accessibility features available on the latest smartphones, grouped into four categories: vision, hearing, physical and motor skills, and cognition.
Firstly, we’ll look at the potential of built-in accessibility features generally.
For those with disabilities the potential of such devices is huge – just think of wayfinding applications for blind people, or communication applications for those who have speech and language difficulties like Tom.
But you may be asking how people with disabilities operate mobile technologies, as most devices today have just a touchscreen and perhaps one or two hardware buttons. At first glance it seems impossible for people such as Carole, who is blind, to operate these mobile devices.
Many smartphones have a built-in screen reader – it is part of the accessibility options. Carole doesn’t like wearing headsets while outside. She wants to listen to the traffic and hear the difference various surfaces make when she taps the ground with her long cane.
She doesn’t mind people overhearing what the screen reader announces. The volume is just good enough if she holds the mobile phone close to her head. Carole’s first mobile phone was a Nokia with a keyboard; now she copes with a touchscreen. You will see how a screen reader works on a mobile in later steps.
Mary has arthritis and loves hands-free voice control and Siri-like assistants (speaking commands and dictating comments) on her mobile phone and tablet. However, she finds operating a touchscreen like wearing mittens, and having to activate tiny buttons is a nightmare. When making a phone call outdoors, Mary places her phone on her lap while sitting, or wedges it somewhere. She has disabled landscape mode and uses portrait mode only. Later this week we explain more about speech input when dexterity is limited.
Mary takes notes by voice recording and assigns appointments via her calendar app. Synchronization with her office computer is very convenient, as it can be for most people.
As a pedestrian, Carole also struggles with some aspects of navigation when using mobile apps. This is especially true when she encounters complex crossings or building works, or when there is no warning about an overhead branch. No navigation app helps with the last 10 yards before she finds an entrance door.
Vision
Screen reader
Today’s major operating systems for mobile devices, such as Android and iOS, have a built-in screen reader that allows blind people to operate the device with simple touches and gestures.
People can touch the screen to hear what is under their finger, use gestures to explore the screen by moving from one element to another, and activate elements with a double-tap gesture.
Speech synthesis and vibrations provide feedback on what is happening.
Using these methods blind people can operate a smartphone with some skill and, where applications are accessible, without any problems. A more in-depth description of the functionality of a screen reader will be given in the next step.
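The explore-by-touch model described above can be sketched as a small simulation. This is illustrative only, with made-up element names and a made-up class; real screen readers such as TalkBack or VoiceOver implement this inside the operating system.

```python
# Sketch of a screen reader's explore-by-touch model (illustrative, not a real API).

class Element:
    """A screen element with a name, a bounding box and an optional action."""
    def __init__(self, name, x, y, w, h, action=None):
        self.name = name
        self.rect = (x, y, w, h)
        self.action = action

    def contains(self, px, py):
        x, y, w, h = self.rect
        return x <= px < x + w and y <= py < y + h

class ScreenReader:
    def __init__(self, elements):
        self.elements = elements
        self.focused = None  # element last touched or swiped to

    def touch(self, x, y):
        """Touching the screen announces whatever is under the finger."""
        for el in self.elements:
            if el.contains(x, y):
                self.focused = el
                return f"announce: {el.name}"
        return "announce: nothing"

    def swipe_next(self):
        """Swiping moves focus sequentially to the next element."""
        if self.focused is None:
            self.focused = self.elements[0]
        else:
            i = self.elements.index(self.focused)
            self.focused = self.elements[(i + 1) % len(self.elements)]
        return f"announce: {self.focused.name}"

    def double_tap(self):
        """A double tap activates the currently focused element."""
        if self.focused and self.focused.action:
            return self.focused.action()
        return "no action"

reader = ScreenReader([
    Element("Call button", 0, 0, 100, 50, action=lambda: "calling..."),
    Element("Messages", 0, 50, 100, 50, action=lambda: "opening messages"),
])
print(reader.touch(10, 60))   # finger is over Messages -> announce: Messages
print(reader.swipe_next())    # focus wraps to the next element -> announce: Call button
print(reader.double_tap())    # activates the focused element -> calling...
```

The key idea is that touch position only moves focus and triggers an announcement; activation is a separate, deliberate gesture, which is what makes the screen usable without sight.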
Magnification
This function magnifies part of the screen so that people with low vision like Maria can read its content. These tools offer different modes, either magnifying the whole screen or letting the user see a zoomed area in a separate window while the rest of the screen stays at its original size.
Speak selection
For people with low vision like Maria, this function lets users select parts of the screen, or the whole page, and have them read back aloud.
The user can adjust the voice’s dialect and speaking rate and have words highlighted as they are being read. This is also a valuable tool for people who have problems reading the content of the screen, as well as those doing something else while they listen!
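Highlighting words in time with speech amounts to computing when each word will be spoken. A minimal sketch follows; the function name and the uniform-timing model are illustrative assumptions, not how a real text-to-speech engine actually paces words.

```python
def highlight_schedule(text, words_per_minute=180):
    """Return (start_time_s, word) pairs for highlighting words as they are spoken.

    Crude illustrative model: every word gets an equal share of time
    at the chosen speaking rate; a real TTS engine varies word duration.
    """
    seconds_per_word = 60.0 / words_per_minute
    return [(round(i * seconds_per_word, 2), word)
            for i, word in enumerate(text.split())]

# At 120 words per minute, each word is highlighted 0.5 s after the previous one.
print(highlight_schedule("Speak selection reads text aloud", words_per_minute=120))
```

Slowing the speaking rate simply stretches the schedule, which is why rate control and highlighting work well together for readers who need more time per word.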
Speech recognition
This function recognises the user’s voice, allowing words and numbers to be converted into text and entered into text fields and text areas.
The user can also ask questions that the operating system tries to answer, send messages, place phone calls, and schedule meetings using their voice alone.
Larger text
This function allows users like Maria to adjust the font size of displayed text to a larger, easier-to-read size.
Greyscale and inverted colours
For people with a colour vision deficiency like Alexander, higher contrast or the removal of colour can make it easier to see what is on the display of the mobile device.
Hearing
Video calls
For deaf people like Lars, or hard-of-hearing people like Susan, a video call can help them understand a conversation because they can catch every gesture and facial expression. Communication in sign language is also possible.
Visible and vibrating alerts
Visual and vibrating alerts for incoming phone calls and messages are ideal for people who have hearing problems.
Physical- and motor-skills
People like Mary have problems performing certain gestures, like a pinch or a swipe, on the mobile device. These gestures can be replaced with others the user can perform, for example a single tap. However, there are limitations, especially when users have to drag an item to a certain location on the screen, which can be complicated.
Users can configure how long they need to touch the screen before the system recognises the action, and whether repeated touches, caused for example by a trembling hand, should be ignored. Those with physical disabilities or dexterity difficulties can then put a finger anywhere on the screen and move it to the item without mistakenly performing other actions.
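These settings amount to filtering the raw stream of touch events. A minimal sketch, assuming touches arrive as (press, release) timestamps; the function name and threshold values are illustrative, not any platform's actual defaults.

```python
def filter_touches(touches, hold_time=0.5, ignore_repeat=0.3):
    """Filter raw touch events the way touch-accommodation settings do.

    touches: list of (press_time_s, release_time_s) tuples.
    hold_time: the finger must stay down at least this long to count.
    ignore_repeat: a touch starting within this window after the previous
                   accepted touch is treated as a tremor and dropped.
    """
    accepted = []
    last_release = None
    for press, release in touches:
        if release - press < hold_time:
            continue  # too brief: not a deliberate touch
        if last_release is not None and press - last_release < ignore_repeat:
            continue  # repeated touch shortly after the last one: ignored
        accepted.append((press, release))
        last_release = release
    return accepted

# A deliberate touch, a tremor-induced repeat, a brief graze, then another touch:
# only the first and last survive the filter.
print(filter_touches([(0.0, 0.6), (0.65, 1.3), (2.0, 2.1), (3.0, 3.8)]))
```

Raising `hold_time` trades responsiveness for fewer accidental activations, which is exactly the trade-off these accessibility settings expose to the user.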
Switch access scanning
This technology allows people who can only operate one or more switches, due to severe physical or dexterity difficulties, to use mobile devices. Users can move sequentially over each item on the screen and, when they want to perform an action on an item, like a touch on an icon, simply press the switch. By this method very complex user interfaces can be operated with a single switch alone.
Further information on this input method will be given in the next step.
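The scanning model just described can be sketched as a small simulation. The class and item names are illustrative; real switch-access implementations also scan hierarchically (rows, then items within a row) to speed things up.

```python
class SwitchScanner:
    """Single-switch scanning: a highlight moves over the items one by one,
    and pressing the one switch activates whatever is highlighted."""

    def __init__(self, items):
        self.items = items
        self.index = 0  # currently highlighted item

    def step(self):
        """The scan advances the highlight to the next item, wrapping around."""
        self.index = (self.index + 1) % len(self.items)
        return self.items[self.index]

    def press_switch(self):
        """The single switch selects the currently highlighted item."""
        return f"activate: {self.items[self.index]}"

scanner = SwitchScanner(["Phone", "Messages", "Camera", "Settings"])
scanner.step()                 # highlight moves to "Messages"
scanner.step()                 # highlight moves to "Camera"
print(scanner.press_switch())  # -> activate: Camera
```

Selection time grows with the number of items the scan must pass, which is why real systems group items and let users adjust the scanning speed.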
Cognitive
There are not that many built-in tools on smartphones specifically aimed at those with cognitive disabilities. However, for those who have difficulties reading text or writing messages there are some options available. Alarms, timers and calendars can all help with time keeping, and different colour settings can help with some visual stress issues.
Here are some other features that can be used:
Speech Recognition: to operate the mobile device, make calls, find numbers and dictate text into browsers to search or write messages, etc.
Speech Synthesis or Text to Speech: some people may find it easier to focus on understanding information as it is read aloud rather than reading it themselves
Touch Configuration: those supporting individuals who press some buttons inappropriately or accidentally can disable them, for example the home button, so that users do not inadvertently exit an app.
Screen brightness: this can be dimmed if the glare of black text on a white background is too much for some users, such as Anna, who has dyslexia and likes brown text on a sepia background.
Can you think of any other built in features that you feel would be helpful to our group of users?
© This work is a derivative of a work created by Johannes Kepler Universität Linz and Technische Universität Dresden, licensed under the CC BY 4.0 International Licence, adapted and used by the University of Southampton. Erasmus+ MOOCs for Accessibility Partnership.