Stephanie Jesper takes a look at the wellbeing app marketplace.
The app market is huge and constantly growing (for example, over a thousand mobile apps are added to the iOS App Store each day), which doesn’t make choosing between them any easier! There are many apps that claim to help with our mental health and wellbeing, but, as we saw in Week One, we need to be discerning in our choices and critical in our judgement.
We noted last week how big tech companies have begun to develop digital wellbeing tools to monitor, and potentially regulate, usage of their smartphones: Google introduced its digital wellbeing tools in 2018, and last year made the Digital Wellbeing app (or equivalent controls) a required feature for devices running its Android operating system.
Throughout this MOOC we’ve rehearsed some of the ongoing and valid concerns about the impact of technology on mental health, but as Dr Lina Gega explained, technology can also play a role in improving mental health and wellbeing. Whilst not a replacement for face-to-face support or clinical interventions, digital technology and apps can be hugely beneficial in combating loneliness and social isolation. In the UK, half a million older people go at least five days a week without coming into contact with anyone at all, and digital technologies are being employed in an effort to overcome this, at least to some degree. There are other areas too where digital technologies can support our health and social care: Leeds-based project mHabitat have a number of case studies and an open library of online publications on the topic.
If you search for the term ‘wellbeing’ in your phone’s app store, you’ll probably find hundreds of results. And that’s just ‘wellbeing’. A search for ‘health’ will give even more results. Some of these apps will come with a price tag. Others may be free, though ‘free’ may mean funded by advertising, or only free for so long or for so many features. There’s the old maxim “if the product’s free, you’re the product”, and while it may not be universally true (some apps will be genuinely philanthropic), you do need to be careful. But you need to be careful whatever the price tag. As Susan explained in Week One, it’s important to be a critical consumer of information. And, as we also examined that week, it’s important to be critical about the information we share.
In 2018 it was revealed that military personnel using the fitness app Strava were inadvertently sharing rather detailed maps of secret military installations by simply jogging around them. Last year, Privacy International reported that period tracking apps were sharing personal data with Facebook (something which Facebook and the named apps have denied). It’s not surprising, then, that concerns are currently being raised about COVID-19 trackers.
Even if wellbeing apps aren’t sharing their data with third parties, we’re potentially telling them an awful lot about ourselves: sensitive, personal information from when we may be at our most vulnerable. Who’s collecting that data (and what they’re doing with it) matters a lot. As with anything, there’s a balance of risk: if we’re just logging our health, fitness, or menstrual cycle, we may be better off using a locally hosted spreadsheet or notepad rather than an online app. But we may find a custom-built app more convenient to use, and we may find it has other features which we feel trump the risks of what might be happening with our data. Perhaps this might be the case with the sort of data gathering we looked at last week that’s involved in pandemic contact trackers and the like. With some things we may even, like the author of this iNews piece, feel that targeted advertising on social media is a reasonable price to pay; or we may find the prospect of a dodgy ad when we’re feeling down a step too far. And when the US government is exploring data harvesting as a means of profiling people with mental illness, ostensibly in an effort to counter gun violence, there’s a further reminder that what we share in the privacy of an app may prove to be far from private.
Attempts have been made to ‘whitelist’ mental health apps: for instance, the UK-based research and consultancy group PatientView’s My Health Apps website (see their inclusion methodology), and the NHS Apps Library (see their inclusion methodology). Our own Open Door team at the University of York has a few suggested self-help apps on our Tips for wellbeing page. If you’ve any suggestions of your own, feel free to add them in the comments.
But even when working from such recommendations, we need to be careful, and mindful of the fact that apps can change ownership or revise their approaches to privacy. It’s important to stress that such apps should not be seen as a replacement for face-to-face support or clinical interventions, and you should assess any app carefully before deciding to use it. But the array of apps available is a useful reminder that there are many approaches to managing your mental health and wellbeing, and some of those are digital in nature. The online world needn’t be our enemy in this regard, but, as ever, we need to consider the risks as well as the benefits.
© University of York (author: Stephanie Jesper)