Asymmetries in Global Health Data

Bias in AI Systems

As we build AI systems, we risk replicating the biases already inherent in society. If certain groups are under-represented in the data, the outputs of these AI systems will not reflect those groups' health realities and may actively harm them.
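To make this concrete, here is a minimal sketch of the effect using purely synthetic data (the group sizes, features and risk relationships below are illustrative assumptions, not real health data). A model trained on a dataset dominated by one group learns that group's pattern and performs markedly worse for the under-represented group:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_group(n, weight):
    """Simulate one patient group whose outcome depends on a biomarker."""
    x = rng.normal(size=(n, 1))
    # The biomarker-outcome relationship differs between groups; the model
    # can only learn the pattern it sees most often during training.
    p = 1 / (1 + np.exp(-weight * x[:, 0]))
    y = rng.binomial(1, p)
    return x, y

# 95% of the data comes from group A, only 5% from group B.
xa, ya = make_group(9500, weight=2.0)   # group A: risk rises with biomarker
xb, yb = make_group(500, weight=-2.0)   # group B: risk falls with biomarker

X = np.vstack([xa, xb])
y = np.concatenate([ya, yb])
group = np.array(["A"] * len(ya) + ["B"] * len(yb))

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.3, random_state=0, stratify=group
)

model = LogisticRegression().fit(X_tr, y_tr)

for g in ["A", "B"]:
    mask = g_te == g
    print(f"Group {g}: accuracy = {model.score(X_te[mask], y_te[mask]):.2f}")
```

Running this typically shows high accuracy for group A and near-chance (or worse) accuracy for group B, because group B's reversed risk relationship is drowned out during training.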

TASK: Increasing representation in datasets, creating algorithms that identify biases and support corrective human action, and including diverse groups in the development process are all important steps in tackling the biases in the data we use to build AI systems (a minimal sketch of such a bias check follows this task).

Consider what actions you, your institution, or your government could take to enact some of these suggestions.
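As a starting point, an automated bias check can be as simple as comparing subgroup shares in a dataset against a reference population and flagging gaps for human review. Here is a minimal sketch; the column name, reference shares and alert threshold are all assumptions for illustration:

```python
import pandas as pd

# Hypothetical training records and reference population shares.
records = pd.DataFrame({
    "sex": ["F", "M", "M", "M", "F", "M", "M", "M", "F", "M"],
})
reference_share = {"F": 0.50, "M": 0.50}  # e.g. taken from census data

dataset_share = records["sex"].value_counts(normalize=True)

ALERT_RATIO = 0.8  # flag groups at < 80% of their expected share
for grp, expected in reference_share.items():
    observed = dataset_share.get(grp, 0.0)
    flagged = observed / expected < ALERT_RATIO
    status = "UNDER-REPRESENTED: flag for human review" if flagged else "ok"
    print(f"{grp}: observed {observed:.0%} vs expected {expected:.0%} -> {status}")
```

Note that the check only flags the gap; deciding how to respond (collecting more data, re-weighting, or pausing deployment) remains a human action.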

Bias in Human Systems

As we approach AI critically, with the intention of ensuring the ethical, equitable and inclusive application of these systems, it is important that we hold ourselves, as humans, to the same standards.

Beyond the asymmetries of health data, the lifecycle of bias in AI systems and unequal access to healthcare, there is the human factor. Humans are still the main actors in the healthcare system, and therefore the main source of bias.

Case Study: Gender Bias in Pain Triaging

In this section, we take a brief but illustrative look at biases that humans persistently introduce into the health system.

One of the simplest health questions we have all likely been asked is pain-related, e.g. “where does it hurt?” or “how much does it hurt?”.

These questions around pain play a significant role in the triaging of patients in Accident & Emergency settings. For example, the Manchester Triage System, one of the most widely adopted systems for assessing the urgency of patient need, uses pain-related items as key indicators. Getting this right matters enormously, as under-triaging can put patients in significant danger.

However, errors can occur in the measurement of pain, and striving for objective measurement is essential here.

Studies have shown that when men and women report similar levels of pain, a woman's pain is frequently perceived as less severe. This underestimation is influenced by stereotypes suggesting that women are more expressive or exaggerate their pain, leading clinicians to discount their reports. For instance, a 2021 study in the Journal of Pain, “Gender Biases in Estimation of Others' Pain” by Zhang et al., found that observers rated women's pain as less intense than men's, despite identical expressions of discomfort.
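To see how such a perceptual discount can translate into under-triage, here is a minimal sketch. The pain-score bands and the 25% discount below are simplified assumptions for illustration only; they are not the actual Manchester Triage System rules or measured clinical figures:

```python
def triage_category(pain_score: int) -> str:
    """Map a 0-10 pain score to an urgency category (illustrative bands)."""
    if pain_score >= 8:
        return "very urgent"
    if pain_score >= 5:
        return "urgent"
    if pain_score >= 2:
        return "standard"
    return "non-urgent"

reported_pain = 8        # what the patient reports
discount = 0.75          # clinician perceives the report at 75% of its value
perceived_pain = round(reported_pain * discount)

print(f"Reported pain {reported_pain}  -> {triage_category(reported_pain)}")
print(f"Perceived pain {perceived_pain} -> {triage_category(perceived_pain)}")
# Reported pain 8  -> very urgent
# Perceived pain 6 -> urgent   (the same patient is under-triaged)
```

Even a modest discount pushes the same patient down a full urgency category, which is exactly the danger that under-triaging poses.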

TASK: Try to identify a use-case for an AI-based system that could reduce the propagation of human bias in healthcare.

This article is from the free online course AI Ethics, Inclusion & Society, created by FutureLearn.
