
Managing mixed health systems: the role of evidence

If more evidence is needed to guide decision making in health systems, what questions should be asked, and which research methodologies should be used?
The ongoing management of complex health systems requires evidence of various types. Acting without evidence is like walking blind: there is no way of making rational judgements about the best steps to take. Most of the time, the most important question is not what works, in the sense of “Should we be implementing a community health worker program or running mobile outreach clinics?”, though there is a place for such questions at specific times. On a daily, weekly and monthly basis, those charged with oversight of the health system need to know what is working and what is not, and they need some guidance about where they might intervene to correct problems and build on success. In other words, they need insight into why things are or are not working. All processes in the health system contribute to health system impacts that emerge over time.
There is rarely a moment when a manager can say: “That’s good. We’ve achieved what we set out to do. We’re done.” Many traditional approaches to evaluating public health practice therefore fit the needs of health system evaluation poorly. Approaches such as the randomised controlled trial attempt to identify a discrete, one-time impact of an intervention, or a cause-and-effect relationship, which is rarely an accurate description of health system change. If something is working, by which we mean producing social benefits as intended, there are still questions about who it is working for, whether it is reaching everyone it should or could, whether it could produce even more social benefits, and so on.
When something is not working, either not producing social benefits or causing social harms, it can be stopped or amended. We can investigate whether it is producing benefits for some, if not for the average person, and if so, what can be learned from where it is working better and worse. The extent to which these questions can be answered to support health systems strengthening depends on the ability to generate the right evidence.
While evaluating clinical interventions may be best served by trialling them and by establishing the counterfactual through a randomised control, the continually evolving nature of health systems, the interaction of context and action, and the need to attend to time and variability as key points of interest imply that different methodologies are needed. A typology of methodologies, offering ever more detailed and precise answers about how interventions are affecting health systems, might be proposed in three broad categories.
Firstly, continuous and routine monitoring and evaluation methodologies. These take a range of forms, but all are capable of routine and continuous application, and therefore need to be relatively timely and resource-frugal.
Secondly, process evaluation methodologies, for example theory-based evaluation and realist evaluation. These retain a normative question about whether an intervention works or achieves social benefit, but are alert to variability in experience between contexts, between potential beneficiaries, and at different spatial and temporal removes from the point of intervention, and they seek to explain, not only describe, impacts.
Thirdly, social science research, including action and implementation research, which might focus on questions beyond the impact of an intervention: exploring underlying relationships in the delivery of health care, and institutional action to promote health.
Consider a case study of the introduction of community health workers in a country’s health system. Monitoring and evaluation approaches usually involve tracking key indicators of the functioning of the program.
Examples might include measures of community health worker activity, such as numbers of patients seen, the conditions of those patients and the services provided, as well as promotive and preventive activities, such as information sessions on nutrition held. Analysis usually involves comparison with benchmarks or targets; comparison across community health workers, facilities or districts; and trends over time, identifying discontinuities and changing trends. Such analysis raises questions about the factors underpinning trends and discontinuities, but it is usually not able to identify those factors. Process evaluation methodologies seek to understand how the community health workers, and the program as a whole, interact with their context to produce impacts and outcomes. They consider the detail of the community health worker program as part of the set of factors shaping impacts and outcomes.
They inquire about what has happened and why, and gather multiple perspectives on the theories that might explain the performance of the program. They often produce a program theory, or something similar, that identifies the important factors that have shaped the experience of the program. Social science research may be interested in a specific question or relationship embedded in the functioning of the community health worker program. For example, it may be interested in the role of gender in shaping experience: what are the gendered behaviours of community health workers, of communities expected to benefit from community health worker activity, and of those in the institutions and structures that surround the program, and how have these behaviours influenced the emerging experience of the program?
Or it might ask questions about the role of household and community economies in shaping the experience of the program, with sub-questions such as: What roles do community health workers play in supporting the livelihoods of their family members, and how does their health-related work fit with that role? How do community members’ economic situations affect their interest in engaging with community health workers, and do they identify economic as well as health benefits? How do economic pressures within the health system affect the resourcing of the community health worker program and its performance? There is no best methodology. The typology moves from quicker, cheaper approaches requiring more limited expertise to more time-consuming, expensive approaches requiring greater expertise.
A choice of how to gather evidence to manage complex mixed health systems therefore involves consideration and judgement about what evidence is needed for management decision making, and about the value of additional insight relative to the cost of a more time-consuming and expensive process.
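The routine monitoring-and-evaluation analysis described for the community health worker case study, comparing indicator values against benchmarks and flagging breaks in trend, can be sketched in a few lines of code. The data, worker identifiers and thresholds below are invented purely for illustration; real analysis would draw on a health management information system.

```python
# Illustrative sketch of routine M&E analysis on hypothetical
# community health worker (CHW) activity data. All values invented.

# Monthly patient visits per CHW (hypothetical six-month series).
visits = {
    "CHW-01": [42, 45, 44, 47, 21, 19],
    "CHW-02": [38, 40, 41, 39, 42, 43],
}
BENCHMARK = 35  # hypothetical monthly target for patient visits


def below_benchmark(series, target):
    """Return the (zero-based) months in which activity fell below target."""
    return [i for i, v in enumerate(series) if v < target]


def trend_break(series, drop_ratio=0.5):
    """Flag months where activity fell below drop_ratio of the prior month,
    a crude indicator of a discontinuity worth investigating."""
    return [i for i in range(1, len(series))
            if series[i] < drop_ratio * series[i - 1]]


for chw, series in visits.items():
    print(chw, "below benchmark in months:", below_benchmark(series, BENCHMARK))
    print(chw, "possible discontinuity in months:", trend_break(series))
```

A sketch like this can only raise questions: it can show that CHW-01’s activity dropped sharply, but not why. Identifying the factors behind such a discontinuity is exactly where the process evaluation and social science methodologies described above come in.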

While it’s clear that much more evidence is needed to guide decision making in health systems, there is disagreement about the questions we should be asking and the types of research methodologies appropriate to answer those questions.

The types of evidence we seek are often constrained by what resources (including time and expertise) we have available to collect and analyse data. But what other factors (organisational, cultural, political) might shape what evidence we seek and the questions we ask?

It’s good to reflect, near the end of this course, on how the system itself may shape and constrain even what we study and learn from it.

This article is from the free online course Health Systems Strengthening, created by FutureLearn - Learning For Life.
