
Vaccines

This week we are considering how public health dealt with infectious disease. In this step, Gareth Millward discusses the introduction of safe, effective vaccines, as well as lingering public resistance and implementation problems. How successful were these vaccines, and how did the public respond?

The basic premise behind vaccination is an old one. Societies in India, China and the Middle East used scabs and pus from smallpox victims to protect themselves from the disease. This process – variolation – was introduced to the West in the early eighteenth century, and proved popular among the British aristocracy.

Image 1: Lady Mary Wortley Montagu, author, smallpox victim and wife to the British ambassador to the Ottoman Empire, brought variolation back to the British court in the 1700s. Public Domain.

In the late 1700s, doctors saw the benefits of using cowpox instead of smallpox. This new procedure – vaccination – was more useful as a public health measure, since it carried less risk of cross-infection and could confer immunity even during smallpox outbreaks. The British government made it compulsory for children in 1853. This provoked significant resistance from groups such as the Anti-Compulsory Vaccination League, who claimed citizens should have the freedom to choose (amongst other moral, religious and scientific objections). The introduction of an opt-out “conscientious objector” clause in 1898 effectively ended compulsion.1,2

Image 2: While Edward Jenner did not “invent” vaccination, he was very successful in trialling and publicising his procedure. Memorial to Edward Jenner in Gloucester Cathedral, by Andrew Rabbott, CC-ASA.

Modern vaccination based on bacteriology and virology did not begin until the late nineteenth century. Advances in microscopy and the development of germ theory led to a series of experiments with materials that could induce immunity in humans and animals. Louis Pasteur’s work with anthrax and rabies led to new prophylactic treatments, which he called “vaccinations” in tribute to Edward Jenner.1

Britain was fairly slow to adopt these modern vaccines. BCG (an anti-tuberculosis vaccine) was used in France and Scandinavia and diphtheria immunisation was used in cities such as New York and Toronto in the 1920s. Although local authorities had the power to vaccinate if they so wished, they often did not have the resources to do so. It was not until the Second World War that the Ministry of Health started a national policy of immunisation against diphtheria. Fearing resistance, it made the procedure voluntary, but embarked on a massive advertising campaign. By the end of the war, rates of diphtheria had dropped significantly, boosting the profile of vaccination among the British medical community, government departments and the general public.

The childhood vaccination programme expanded in the new NHS era. Whooping cough (pertussis) and poliomyelitis vaccines were introduced for children in the 1950s, as was BCG for school-leavers. By the 1970s there were also routine vaccinations against measles and tetanus, though routine smallpox vaccination ended in 1971 and routine BCG in 2005. The current vaccination schedule also includes immunisations against Hib, meningitis A, B, C, W and Y, mumps, rubella and pneumonia.

Image 3: Health education played a role in encouraging people to vaccinate. This poster announced the extension of polio vaccine to young adults in 1960. Wellcome Library, CC BY 4.0

Throughout the post-war period, the British public broadly supported vaccination. Each new vaccine coincided with a visible drop in disease rates, adding to their reputation. Where parents may have been “apathetic”, health authorities invested in health education and in making vaccination more convenient. Multi-dose vaccines became more common, allowing easier administration and fewer uncomfortable visits to the clinic for parents and children. General Practitioners also took on a bigger role, meaning vaccination and routine check-ups could be combined.

However, some groups remained sceptical or outright hostile. The Anti-Vaccination League and the British Union for the Abolition of Vivisection opposed BCG and diphtheria immunisations in the 1940s and 1950s, arguing that they relied on animal cruelty, false science and vested commercial interests. The 1955 Cutter Incident in the United States – in which children were accidentally infected with polio after a manufacturing error – meant that British parents were slow to adopt polio vaccination. Reports of brain damage amongst recipients of the whooping cough vaccine led to a scare in the 1970s; vaccination rates dropped, leading to an epidemic in 1978/79.

More recently, studies claiming a link between the measles-mumps-rubella vaccine (MMR) and autism caused a significant fall in vaccination rates in the early 2000s. Social media and the internet have allowed anti-vaccination groups to work across national boundaries.2

Image 4: Although many public health tools helped to reduce infectious disease over the twentieth century, vaccine scares showed that immunisation was still a major contributor. Gareth Millward, CC BY 4.0.

However, neither the whooping cough crisis nor the MMR crisis was just about science. The whooping cough crisis happened soon after the thalidomide compensation court case and during difficult discussions about the role of the welfare state. MMR was the latest of many medical scandals (including the contaminated blood, Alder Hey, Bristol heart and BSE crises). In both cases, public faith in vaccination was soon restored.

With pertussis, a welfare fund was created to provide compensation in the rare instances of vaccine damage, while with MMR much greater attention was paid to how and why parents become “vaccine hesitant”. This work has shown that parents are not usually staunchly “pro” or “anti” vaccination, but may have multiple reasons for not presenting their children for vaccination, including inconvenience, scepticism or lack of information.

This article is from the free online course A History of Public Health in Post-War Britain (London School of Hygiene & Tropical Medicine).