Edge cases

The real world is really complicated and sometimes predicting what could happen is almost impossible. Dr Katrina Attwood considers the problem.
The decision-making process is based on a model built from data, and that data plays a crucial role in determining how smart the model will be. The real world is complicated, and sometimes predicting what could happen is almost impossible. For example, what happens if a person wearing a duck costume crosses the road in front of your car? As a human driver, you will have no trouble immediately recognising them as a pedestrian wearing a costume and stopping your car to let them cross. Will a self-driving car do the same?
The pedestrian-detection model inside the self-driving car has probably never seen a person in a duck costume, and so it may struggle to recognise them as a pedestrian. It could classify the person as a duck based on the similarity of features between the person in the costume and an actual duck: a beak, wings, thin legs, the shape of the head and so on. Such a misclassification can lead to bad decisions, and potentially accidents, because the model's assumptions about features like size no longer hold: a real duck is much smaller than a human. The person in a duck costume is a classic example of an edge case. Edge cases represent situations where the model does not work as expected.
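To see how feature similarity can mislead a classifier, here is a minimal sketch using a nearest-centroid rule over hand-crafted features. The features (beak, wings, relative height) and the centroid values are invented for illustration and are far simpler than a real pedestrian detector:

```python
import math

# Hypothetical hand-crafted features: (has_beak, has_wings, relative_height).
# Centroid values are invented for illustration, not from a real dataset.
CENTROIDS = {
    "human": (0.0, 0.0, 1.0),   # no beak, no wings, tall
    "duck":  (1.0, 1.0, 0.2),   # beak, wings, short
}

def classify(features):
    """Nearest-centroid rule: pick the class whose centroid is closest."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

pedestrian = (0.0, 0.0, 1.0)    # ordinary pedestrian
duck_costume = (1.0, 1.0, 1.0)  # human-height, but beak and wings present

print(classify(pedestrian))    # → human
print(classify(duck_costume))  # → duck: the costume features dominate
```

Even though the costumed figure is human-height, two of its three features match the duck centroid, so the classifier picks the wrong class, mirroring the scenario described above.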
The model can be wrong when an edge case does not present the unique features of its true class but instead presents features belonging to other classes. This is exactly what is happening with the person in the duck costume: the figure is clearly human, but the suit replaces the key features of a human with the key features of a duck, confusing the model. Consider another example: a model classifies a set of images as sloths, but a closer look reveals that some of them are pastries. Can you see why the model made the mistake? A human can clearly recognise the differences among the images, but a model may find it more difficult because the pastries present typical features of a sloth, like the eyes and the face shape. What can we do to make the model robust to edge cases?
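One simple heuristic for spotting potential edge cases at run time is to flag inputs where the model's confidence is low. A sketch using softmax probabilities follows; the logit values and the 0.8 threshold are invented for illustration:

```python
import math

def softmax(logits):
    """Convert raw model scores (logits) into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def flag_edge_case(logits, threshold=0.8):
    """Flag an input as a potential edge case if the top probability is low."""
    return max(softmax(logits)) < threshold

# A confident prediction vs an ambiguous one (logits are made up)
print(flag_edge_case([4.0, 0.5, 0.2]))  # → False: one class clearly wins
print(flag_edge_case([1.1, 1.0, 0.9]))  # → True: sloth-vs-pastry ambiguity
```

Flagged inputs could then be routed to a human reviewer or logged for later analysis, although a well-known caveat is that models can also be confidently wrong, so low confidence catches only some edge cases.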
The easy answer is to predict the edge cases and make sure they are included in the data used to train the model. The opinions of experts and different stakeholders can help identify some edge cases, but unfortunately, in an uncontrolled world, it is nearly impossible to predict every possible scenario. For a model detecting pedestrians, it is possible to think of someone walking with a bike, holding hands with another person or pushing a baby carrier, but it is impossible to think of all the different costumes someone might wear during Halloween or carnival. In research, edge-case detection is still considered an open challenge. Do you want to be the researcher who solves this problem?
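The retraining step can be sketched with a toy nearest-centroid model: once an edge case is identified, it is added to the training data with its correct label and the model is retrained. The features and data points are invented for illustration:

```python
import math

def train(labelled):
    """Nearest-centroid training: average the feature vectors per class."""
    by_class = {}
    for features, label in labelled:
        by_class.setdefault(label, []).append(features)
    return {label: tuple(sum(col) / len(col) for col in zip(*samples))
            for label, samples in by_class.items()}

def classify(centroids, features):
    """Assign the class whose centroid is nearest to the input."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Invented features: (has_beak, has_wings, relative_height)
data = [((0.0, 0.0, 1.0), "human"), ((1.0, 1.0, 0.2), "duck")]
costume = (1.0, 1.0, 1.0)  # human-height, but beak and wings present

model = train(data)
print(classify(model, costume))  # → duck: the edge case fools the model

# Fold the identified edge case, correctly labelled, into the training data
data.append((costume, "human"))
model = train(data)
print(classify(model, costume))  # → human: the retrained model covers it
```

This only fixes the edge cases we know about, which is precisely the limitation described above: every costume we fail to anticipate stays outside the training data.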


This article is from the free online course Intelligent Systems: An Introduction to Deep Learning and Autonomous Systems, created by FutureLearn.
