Posterior models for different kernels

In this article, we illustrate the effect of constraining a Gaussian Process model, for different choices of covariance function.
© University of Basel
In this short article, we will visually compare the prior and posterior for different Gaussian Process models. Specifically, we will look at how the variance changes when known observations are incorporated. This will improve our understanding of the characteristics of the models.

Models of smooth deformations

We start by discussing a Gaussian Process model of smooth deformations \(GP(0, k_\sigma)\), where \(k_\sigma\) is the Gaussian kernel
$$k_\sigma(x,x') = \left(\begin{array}{cc} \exp\left(-\frac{\|x - x'\|^2}{\sigma^2}\right) & 0 \\ 0 & \exp\left(-\frac{\|x - x'\|^2}{\sigma^2}\right) \end{array}\right).$$
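As a concrete illustration, here is a minimal NumPy sketch of this matrix-valued kernel; the function name `gaussian_kernel` and the example points and \(\sigma\) values are our own choices for illustration, not part of the course material.

```python
import numpy as np

def gaussian_kernel(x, x_prime, sigma):
    """Matrix-valued Gaussian kernel k_sigma(x, x'):
    the same scalar Gaussian kernel on each output dimension,
    with zero cross-covariance between the two components."""
    s = np.exp(-np.sum((x - x_prime) ** 2) / sigma ** 2)
    return np.diag([s, s])  # 2x2 diagonal matrix

# Example: two points on the domain (arbitrary coordinates)
x, x_prime = np.array([0.0, 0.0]), np.array([10.0, 5.0])
print(gaussian_kernel(x, x_prime, sigma=20.0))  # strong correlation (large sigma)
print(gaussian_kernel(x, x_prime, sigma=2.0))   # almost no correlation (small sigma)
```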
We define models for three different values of \(\sigma\). In the first model, we choose \(\sigma\) large compared to the size of the domain, in the second a medium value, and in the third a small value. The top row in Figure 1 shows a confidence region (in grey) of the corresponding Gaussian Process models. As expected, the prior confidence region is the same for all three models, irrespective of the choice of \(\sigma\), as \(\sigma\) does not affect the size of the deformations but only their correlations. The bottom row in Figure 1 shows the posterior distributions given two observations (of the deformation) at the tip of the thumb and the little finger.
We notice, in all the examples, that the variance at the points where the deformation is observed is small. We also see that the larger we choose \(\sigma\), the more strongly the variance at nearby locations is affected by the observations; if \(\sigma\) is small, the effect is only very local. This is because the larger the value of \(\sigma\), the larger the area over which the deformations are correlated with, and hence influenced by, the observations.
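To make this effect tangible, the following is a rough one-dimensional sketch of Gaussian Process regression with a scalar Gaussian kernel: we condition a zero-mean prior on two observed locations and look at how the posterior variance at a distant point depends on \(\sigma\). The grid, observation locations and noise level are illustrative assumptions, not the actual hand data.

```python
import numpy as np

def k(x, y, sigma):
    # Scalar Gaussian kernel exp(-||x - y||^2 / sigma^2), evaluated pairwise
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / sigma ** 2)

xs = np.linspace(0.0, 100.0, 201)   # evaluation grid (the "domain")
x_obs = np.array([10.0, 90.0])      # two observed locations (e.g. thumb and little finger tips)
noise = 1e-4                        # small observation noise for numerical stability

for sigma in (80.0, 30.0, 5.0):     # large, medium, small compared to the domain
    K_oo = k(x_obs, x_obs, sigma) + noise * np.eye(2)
    K_so = k(xs, x_obs, sigma)
    # Posterior covariance of GP regression: K_ss - K_so K_oo^{-1} K_os
    post_cov = k(xs, xs, sigma) - K_so @ np.linalg.solve(K_oo, K_so.T)
    post_var = np.diag(post_cov)
    print(f"sigma={sigma:5.1f}: variance at x=10 is {post_var[np.argmin(abs(xs - 10))]:.3f}, "
          f"at x=50 it is {post_var[np.argmin(abs(xs - 50))]:.3f}")
```

Running this, the variance at an observed location is close to zero for every \(\sigma\), while the variance at the midpoint of the domain is only reduced noticeably when \(\sigma\) is large, mirroring the behaviour shown in Figure 1.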
Figure 1: Confidence region of a Gaussian kernel with \(\sigma\) large (left), medium (middle) and small (right). The upper row shows the prior and the bottom row the posterior after the points have been constrained using Gaussian Process regression.

Statistical shape models learned from data

We now explore the same for statistical shape models \(GP(\mu_s, k_s)\), where the mean and covariance are estimated from data.
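As a rough sketch of what "estimated from data" means here: if the example deformation fields are discretised at a fixed set of points and stacked into vectors, the mean and covariance can be taken as the sample mean and sample covariance. The array shapes and the random placeholder data below are assumptions for illustration only.

```python
import numpy as np

# Hypothetical training data: n_examples deformation fields, each discretised
# at n_points points in 2D and flattened into a vector of length 2 * n_points.
rng = np.random.default_rng(0)
n_examples, n_points = 15, 50
U = rng.normal(size=(n_examples, 2 * n_points))    # placeholder for real example deformations

mu_s = U.mean(axis=0)                               # sample mean (discretised mean function)
U_centered = U - mu_s
K_s = U_centered.T @ U_centered / (n_examples - 1)  # sample covariance (discretised kernel k_s)

# K_s has rank at most n_examples - 1, which is one way to see why the plain
# statistical model behaves as a global, low-rank model.
print(K_s.shape, np.linalg.matrix_rank(K_s))
```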
The top row in Figure 2 again shows the prior for three different models: the statistical model \(GP(\mu_s, k_s)\), an additive model \(GP(\mu_s, k_s + k_\sigma)\), whose flexibility is enlarged by modelling additional smooth deformations, and a localised model \(GP(\mu_s, k_s \odot k_\sigma)\) (see Step 4.6 for details of these models). The confidence regions look almost the same for all prior models; only the additive model shows a slightly increased variance, which is expected since in this case two covariance functions are added. The posterior models are more interesting. We see that in the standard statistical shape model, the variance is reduced everywhere. This is why statistical models are often said to be global models: the knowledge about a single point influences the deformations of the full shape.
The situation is not much different for the additive model, except that there is slightly more variability left everywhere. The local model, however, shows considerably more variance in regions that are further away from the observed deformations, as the multiplication with the Gaussian kernel breaks the global correlations, and hence the information about the known deformation does not influence the deformations in these regions.

Figure 2: Confidence region of a statistical shape model (left), a statistical shape model with an additive Gaussian model (middle) and a localised statistical model (right). The upper row shows the prior and the bottom row the posterior after the point has been constrained using Gaussian Process regression.
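The following is a minimal sketch of how the additive and localised kernels could be assembled on a discretised domain, assuming a sample covariance matrix for \(k_s\) and a Gaussian kernel matrix for \(k_\sigma\) evaluated at the same points; the pointwise (Schur) product plays the role of \(\odot\). The data and parameters are placeholders, not the course's hand model.

```python
import numpy as np

rng = np.random.default_rng(1)
pts = np.linspace(0.0, 100.0, 60)     # discretisation of the domain (1D for brevity)

# k_s: sample covariance of some example deformations at these points (placeholder data)
U = rng.normal(size=(15, pts.size))
U_c = U - U.mean(axis=0)
K_s = U_c.T @ U_c / (U.shape[0] - 1)

# k_sigma: Gaussian kernel evaluated pairwise on the same points
sigma = 20.0
K_sigma = np.exp(-(pts[:, None] - pts[None, :]) ** 2 / sigma ** 2)

K_additive  = K_s + K_sigma   # enlarged model GP(mu_s, k_s + k_sigma)
K_localised = K_s * K_sigma   # localised model GP(mu_s, k_s ⊙ k_sigma), pointwise product

# The pointwise (Schur) product of two positive semi-definite matrices is again
# positive semi-definite, so K_localised is a valid covariance matrix.
print(np.min(np.linalg.eigvalsh(K_additive)) >= -1e-9,
      np.min(np.linalg.eigvalsh(K_localised)) >= -1e-9)
```

The pointwise multiplication damps the long-range entries of the sample covariance, which is exactly why, in the localised model, an observation at one point leaves the variance in distant regions largely untouched.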
© University of Basel
This article is from the free online course Statistical Shape Modelling: Computing the Human Anatomy.
