[0:03] So we're going to look at the kernel regression example exercise now. This is a nice exercise for understanding how kernels and kernel methods work, and kernel regression is also a useful method in its own right. So have a look at Kernel Regression Ex1.R. When we look at the code, you'll see there are a lot of notes in the introduction. These are basically saying: look, we've implemented a kernel regression class here for you to use. The reason we do that is that it allows the interested students amongst you to look at this code and see exactly how it works: how kernel regression objects are built, how they use the kernel, and how they are used in prediction.

[0:49] For those of you less interested, you can just use this as you would any third-party implementation. But you will need to remember to source the code so that you can use these functions and the kernel regression object we've built in any code that you write. So let's just source that first.
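As a rough sketch of this step (the file name follows the exercise file mentioned above; adjust the path to wherever you saved it):

    # Load the kernel regression implementation so that its functions
    # are available in this R session.
    source("Kernel Regression Ex1.R")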

[1:10] OK. Now we'll go to the kernel regression example function. Like before, we'll quickly build a data set to work with: just synthetic data, x and y.
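The video doesn't show the exact data-generating process, so here is a minimal sketch of what such a synthetic set might look like; the sine curve and noise level are assumptions, not the course's actual data:

    set.seed(1)                           # for reproducibility (assumption)
    n <- 200
    x <- runif(n, -3, 3)                  # hypothetical feature
    y <- sin(2 * x) + rnorm(n, sd = 0.3)  # hypothetical noisy nonlinear target
    dat <- data.frame(x = x, y = y)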

[1:32] We're going to split the data into training and test, a two-way split: 3/4 in training, 1/4 in test, randomly assigned. That's what we've just done. Now, to work with our kernel regression class, or really to work with any kernel method, we need to specify a kernel. What we specify here is a simple Laplacian kernel, using the L2 norm, which is just Euclidean distance, and sigma equal to 1. Remember to look in the article for the definitions of a variety of kernels. So let's create our kernel function. We also need to specify an L2 regularisation penalty, a lambda parameter; here's an arbitrary one. Now we can build the kernel regression model using our KENReg function and the training data.
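A sketch of the steps so far, before the model is built, assuming the data frame dat from above. The Laplacian kernel here is k(x, z) = exp(-||x - z|| / sigma) with the Euclidean norm, and the lambda value is an arbitrary placeholder like the one in the video:

    # Randomly assign 3/4 of the rows to training, 1/4 to test.
    train_idx <- sample(nrow(dat), size = floor(3 * nrow(dat) / 4))
    train <- dat[train_idx, ]
    test  <- dat[-train_idx, ]

    # Laplacian kernel with the L2 (Euclidean) norm and sigma = 1.
    sigma <- 1
    K1 <- function(x, z) exp(-sqrt(sum((x - z)^2)) / sigma)

    # Arbitrary L2 regularisation penalty.
    lambda <- 0.01                        # hypothetical value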

[2:21] As usual, we see explanations of the arguments these functions take at the start of the file. It's just a formula of the form target ~ feature, the data in which those variables are found, our kernel function, and the lambda L2 parameter. We run that, and we've created our kernel regression model. We'll also create an ordinary least squares model for comparison. There we go. Now we'll evaluate their performance on the test data. Again, nothing unusual here. We'll output these results to see how the models did. And we see that kernel regression does better than the ordinary least squares model. To actually see what's going on, let's plot the model regression curves.
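A sketch of the fitting and evaluation steps. KENReg is the constructor named in the video, but its exact argument names, and the behaviour of predict on its objects, are assumptions here:

    # Kernel regression model from the sourced implementation
    # (argument names assumed from the description in the video).
    kr1 <- KENReg(y ~ x, data = train, kernel = K1, lambda = lambda)

    # Ordinary least squares model for comparison.
    ols <- lm(y ~ x, data = train)

    # Compare test-set mean squared error; predict on a KENReg object
    # is assumed to behave like predict.lm.
    mse <- function(y, yhat) mean((y - yhat)^2)
    mse(test$y, predict(kr1, test))
    mse(test$y, predict(ols, test))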

[3:16] There we have the data set we're working with. The ordinary least squares regression curve is, of course, a straight line: the green line. And the kernel regression function is this very sharp blue regression curve. Now, just looking at that, we can see that it's overfitting. We know it does better than the ordinary least squares model, but nonetheless, just looking at it, we can see it's overfitting. So we'll try again with a wider kernel and a larger lambda.
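One way to draw these curves, assuming predict works on both model objects over a grid of x values:

    # Plot the data and overlay both regression curves.
    grid <- data.frame(x = seq(min(dat$x), max(dat$x), length.out = 200))
    plot(dat$x, dat$y, pch = 16, col = "grey", xlab = "x", ylab = "y")
    lines(grid$x, predict(ols, grid), col = "green", lwd = 2)  # straight OLS line
    lines(grid$x, predict(kr1, grid), col = "blue", lwd = 2)   # sharp kernel curve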

[3:51] And so we'll make a second kernel regression model with our new, wider kernel K2, which is to say we've increased the sigma value in the exponent denominator, as well as a larger lambda value. We'll make the second model, see how it performs on the test data, and plot it onto our graph.
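A sketch of the second model; the particular sigma and lambda values here are assumptions, and the point is only that both are larger than before:

    # Wider Laplacian kernel: larger sigma in the exponent denominator.
    sigma2 <- 3                           # hypothetical, wider than sigma = 1
    K2 <- function(x, z) exp(-sqrt(sum((x - z)^2)) / sigma2)
    lambda2 <- 0.1                        # hypothetical, larger than before

    kr2 <- KENReg(y ~ x, data = train, kernel = K2, lambda = lambda2)
    mse(test$y, predict(kr2, test))
    lines(grid$x, predict(kr2, grid), col = "red", lwd = 2)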

[4:19] And there we go with the red line. We see it's smoother. It might still be overfitting a little, but it's clearly much better, and it's also performing better than either of the original models. Of course, in reality we'd spend a lot more time tuning these hyperparameters to get a really lovely model. But that's pretty good for the small amount of time it's taken us here.

Kernel Regression Exercise 1

This is the exercise for kernel regression. The associated code is in the Kernel Regression Ex1.R file. Interested students are encouraged to replicate what we go through in the video themselves in R, but note that this is an optional activity intended for those who want practical experience in R and machine learning.

We build a kernel regression model with an assigned kernel width and L2 regularisation penalty, as well as an OLS model for comparison, on synthetic data. Based on a visual analysis of the result, we then produce a better kernel regression model with a larger kernel width and a larger L2 penalty. The aim is to get a sense of the effect these hyper-parameters have on a kernel regression model.

Note that there is an implementation of kernel regression in the associated code file. Interested students can see how the mathematics of kernel regression is implemented, as well as, more generally, how statistical modelling algorithms can be implemented in R. To run this code, as used in the video, you will need to source the kernel regression implementation function.
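For a flavour of what that implementation involves, here is a minimal standalone sketch of kernel (ridge) regression, not the course's actual code: fitting solves (K + lambda I) alpha = y for the dual weights, and prediction sums the weighted kernel evaluations against the training points:

    # Fit: K[i, j] = k(x_i, x_j); alpha = (K + lambda * I)^{-1} y.
    fit_kernel_reg <- function(X, y, kernel, lambda) {
      n <- length(y)
      K <- outer(seq_len(n), seq_len(n),
                 Vectorize(function(i, j) kernel(X[i], X[j])))
      alpha <- solve(K + lambda * diag(n), y)
      list(X = X, alpha = alpha, kernel = kernel)
    }

    # Predict: f(x) = sum_i alpha_i * k(x, x_i).
    predict_kernel_reg <- function(model, xnew) {
      sapply(xnew, function(x)
        sum(model$alpha * sapply(model$X, function(xi) model$kernel(x, xi))))
    }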


This video is from the free online course Advanced Machine Learning, by The Open University.