Summary for Week Four
That’s it! You’ve made it through the entire Advanced Machine Learning course! We will have a little course conclusion article after the last weekly test, but in this step we will review important aspects of week four.
This week we have looked at topics from a wide range of areas. You should make sure you have learnt:
1. Feature Engineering
You should understand the rationale for feature engineering and have a basic knowledge of the four main approaches to it: expert/domain knowledge, individual/pairwise statistical tests, feature transformations, and subset selection. Regarding feature transformations, you should understand the idea behind them and be able to tie this into your understanding of how neural networks work. You should also know how to perform scaling and centering, and PCA.
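As a quick refresher, here is a minimal sketch of scaling and centering followed by PCA via an eigendecomposition of the covariance matrix. The small dataset is made up for illustration; a library routine such as scikit-learn's PCA would normally be used instead.

```python
import numpy as np

# Hypothetical small dataset: 5 samples, 3 features on very different scales
X = np.array([[1.0, 200.0, 3.0],
              [2.0, 180.0, 2.5],
              [3.0, 240.0, 4.0],
              [4.0, 210.0, 3.5],
              [5.0, 190.0, 2.0]])

# Centering and scaling: zero mean, unit variance per feature
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA: eigendecomposition of the covariance matrix of the standardised data
cov = np.cov(X_std, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]      # sort components by explained variance
components = eigvecs[:, order]

# Project onto the top 2 principal components
X_pca = X_std @ components[:, :2]
print(X_pca.shape)  # (5, 2)
```

Scaling first matters here: without it, the large-valued second feature would dominate the covariance matrix and hence the principal components.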
2. Missing Data
You should know the three types of missing data (missing completely at random, missing at random, and missing not at random), and which techniques (if any) are appropriate for each type. You should understand how to use the EM and Metropolis-within-Gibbs MCMC algorithms for missing value imputation.
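The EM idea can be sketched in a few lines for the simplest case: a bivariate Gaussian with values missing in one column. This is a toy illustration with made-up data, alternating conditional-mean imputation (E-step) with refitting the mean and covariance (M-step); the course's full treatment is more general.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bivariate Gaussian data; ~20% of column 1 is missing at random
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=200)
mask = rng.random(200) < 0.2
X_obs = X.copy()
X_obs[mask, 1] = np.nan

# Initialise the missing entries with the observed column mean
X_imp = X_obs.copy()
X_imp[mask, 1] = np.nanmean(X_obs[:, 1])

for _ in range(20):
    mu = X_imp.mean(axis=0)               # M-step: refit mean and covariance
    cov = np.cov(X_imp, rowvar=False)
    # E-step: replace missing x1 by its conditional mean given x0
    slope = cov[0, 1] / cov[0, 0]
    X_imp[mask, 1] = mu[1] + slope * (X_imp[mask, 0] - mu[0])

print(np.isnan(X_imp).sum())  # 0 - all missing values imputed
```

A Metropolis-within-Gibbs approach would instead *sample* the missing values from their conditional distribution rather than plugging in the conditional mean, which preserves the variability that mean imputation loses.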
3. Reinforcement Learning
You should understand what reinforcement learning is, and its most important characteristics. You should also understand, and be able to apply, the multi-armed bandit optimization algorithm to real-world problems.
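To jog your memory, here is a minimal epsilon-greedy strategy for a Bernoulli multi-armed bandit, one common way to trade off exploration against exploitation. The arm reward probabilities and the epsilon value are made up for the example.

```python
import random

def epsilon_greedy_bandit(arm_means, n_steps=5000, epsilon=0.1, seed=0):
    """Epsilon-greedy play of a Bernoulli bandit (a sketch, not the course's exact algorithm)."""
    rng = random.Random(seed)
    counts = [0] * len(arm_means)          # pulls per arm
    values = [0.0] * len(arm_means)        # running mean reward per arm
    for _ in range(n_steps):
        if rng.random() < epsilon:         # explore: pick a random arm
            arm = rng.randrange(len(arm_means))
        else:                              # exploit: pick the best arm so far
            arm = max(range(len(arm_means)), key=lambda a: values[a])
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return counts, values

counts, values = epsilon_greedy_bandit([0.2, 0.5, 0.8])
```

After enough steps the arm with the highest true payoff probability (the third, at 0.8) attracts the vast majority of the pulls, which is exactly the exploitation behaviour we want.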
4. Semi-supervised Learning
You should understand what semi-supervised learning is, and what the benefits of using it are when we have many unlabelled training cases and only a few labelled ones. You should understand what the manifold and clustering assumptions are, and why they motivate attempting to discover an approximation to the data manifold. You should understand what the Laplacian Regularized Least Squares algorithm does, and what the generalized representer theorem guarantees us.
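As a reminder of how Laplacian Regularized Least Squares works in practice, here is a sketch in its kernelised closed form on a hypothetical toy problem: two well-separated clusters with one labelled point each. The kernel width, graph construction, and penalty weights are all assumptions chosen for the illustration.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),   # cluster A
               rng.normal(+2.0, 0.5, (20, 2))])  # cluster B
n, l = 40, 2                                     # total points, labelled points
Y = np.zeros(n)
Y[0], Y[20] = -1.0, +1.0                         # one label per cluster
J = np.diag((Y != 0).astype(float))              # selects the labelled rows

K = rbf_kernel(X, X)                             # Gram matrix
W = K                                            # reuse kernel weights as graph adjacency
L = np.diag(W.sum(axis=1)) - W                   # unnormalised graph Laplacian

gamma_A, gamma_I = 1e-3, 1e-2                    # ambient / intrinsic penalties (assumed)
# Closed-form expansion coefficients guaranteed by the representer theorem
alpha = np.linalg.solve(J @ K + gamma_A * l * np.eye(n)
                        + gamma_I * (l / n**2) * (L @ K), Y)
f = K @ alpha                                    # learnt function on all 40 points
pred = np.sign(f)                                # classify labelled and unlabelled points
```

The Laplacian term penalises functions that vary within densely connected regions of the graph, so the two labels propagate through their respective clusters, which is the clustering assumption at work.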
Well done on finishing the final week of the course - we’ll congratulate you more in the course conclusion step!
© Dr Michael Ashcroft