Introduction for Week 4
Welcome to the final week of the Advanced Machine Learning course. No video introduction this week, I am afraid - though it saves me looking at myself, which I am happy about!
This week we will start off by looking at feature engineering: how you can work with your raw features to generate features that are more powerful when used in statistical modeling algorithms. In particular, we will focus on principal component analysis (PCA).
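As a brief preview of PCA (the course material will cover it in depth, and its own examples may differ), here is a minimal sketch that computes principal components via an eigendecomposition of the covariance matrix using NumPy:

```python
import numpy as np

def pca(X, n_components):
    # Center the data so the covariance matrix captures variation around the mean
    Xc = X - X.mean(axis=0)
    # Eigendecomposition of the (symmetric) covariance matrix
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Order components by descending eigenvalue, i.e. by explained variance
    order = np.argsort(eigvals)[::-1]
    components = eigvecs[:, order[:n_components]]
    # Project the centered data onto the leading principal components
    return Xc @ components

# Illustrative use on random data
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)  # 100 samples reduced to 2 engineered features
```

The projected columns of `Z` are uncorrelated, and the first column captures the most variance; this is what makes PCA useful for generating compact derived features.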
After that, we will look at missing data and missing data imputation algorithms. These algorithms allow us to estimate the values of missing data so that we can produce a ‘completed’ data set that can be used by statistical modeling algorithms. We will see that the expectation maximization (EM) and Metropolis-within-Gibbs MCMC algorithms can be used for this.
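To give a flavor of EM-style imputation ahead of the lectures, the sketch below alternates between imputing missing values with their conditional expectation (E-step) and re-estimating the mean and covariance from the completed data (M-step) for bivariate Gaussian data. This is a simplified illustration; full EM would also add the conditional variance of the imputed entries when updating the covariance, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate correlated bivariate data, then hide 30% of the second column
n = 500
x0 = rng.normal(0.0, 1.0, n)
x1 = 2.0 + 0.8 * x0 + rng.normal(0.0, 0.5, n)
X = np.column_stack([x0, x1])
missing = rng.random(n) < 0.3
X_obs = X.copy()
X_obs[missing, 1] = np.nan

# Initialize missing entries with the observed-column mean
filled = X_obs.copy()
filled[missing, 1] = np.nanmean(X_obs[:, 1])

for _ in range(50):
    # M-step: re-estimate mean and covariance from the completed data
    mu = filled.mean(axis=0)
    cov = np.cov(filled, rowvar=False)
    # E-step: replace missing x1 with its conditional expectation given x0
    slope = cov[0, 1] / cov[0, 0]
    filled[missing, 1] = mu[1] + slope * (filled[missing, 0] - mu[0])
```

After a few iterations, `filled` is a completed data set whose imputed values track the true (hidden) values, which is exactly the sense in which imputation lets downstream modeling algorithms run on data with gaps.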
Finally, we will briefly look at some more complicated types of machine learning: reinforcement learning and semi-supervised learning. We will examine an example algorithm in each case, and discuss what such approaches can do.
So well done for getting this far, and we hope you enjoy this last week!
© Dr Michael Ashcroft