by Dr Andy Corbett
Updated 31 August 2023
The Kernel Trick: A first look at flexible machine learning
Course Description
Machine learning comes in many flavours. In the Using General Linear Models for Machine Learning course, we saw lots of examples of parametric machine learning models: those where we take a guess at the form of the function mapping inputs to outputs. In this course, we are going to step things up a gear by forming a toolkit of nonparametric models; here, we'll focus on Kernel Machines.
But first, let's answer a few questions:

What are nonparametric models? They simply give the machine learning algorithm the freedom to find the function that best fits the data. These models are naturally flexible and can be applied to different types of dataset.

What are kernel machines? They are derived from simple parametric models (in our case, linear models) and extended via the kernel trick to nonparametric approximators. Specifically, we take a look at the Kernel Ridge Regressor (KRR) and the Support Vector Machine (SVM), two powerful machine learning models.
As we build up our toolkit of advanced models, we shall naturally consider the following topics:
Feature extraction, nonparametric modelling, decision surface analysis and Lagrange multipliers in optimisation.
These are a few of the big concepts we'll unpack in this course. They lay the foundation for many pattern recognition techniques used by professionals in modern-day research; they also help form some of the most powerful modelling tools we have to date.
In this course, you will get to grips with:
 Nonparametric models, unlocking their advanced predictive power over parametric techniques.
 Implementing and tuning advanced machine learning techniques with detailed code walkthroughs.
 The theory underpinning the techniques used, so you have the expertise to interpret model predictions and understand their structure.
 How to quickly build a data science pipeline using the advanced models in the course.
This course follows neatly from where Tim left off in Using General Linear Models for Machine Learning. In Tim's course we covered the basic ideas around machine learning practice. In this course, we are using this foundation to start applying some advanced professional-standard tools in order to get the best results.
Section 1: Don't Be Fooled by the Kernel Trick.
We'll start unpacking the notion of 'nonparametric models' with an excellent prototype:
Projecting data features into a high-dimensional space where a linear model can solve the task.
The kernel trick we play means we never have to write down the projection, nor solve for the associated high-dimensional parameter vector. The derivation starts in our comfort zone, linear regression, and cleanly delivers a nonparametric model capable of cheaply understanding difficult nonlinear data.
In this section, we shall unpack this technique and inspect the various hyperparameters and kernel functions that need to be selected for the problem at hand.
Figure 1. Relaxing a linear least-squares regression with the kernel trick allows for more general choices of predictive function.
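To give a flavour of what's coming, here is a minimal sketch of kernel ridge regression fitting noisy nonlinear data. It uses scikit-learn's KernelRidge with an RBF kernel; the library choice, data, and hyperparameter values (alpha, gamma) are illustrative assumptions, not the course's own walkthrough code.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Noisy samples from a nonlinear target: y = sin(x) + noise.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(50, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

# The kernel trick in action: an RBF kernel lets a "linear" ridge
# model fit a nonlinear curve. alpha is the ridge penalty; gamma
# controls the kernel width. Both are hyperparameters to tune.
model = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5)
model.fit(X, y)

preds = model.predict(X)
print("mean absolute training error:", np.abs(preds - y).mean())
```

Note that we never construct the high-dimensional feature projection explicitly; the kernel evaluates inner products between samples directly, which is the whole point of the trick.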
Section 2: Support Vector Machines and They'll Support You.
Among the most powerful predictors, and the cutting edge of machine learning before deep learning came along, Support Vector Machines are a valuable collection of models to keep in your toolkit. They can be applied to regression and classification tasks alike.
Alongside the kernel trick, these become highly flexible, nonparametric machine learning algorithms that give unique insight into the most impactful data points in your training set; these are identified as support vectors.
We'll examine support vector machines for a range of tasks and visualise how support vectors may be identified within the model. Crucially, support vector machines are able to sparsify a dataset of many samples through selecting only those necessary to generate a predictive model.
Figure 2. A linear SVM vs. a nonparametric 'kernel' SVM for predicting decision boundaries.
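As a small taste of the sparsification idea, the sketch below fits a kernel SVM to two synthetic clusters and inspects which training points were retained as support vectors. The scikit-learn API and the toy data are illustrative assumptions; the course's own walkthroughs may differ.

```python
import numpy as np
from sklearn.svm import SVC

# Two Gaussian blobs in 2D: class 0 centred at (-1, -1), class 1 at (1, 1).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.5, size=(40, 2)),
               rng.normal(1, 0.5, size=(40, 2))])
y = np.array([0] * 40 + [1] * 40)

# An RBF-kernel SVM: C trades off margin width against training errors.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y)

# Only the points near the decision boundary become support vectors;
# the rest of the dataset plays no role in the fitted predictor.
print(f"{len(clf.support_vectors_)} support vectors out of {len(X)} samples")
```

The indices of the retained points are available via `clf.support_`, which is what we'll use later to visualise support vectors on the decision surface.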
So, if you're ready to take your machine learning to the next level, let's get started!