## Introduction to Machine Learning

**Introduction to Supervised, Unsupervised and Partially-Supervised Training Algorithms** by Dale Schuurmans - Machine Learning Summer School at Purdue, 2011.
This course will provide a simple unified introduction to batch training algorithms for supervised, unsupervised and partially-supervised learning.
The concepts introduced will provide a basis for the more advanced topics in other lectures.

The first part of the course will cover supervised training algorithms, establishing a general foundation through a series of extensions to linear prediction,
including: nonlinear input transformations (features), L2 regularization (kernels), prediction uncertainty (Gaussian processes), L1 regularization (sparsity),
nonlinear output transformations (matching losses), surrogate losses (classification), multivariate prediction, and structured prediction. Relevant optimization
concepts will be introduced along the way.

The second part of the course will then demonstrate how unsupervised and semi-supervised formulations follow from a relationship between forward and reverse
prediction problems. This connection allows dimensionality reduction and sparse coding to be unified with regression, and clustering and vector quantization to be
unified with classification - even in the context of the other extensions. Current convex relaxations of such training problems will be discussed.

The last part of the course covers partially-supervised learning - the problem of learning an input representation concurrently with a predictor. A brief overview
of current research will be presented, including recent work on boosting and convex relaxations.
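Of the supervised extensions mentioned above, L2-regularized linear prediction (ridge regression) has a simple closed-form solution that illustrates the batch-training setting the course starts from. The sketch below is illustrative only; the synthetic data and function names are not taken from the lecture:

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: w = (X'X + lam*I)^{-1} X'y."""
    d = X.shape[1]
    # Solve the regularized normal equations rather than forming an inverse.
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Hypothetical synthetic data: a known linear model plus small noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

# With mild regularization the estimate stays close to the true weights.
w_hat = ridge_fit(X, y, lam=0.1)
```

The L2 penalty `lam` shrinks the weights toward zero and keeps the linear system well-conditioned; the same objective, rewritten in its dual form, is what connects this regularizer to kernels.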
