The goal of this course is to provide an accelerated preparation for the more advanced courses in the Michaelmas term. It covers several foundational topics in machine learning and, through these, introduces the essential mathematical background and optimisation tools.
Aims:
Provide a thorough introduction to the topic of statistical inference, including maximum-likelihood and Bayesian approaches. Introduce important tools from probability and statistics.
Introduce algorithms for regression, classification, clustering and sequence modelling. Through these, apply mathematical tools including linear algebra, eigenvectors and eigenvalues, multidimensional calculus, and the calculus of variations.
Introduce basic concepts in optimisation and dynamic programming, including gradient ascent and belief propagation.
Objectives:
Understand the use of maximum-likelihood and Bayesian inference, and the strengths and weaknesses of each approach.
Implement methods to solve simple regression, classification, clustering and sequence modelling problems.
Implement simple optimisation methods (gradient and coordinate descent, EM) and dynamic programming algorithms (Kalman filtering or Viterbi decoding); an illustrative gradient-descent sketch follows below.
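As a rough illustration of the kind of implementation the objectives point to, the following is a minimal gradient-descent sketch for least-squares linear regression. The toy data, learning rate and iteration count are assumptions chosen for the example and are not part of the course material.

import numpy as np

def gradient_descent(X, y, lr=0.1, n_iters=200):
    """Minimise the mean squared error (1/N) * ||X w - y||^2 by gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the MSE loss
        w -= lr * grad                        # step in the direction of steepest descent
    return w

if __name__ == "__main__":
    # Hypothetical toy data: 100 points in 2 dimensions with known weights and small noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    true_w = np.array([1.5, -0.5])
    y = X @ true_w + 0.1 * rng.normal(size=100)
    print(gradient_descent(X, y))  # should approach [1.5, -0.5]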