STA414 / STA2104 Winter 2022

Statistical Methods for Machine Learning II

The language of probability allows us to coherently and automatically account for uncertainty. This course will teach you how to build, fit, and perform inference in probabilistic models. These models let us generate novel images and text, find meaningful latent representations of data, take advantage of large unlabeled datasets, and even perform analogical reasoning automatically. The course covers the basic building blocks of these models and the computational tools needed to use them.

What you will learn:

Instructors:

Syllabus

Missed Assessment Form

Piazza

Teaching Assistants:

Location:

Online for now. The Zoom link will be sent via Quercus.

Reading

No required textbooks.

Tentative Schedule


Week 1 - Jan 10th & 11th - Course Overview and Graphical Model Notation

Coverage:

  1. Class Intro
  2. Topics covered
  3. Quick review of probabilistic models
  4. Graphical model notation: going from graphs to factorized joint probabilities and back (a small worked example follows this list)
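
As a small worked example of the last item (an illustration, not from the course materials): for the hypothetical three-variable DAG A -> B, A -> C, B -> C, the graph corresponds to the factorization

    p(A, B, C) = p(A) p(B | A) p(C | A, B)

and, conversely, reading the parents of each variable off the factors recovers the graph.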

Materials:


Week 2 - Jan 17th & 18th - Decision Theory and Parametrizing Probabilistic Models

Learning Outcomes:

  1. Basic decision theory
  2. Understand the basics of directed graphical models
  3. Become comfortable working with conditional probabilities

Coverage:

  1. Decision Theory
  2. Conditional Probability Tables
  3. Numbers of parameters in different tables (see the sketch after this list)
  4. Plate notation
  5. Examples of meaningful graphical models
  6. D-Separation, conditional independence in directed models
  7. Bayes Ball algorithm for determining conditional independence
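
A quick sketch of item 3 above (illustrative only; the function name is hypothetical): a conditional probability table over K states needs K - 1 free parameters per joint setting of its parents, since each row must sum to 1.

    from math import prod

    # Free parameters in a CPT: (K - 1) numbers per row, one row for every
    # joint configuration of the parent variables.
    def cpt_free_params(num_states, parent_states):
        return (num_states - 1) * prod(parent_states)

    # A binary variable with three binary parents: (2 - 1) * 2^3 = 8 parameters,
    # versus 2^4 - 1 = 15 for an unfactorized joint over all four variables.
    print(cpt_free_params(2, [2, 2, 2]))  # -> 8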

Materials:

Lecture recordings:

Lecture notes:

Tutorial:

Helpful materials:


Week 3 - Jan 24th & 25th - Latent Variables and Exact Inference

Learning Outcomes:

  1. How to write the joint factorization implied by UGMs
  2. How to reason about conditional independencies in UGMs
  3. How to do exact inference in joint distributions over discrete variables (see the sketch after this list)
  4. The time complexity of exact inference
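
A minimal sketch of items 3 and 4 (illustrative code with a hypothetical binary chain A -> B -> C, not the course's own implementation): computing a marginal by brute-force summation over a factorized discrete joint, which is exactly the exponential-time baseline whose complexity item 4 asks you to reason about.

    import itertools

    # Binary chain A -> B -> C, with CPTs stored as dictionaries.
    p_a = {0: 0.6, 1: 0.4}
    p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key: (b, a)
    p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # key: (c, b)

    # p(C = c) by summing the joint over all settings of A and B. Enumerating
    # n binary variables costs O(2^n): fine here, hopeless for large models.
    def marginal_c(c):
        return sum(p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]
                   for a, b in itertools.product([0, 1], repeat=2))

    print(marginal_c(0) + marginal_c(1))  # -> 1.0 (sanity check)

Variable elimination exploits the chain structure to compute the same marginal in time linear in the number of variables; that trade-off is the point of this week.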

Materials:

Lecture:

Tutorial:


Week 4 - Jan 31st & Feb 1st - Message Passing + Sampling

Learning Outcomes:

Materials:


Week 5 - Feb 7th & 8th - MCMC

  1. Metropolis-Hastings (a minimal sketch follows this list)
  2. Hamiltonian Monte Carlo
  3. Gibbs Sampling
  4. MCMC Diagnostics
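
A minimal Metropolis-Hastings sketch for item 1, assuming a standard normal target known only up to its normalizing constant; this is an illustration, not the course implementation.

    import math
    import random

    # Unnormalized log-density of the (hypothetical) target distribution.
    def log_target(x):
        return -0.5 * x * x

    # Random-walk Metropolis-Hastings: propose a local move, accept it with
    # probability min(1, target(proposal) / target(current)).
    def metropolis_hastings(n_samples, step=1.0, x=0.0):
        samples = []
        for _ in range(n_samples):
            proposal = x + random.gauss(0.0, step)  # symmetric proposal
            log_accept = log_target(proposal) - log_target(x)
            if random.random() < math.exp(min(0.0, log_accept)):
                x = proposal                        # accept; otherwise keep x
            samples.append(x)
        return samples

    draws = metropolis_hastings(10_000)
    print(sum(draws) / len(draws))  # roughly 0 for a well-mixed chain

Hamiltonian Monte Carlo and Gibbs sampling can both be viewed as Metropolis-Hastings with particular proposal distributions.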

Materials:

Suggested Reading:

Strongly suggested reading: A Conceptual Introduction to Hamiltonian Monte Carlo

Week 6 - Feb 14th & 15th - Variational Inference

Materials:

Learning Outcomes:

  1. Optimizing distributions
  2. Optimizing expectations with simple Monte Carlo
  3. The reparameterization trick (see the sketch after this list)
  4. The evidence lower bound
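
A small sketch of items 2 and 3 (illustrative only): estimating an expectation under q(z) = N(mu, sigma^2) with simple Monte Carlo, writing each sample as z = mu + sigma * eps with eps ~ N(0, 1), so the estimate is a differentiable function of (mu, sigma).

    import random

    # Simple Monte Carlo with the reparameterization trick: the noise eps is
    # independent of the parameters, so gradients can flow through the
    # deterministic map z = mu + sigma * eps.
    def expectation(f, mu, sigma, n=100_000):
        total = 0.0
        for _ in range(n):
            eps = random.gauss(0.0, 1.0)  # parameter-free noise
            z = mu + sigma * eps          # reparameterized sample from q
            total += f(z)
        return total / n

    # E[z^2] under N(1, 0.5^2) is mu^2 + sigma^2 = 1.25.
    print(expectation(lambda z: z * z, mu=1.0, sigma=0.5))

In variational inference the expectation being optimized is the evidence lower bound (item 4), with f built from the log joint and the log of q.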

Tutorial:


Week 7 - Feb 21st & 22nd - No classes - Reading week

Enjoy!


Week 8 - Feb 28th & March 1st - Midterm


Week 9 - March 7th & 8th - Language Models and Attention

Coverage:

Materials:

Readings:


Week 10 - March 14th & 15th - Amortized Inference and Variational Autoencoders

Coverage:

  1. Amortized inference and Variational Autoencoders (see the note after this list)
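
In brief (a standard formulation, not taken from the course slides): instead of running a separate optimization for each datapoint's posterior, an encoder network q_phi(z | x) maps each x directly to its approximate posterior, and encoder and decoder are trained jointly by maximizing the per-datapoint evidence lower bound

    log p_theta(x) >= E_{q_phi(z|x)}[ log p_theta(x | z) ] - KL( q_phi(z|x) || p(z) )

using the reparameterization trick from Week 6 to get low-variance gradients.
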

Week 11 - March 21st & 22nd - Kernel Methods and Gaussian Processes


Week 12 - March 28th & 29th - Neural Networks

Content:

  1. Neural Networks intro
  2. Building Blocks of NNs
  3. Common Architectures
  4. Attention and Transformers (see the sketch after this list)
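
A compact sketch of item 4 (illustrative, with hypothetical shapes; not the course's own code): scaled dot-product attention, the core operation inside a Transformer.

    import numpy as np

    # Scaled dot-product attention: each query scores all keys, the scores
    # are turned into weights with a row-wise softmax, and the weights mix
    # the corresponding values.
    def attention(Q, K, V):
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # (n_q, n_k)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
        return weights @ V                              # (n_q, d_v)

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(5, 8))   # 5 queries of dimension 8
    K = rng.normal(size=(7, 8))   # 7 keys of dimension 8
    V = rng.normal(size=(7, 4))   # 7 values of dimension 4
    print(attention(Q, K, V).shape)  # -> (5, 4)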

Materials:


Week 13 - April 4th & 5th - Final Exam Review, Contrastive Learning, Interpretability

Other resources: