MCS 548 - Mathematical Theory of Artificial Intelligence
University of Illinois - Chicago
Fall 2020


This course will introduce some of the central topics in computational learning theory, a field that approaches the question of whether machines can learn from the perspective of theoretical computer science. We will study well-defined mathematical models of learning in which it is possible to give precise, rigorous analyses of learning problems and algorithms. A major focus of the course will be the computational efficiency of learning in these models. We will develop some provably efficient algorithms and explain why such algorithms are unlikely to exist in other models.

Example topics include inductive inference, query learning, PAC learning and VC theory, Occam's razor, online learning, boosting, support vector machines, bandit algorithms, statistical queries, Rademacher complexity, and neural networks.

This course is represented on the computer science prelim.

Basic Information

Syllabus: pdf
Time and Location: M-W-F 1:00pm - 1:50pm, online
Instructor Contact Information: Lev Reyzin, SEO 418, (312)-413-3745
Required Textbook: Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar. Foundations of Machine Learning, second edition (available free online)
Optional Textbook: Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning: From Theory to Algorithms (available free online)
Office Hours: T 9:00-9:50am, F 2:00-2:50pm, online

Projects and Presentations

Problem Sets

problem set 1 due 9/25/20

Lectures and Readings

Note: lectures will have material not covered in the readings.

Lecture 1 (8/24/20)
covered material: intro to the course, preview of learning models
reading: section 7 of Computing Machinery and Intelligence by Turing (1950)

Lecture 2 (8/26/20)
covered material: introduction to PAC learning
reading: A Theory of the Learnable by Valiant (1984)
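note: for reference, one standard statement of the PAC criterion (a summary in roughly the notation of Mohri et al.; details may differ from lecture): a concept class C is PAC-learnable if there is an algorithm A and a polynomial p such that for every distribution D, every target c in C, and every \epsilon, \delta \in (0,1), running A on a sample of size m \ge p(1/\epsilon, 1/\delta, n, \mathrm{size}(c)) returns a hypothesis h_S satisfying
\[ \Pr_{S \sim D^m}\big[ R(h_S) \le \epsilon \big] \ge 1 - \delta . \]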

Lecture 3 (8/28/20)
covered material: PAC learning of axis-aligned rectangles
reading: 2.1 of Mohri et al.
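note: for reference, the sample size that suffices in this example (stated here from memory; cf. the rectangle example in section 2.1 of Mohri et al.): the tightest rectangle consistent with the sample satisfies \Pr[R(h_S) > \epsilon] \le 4 e^{-m\epsilon/4}, so
\[ m \ge \frac{4}{\epsilon} \ln \frac{4}{\delta} \]
examples suffice to guarantee error at most \epsilon with probability at least 1 - \delta.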

Lecture 4 (8/31/20)
covered material: PAC guarantee for the finite realizable case
reading: 2.2 of Mohri et al.
optional reading: Occam's Razor by Blumer et al. (1986)
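note: for reference, the usual form of this guarantee (my paraphrase of the consistent-case bound for finite hypothesis sets): if H is finite and h_S \in H is consistent with the sample, then with probability at least 1 - \delta,
\[ R(h_S) \le \frac{1}{m}\left( \ln |H| + \ln \frac{1}{\delta} \right), \]
so m \ge \frac{1}{\epsilon}\left( \ln |H| + \ln \frac{1}{\delta} \right) examples suffice.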

Lecture 5 (9/2/20)
covered material: PAC learning without a perfect predictor
reading: 2.3 of Mohri et al.
optional reading: begin chapters 2 and 3 of Shalev-Shwartz and Ben-David for a different perspective on PAC learning
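note: for reference, the corresponding bound when no hypothesis is consistent with the sample (my paraphrase of the inconsistent-case bound for finite hypothesis sets): with probability at least 1 - \delta, for all h \in H,
\[ R(h) \le \widehat{R}_S(h) + \sqrt{ \frac{\ln |H| + \ln (2/\delta)}{2m} } . \]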

Lecture 6 (9/4/20)
covered material: agnostic PAC learning bounds
reading: 2.4 and 2.5 of Mohri et al.
optional reading: continue chapters 2 and 3 of Shalev-Shwartz and Ben-David for a different perspective on PAC learning
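note: as a reasoning step connecting the uniform bound above to an agnostic guarantee (my own summary): if h_S minimizes the empirical risk over H, then applying the two-sided bound both to h_S and to the best h^* \in H gives, with probability at least 1 - \delta,
\[ R(h_S) \le \min_{h \in H} R(h) + 2\sqrt{ \frac{\ln |H| + \ln (2/\delta)}{2m} } . \]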

Lecture 7 (9/9/20)
covered material: McDiarmid's inequality and its relationship to Hoeffding's bound
reading: D.1, D.7 of Mohri et al.
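note: for reference, McDiarmid's inequality (stated here from memory): if X_1, ..., X_m are independent and f satisfies the bounded-differences condition |f(x_1,...,x_i,...,x_m) - f(x_1,...,x_i',...,x_m)| \le c_i for each i, then
\[ \Pr\big[ f(X_1,\dots,X_m) - \mathbb{E}[f] \ge \epsilon \big] \le \exp\!\left( \frac{-2\epsilon^2}{\sum_{i=1}^m c_i^2} \right). \]
Hoeffding's bound is the special case where f is a sample mean of variables X_i \in [a_i, b_i], so that c_i = (b_i - a_i)/m.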

Lecture 8 (9/11/20)
covered material: Rademacher generalization bounds
reading: 3.1 of Mohri et al.
optional reading: Rademacher Penalties and Structural Risk Minimization by Koltchinskii (2001)
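note: for reference, the key definition and bound (my paraphrase; notation may differ from lecture): the empirical Rademacher complexity of a family G of functions from Z to [0,1] on a sample S = (z_1, ..., z_m) is
\[ \widehat{\mathfrak{R}}_S(G) = \mathbb{E}_{\sigma}\left[ \sup_{g \in G} \frac{1}{m} \sum_{i=1}^m \sigma_i g(z_i) \right], \]
and with probability at least 1 - \delta, every g \in G satisfies
\[ \mathbb{E}[g(z)] \le \frac{1}{m} \sum_{i=1}^m g(z_i) + 2\,\mathfrak{R}_m(G) + \sqrt{ \frac{\ln(1/\delta)}{2m} } . \]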

Lecture 9 (9/14/20)
covered material: the growth function, the maximal inequality, Massart's lemma
reading: 3.2, D.10 of Mohri et al.
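note: for reference, Massart's lemma (stated here from memory): for a finite set A \subset \mathbb{R}^m with r = \max_{x \in A} \|x\|_2,
\[ \mathbb{E}_{\sigma}\left[ \frac{1}{m} \sup_{x \in A} \sum_{i=1}^m \sigma_i x_i \right] \le \frac{r \sqrt{2 \ln |A|}}{m} . \]
Applied to the set of behaviors of H on the sample, this bounds the Rademacher complexity in terms of the growth function.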

Lecture 10 (9/16/20)
covered material: VC-dimension, Sauer's lemma, VC generalization bounds
reading: 3.3, 3.4 of Mohri et al.
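note: for reference, Sauer's lemma (stated here from memory): if \mathrm{VCdim}(H) = d, then the growth function satisfies
\[ \Pi_H(m) \le \sum_{i=0}^{d} \binom{m}{i} \le \left( \frac{em}{d} \right)^d \quad \text{for } m \ge d, \]
which turns the growth-function generalization bound into a bound of order \sqrt{ d \ln(m/d) / m }.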

Lecture 11 (9/18/20)
covered material: intro to weak learning, boosting the confidence
reading: 7.1 of Mohri et al.
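note: for reference, the weak learning definition used here (my paraphrase): an algorithm is a \gamma-weak learner for a concept class C if for every target c \in C, every distribution D, and every \delta > 0, with probability at least 1 - \delta it outputs a hypothesis h with
\[ \Pr_{x \sim D}[h(x) \ne c(x)] \le \frac{1}{2} - \gamma . \]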

Lecture 12 (9/21/20)
covered material: the boosting framework, AdaBoost
reading: 7.2 of Mohri et al.
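note: as a companion to the AdaBoost framework covered in lecture, here is a minimal Python sketch (not course code; numpy-based, and the weak_learner interface is a hypothetical placeholder):

import numpy as np

def adaboost(X, y, weak_learner, T=50):
    """Minimal AdaBoost sketch. X: (m, d) array of examples; y: labels in {-1, +1}.
    weak_learner(X, y, w) is assumed to return a callable h with h(X) in {-1, +1},
    trained to have small weighted error under the distribution w."""
    m = X.shape[0]
    w = np.full(m, 1.0 / m)               # D_1: uniform over the sample
    hypotheses, alphas = [], []
    for _ in range(T):
        h = weak_learner(X, y, w)
        pred = h(X)
        eps = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)   # weighted error of h
        alpha = 0.5 * np.log((1 - eps) / eps)
        w = w * np.exp(-alpha * y * pred)  # upweight misclassified points
        w /= w.sum()                       # renormalize to a distribution
        hypotheses.append(h)
        alphas.append(alpha)
    # final hypothesis: sign of the weighted vote over the weak hypotheses
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, hypotheses)))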