Statistical Learning Theory (2022), Graduate School of Informatics, Kyoto University

Lecturers: Hisashi Kashima, Makoto Yamada, and Koh Takeuchi
Day, time, and room: Monday, 8:45-10:15 / Research bldg. 8, Lecture room 2


This course broadly covers the fundamental theory and practical applications of statistical machine learning, which is now a core building block of statistical data analysis and data mining. It focuses on supervised and unsupervised learning problems, including theoretical foundations such as a survey of probably approximately correct (PAC) learning, Bayesian perspectives, and other learning-theoretic frameworks. Several probabilistic models and prediction algorithms, such as logistic regression, the perceptron, and the support vector machine, will be introduced. Advanced topics such as online learning, transfer learning, and sparse modeling will also be covered.
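As a concrete taste of one of the prediction algorithms listed above, the following is a minimal sketch of the perceptron learning rule on a toy two-class dataset. It is illustrative only and not part of the official course materials; the dataset and variable names are assumptions made for the example.

import numpy as np

def perceptron_train(X, y, epochs=100):
    """Train a perceptron with labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            # Update only on misclassified examples.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += yi * xi
                b += yi
                updated = True
        if not updated:  # all points correctly classified
            break
    return w, b

# Toy linearly separable data (hypothetical).
X = np.array([[2.0, 2.0], [1.5, 1.8], [0.2, 0.1], [-0.5, 0.3]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print("weights:", w, "bias:", b)
print("predictions:", np.sign(X @ w + b))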

[Course Materials]
1. Introduction to machine learning
2. Regression
3. Classification
4. Kernel methods
5. Model evaluation and selection
6. Statistical learning theory
The materials for the latter half of this course can be found on Prof. Yamada's website and Prof. Takeuchi's website.

[Homework]
1. Complete the quizzes assigned in class


[References]
- Lecture slides from previous years: 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021