Statistical Learning Theory (2025), Graduate School of Informatics, Kyoto University
Lecturers: Hisashi Kashima, Makoto Yamada, Koh Takeuchi, and Kyohei Atarashi.
Day, time, and room: Monday, 8:45-10:15 / Research bldg. 8, Lecture room 2
This course will provide an in-depth exploration of the foundational theory and practical applications of statistical machine learning, which plays a significant role in statistical data analysis and data mining. We will address both supervised and unsupervised learning, with an emphasis on the former. The course will cover essential theoretical concepts such as maximum likelihood estimation and Bayesian inference, as well as introduce the concept of Probably Approximately Correct (PAC) learning.
Throughout the course, you will gain familiarity with various probabilistic models and predictive algorithms, including logistic regression, perceptrons, and neural networks. We will also touch upon advanced topics such as semi-supervised learning, transfer learning, and sparse modeling, providing insights into the latest developments in the field of machine learning. Opportunities for hands-on data analysis exercises will be provided as well.
[Course Materials]
1. Introduction to Machine Learning (Kashima)
2. Regression (Kashima)
3. Classification (Kashima)
[References]
- The corresponding PandA page is here.
- Lecture slides for the previous years:
2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024