Kyoto University - Institute for Liberal Arts and Sciences
Fundamentals of Artificial Intelligence
Instructor: Fabien Cromieres (fabien AT nlp.ist.i.kyoto-u.ac.jp / fabien.cromieres AT gmail.com)
This class will focus on the Machine Learning aspect of AI (including "Deep Learning" and Neural Networks)
The (challenging) goal is to have a lecture that is both accessible to
first-year university students and detailed enough to explain the
important points.
The class will make use of Python and Jupyter. Therefore, you should install the Anaconda Python Distribution: www.anaconda.com
Although there is no official textbook for this class, the book by
Goodfellow, Bengio and Courville (https://www.deeplearningbook.org) is
a good reference and can be consulted freely. Obviously, the book is
more advanced and covers many more topics than we will.
Week 1: Introduction
Overview of AI applications and methods
Week 2: Introduction to Python
Quick, minimal introduction to Python, Jupyter, numpy and matplotlib.
Material:
Jupyter Notebook
(or html version)
Week 3-4: Minimizing functions with Gradient Descent
We review the concepts of functions, functions of several variables,
and derivatives. We then study the Gradient Descent algorithm, which
will allow us to minimize (differentiable) functions.
Material:
Slides: v2018 v2019
Jupyter Notebook I (or html version)
Jupyter Notebook II (or html version)
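The core idea of the algorithm fits in a few lines. As an illustration (not the course's own code), a minimal numpy sketch of gradient descent on a one-variable function:

```python
import numpy as np

def gradient_descent(grad_f, x0, learning_rate=0.1, n_steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - learning_rate * grad_f(x)  # move downhill
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # close to 3, the true minimum
```

The same loop works unchanged when `x` is a vector of several variables, which is exactly why gradient descent is so central to the rest of the course.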
Week 5: Learning simple functions (Regression)
Given a set of example data, we see how we can learn a function that will help us predict new examples.
Material:
Slides: v2018 v2019
Jupyter Notebook (or html version)
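As a taste of what "learning a function from examples" means, here is an illustrative numpy sketch (not the course notebook) fitting a straight line y = a·x + b to data by least squares:

```python
import numpy as np

# Toy data generated from y = 2x + 1 (no noise, so the fit should be exact).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

# Build the design matrix [x, 1] and solve the least-squares problem.
A = np.stack([x, np.ones_like(x)], axis=1)
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
print(slope, intercept)  # approximately 2.0 and 1.0
```

Once `slope` and `intercept` are learned, predicting a new example is just evaluating the fitted function.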
Week 6: Learning more complex Functions
Same as the previous week, but considering functions of more than one variable.
Material:
Slides: v2018 v2019
Jupyter Notebook (or html version)
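The one-variable recipe carries over directly: with several input variables, each example becomes a row of a matrix. An illustrative sketch (again assuming noise-free toy data, not the course notebook):

```python
import numpy as np

# Toy data from y = 1*x1 + 2*x2 + 3, a function of two variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))          # 50 examples, 2 input variables each
y = X @ np.array([1.0, 2.0]) + 3.0

# Append a column of ones so the intercept is learned like any other weight.
A = np.hstack([X, np.ones((50, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w)  # approximately [1.0, 2.0, 3.0]
```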
Week 7: Classification
Now we see how we can learn a function that predicts a class, given some example data.
Material:
Slides: v2018 v2019
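To make the idea concrete, here is an illustrative numpy sketch (not the course's code) of logistic regression, trained by the gradient descent of weeks 3-4, on a one-dimensional two-class toy problem:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy problem: class 1 if x > 0, class 0 otherwise.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
t = (x > 0).astype(float)              # target classes

w, b = 0.0, 0.0
for _ in range(2000):                  # gradient descent on the cross-entropy loss
    p = sigmoid(w * x + b)             # predicted probability of class 1
    grad_w = np.mean((p - t) * x)      # gradient of the loss w.r.t. w
    grad_b = np.mean(p - t)            # gradient w.r.t. b
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

pred = (sigmoid(w * x + b) > 0.5).astype(float)
print(pred)  # matches the targets t
```

The only real change from regression is that the learned function now outputs a probability, thresholded to choose a class.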
Week 8: Introduction to Neural Networks
First look at neural networks
Material:
Slides 2018 Slides 2019
Week 9: Neural Networks Architecture and Backpropagation
How we combine neurons. The backpropagation algorithm for computing gradients.
Material:
Slides 2018 Slides 2019
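Backpropagation is just the chain rule applied layer by layer. An illustrative numpy sketch (not the lecture's notation) for a tiny two-layer network, with the hand-derived gradient checked against a numerical one:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)           # input vector
W1 = rng.normal(size=(4, 3))     # first-layer weights
W2 = rng.normal(size=(1, 4))     # second-layer weights

def loss(W1, W2, x):
    h = np.tanh(W1 @ x)          # forward pass: hidden layer
    y = W2 @ h                   # output layer
    return 0.5 * float(y @ y)    # squared loss

# Backward pass: apply the chain rule from the output back to W1.
h = np.tanh(W1 @ x)
y = W2 @ h
dy = y                                # dL/dy
dW2 = np.outer(dy, h)                 # dL/dW2
dh = W2.T @ dy                        # propagate back through W2
dW1 = np.outer(dh * (1 - h**2), x)    # through tanh, then through W1

# Sanity check: compare one entry with a numerical gradient.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
numeric = (loss(W1p, W2, x) - loss(W1, W2, x)) / eps
print(dW1[0, 0], numeric)  # nearly equal
```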
Week 10: Fully Connected Layers
Looking at one of the most important types of Neural Network
architectures: Feed-Forward networks with Fully-Connected Layers, and
how they relate to Matrix Multiplication.
We also look at how to implement a Fully-Connected Feed-Forward Neural Network in Chainer (see Notebook)
Material:
Slides: v2018 v2019
Jupyter Notebook (or html version)
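The class notebook uses Chainer, but the connection to matrix multiplication is easiest to see without any library. An illustrative numpy sketch:

```python
import numpy as np

def fully_connected(x, W, b):
    """A fully-connected layer is just a matrix product plus a bias."""
    return x @ W + b

rng = np.random.default_rng(0)
batch = rng.normal(size=(8, 5))   # 8 examples, 5 features each
W = rng.normal(size=(5, 3))       # maps 5 inputs to 3 outputs
b = np.zeros(3)

out = fully_connected(batch, W, b)
print(out.shape)  # (8, 3): each example mapped to 3 outputs at once
```

Because the whole batch goes through a single matrix product, this is exactly the operation that GPUs accelerate so well.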
Week 11: Computer Vision I
First look at computer vision: what an image is for a computer, and what Convolution Layers are.
Material:
Slides 2018 Slides 2019
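The sliding-window idea behind a convolution layer can be sketched in plain numpy (illustrative only; like most deep learning libraries, this computes a cross-correlation, without flipping the kernel):

```python
import numpy as np

def convolve2d(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image and
    take a weighted sum at each position."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge detector on a toy image: dark left half, bright right half.
image = np.zeros((5, 5))
image[:, 3:] = 1.0
kernel = np.array([[-1.0, 1.0]])
print(convolve2d(image, kernel))  # responds only where the edge is (column 2)
```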
Week 12: Computer Vision II
How to build an Image Recognition Neural Network with Convolution Layers, Max-Pooling Layers and Fully-Connected Layers
Material:
Slides 2018 Slides 2019
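Of the three layer types above, Max-Pooling is the simplest; an illustrative numpy sketch of 2x2 pooling:

```python
import numpy as np

def max_pool(x, size=2):
    """2x2 max-pooling: keep the largest value in each block,
    halving the spatial resolution."""
    h, w = x.shape
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

x = np.array([[1.0, 2.0, 0.0, 1.0],
              [3.0, 4.0, 1.0, 0.0],
              [0.0, 1.0, 5.0, 6.0],
              [2.0, 1.0, 7.0, 8.0]])
print(max_pool(x))  # [[4., 1.], [2., 8.]]
```

Shrinking the feature maps this way makes the network cheaper and somewhat tolerant to small shifts of the input.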
Week 13: Computer Vision III: Training a real Image Classifier
Implementing and training a real Image Recognition Neural Network in Chainer.
Material:
Jupyter Notebook (or html version)
Week 14: Natural Language Processing
A quick look at recurrent architectures and how they are used to process text.
Material:
Slides 2019
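The key point of a recurrent architecture is a hidden state carried from word to word. An illustrative numpy sketch (a simple Elman-style step, not the exact model from the slides):

```python
import numpy as np

def rnn_step(h, x, W_h, W_x, b):
    """One step of a simple recurrent network: the new hidden state
    mixes the previous state with the current input."""
    return np.tanh(W_h @ h + W_x @ x + b)

rng = np.random.default_rng(0)
hidden, inputs = 4, 3
W_h = rng.normal(size=(hidden, hidden)) * 0.1
W_x = rng.normal(size=(hidden, inputs)) * 0.1
b = np.zeros(hidden)

# Process a "sentence" of 5 word vectors one at a time,
# carrying the hidden state along.
h = np.zeros(hidden)
for x in rng.normal(size=(5, inputs)):
    h = rnn_step(h, x, W_h, W_x, b)
print(h.shape)  # (4,): a fixed-size summary of the whole sequence
```

The same loop handles sequences of any length, which is what makes these architectures natural for text.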