Prerequisites
- linear models
- feedforward neural networks
- basic linear algebra (matrices & vectors)
In this workshop we will cover the basics of Convolutional Neural Networks (ConvNets). We will explore how the convolution operation applies to image processing, see how it can be integrated into the neural network framework, and discuss why this type of network outperforms earlier machine learning approaches on image processing tasks. Finally, we will look at some practical aspects of implementing ConvNets, including transfer learning.
Theory
- why standard (feedforward) neural networks are not well-suited for image processing
- what the building blocks of convnets are:
- convolutional layers
- pooling
- fully-connected layers
- what makes convnets better at image processing:
- local connectivity
- parameter sharing (tied weights)
- translational equivariance
- overview of a typical convnet architecture
- practical considerations:
- data augmentation
- transfer learning
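The mechanics behind local connectivity, parameter sharing, and translational equivariance can be sketched with a plain NumPy convolution (technically cross-correlation, as used in convolutional layers). The array sizes below are arbitrary illustrations:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid cross-correlation: the SAME kernel is applied at every
    spatial position (parameter sharing), and each output value depends
    only on a small patch of the input (local connectivity)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Translational equivariance: shifting the input shifts the output.
rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
kernel = rng.standard_normal((3, 3))
shifted = np.roll(img, 1, axis=0)  # shift rows down by one

out = conv2d(img, kernel)
out_shifted = conv2d(shifted, kernel)
# Away from the wrapped border row, the shifted output equals the
# original output shifted by the same amount.
assert np.allclose(out_shifted[1:], out[:-1])
```

A real convolutional layer adds multiple input/output channels and a bias term, but the sliding-window computation is the same.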
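The two practical techniques above can be sketched in Keras (assuming TensorFlow's Keras API; the backbone choice, input size, and class count are illustrative assumptions, not part of the workshop spec):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Data augmentation: random label-preserving transforms applied on the fly
# during training, which effectively enlarges the training set.
augment = keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
])

# Transfer learning: reuse a pretrained backbone as a frozen feature
# extractor and train only a new classification head. weights=None here
# just avoids a download; in practice you would use weights="imagenet".
base = keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None)
base.trainable = False  # freeze the backbone's weights

model = keras.Sequential([
    keras.Input(shape=(96, 96, 3)),
    augment,
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),  # new head for 10 classes
])
```

After training the head, one can optionally unfreeze the top of the backbone and fine-tune it with a low learning rate.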
Codelab
- convolutional layers
- pooling
- dropout
- convnet vs multilayer perceptron in Keras on Fashion-MNIST
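The convnet-vs-MLP comparison can be previewed just by counting parameters on Fashion-MNIST-sized inputs (28x28 grayscale); the layer sizes below are illustrative, not the codelab's exact models:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A multilayer perceptron: the first dense layer alone needs
# 784 * 256 weights because every pixel connects to every unit.
mlp = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# A small convnet: stacked conv + pooling blocks shrink the spatial
# size while growing the channel count, then a dense head classifies.
cnn = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Flatten(),
    layers.Dropout(0.5),  # dropout regularizes the dense head
    layers.Dense(10, activation="softmax"),
])

# Parameter sharing keeps the convnet far smaller than the MLP
# despite its greater depth.
print(mlp.count_params(), cnn.count_params())
```

Despite having fewer parameters, the convnet typically reaches noticeably higher test accuracy on Fashion-MNIST, which is the point of the codelab comparison.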
Resources
- CS231n course notes from Stanford
- Goodfellow et al.'s Deep Learning book
- Chris Olah's article on feature visualization
- Keras tutorial on transfer learning and data augmentation
- Series on Convolutional Neural Nets by Chris Olah
- Image convolution visualization - more focused on traditional image processing, but gives a good intuition on the mechanics of convolutions
- Backpropagation through convolutional layers