History
- Frank Rosenblatt, ~ 1957: Perceptron
- Widrow and Hoff, ~ 1960: Adaline/Madaline
- Fukushima, 1980: Neocognitron (layers modeled on simple and complex cells)
- Rumelhart et al., 1986: Backpropagation
- LeCun et al., 1998: Convolutional network training (LeNet)
- Hinton and Salakhutdinov, 2006: deep neural networks (early results via unsupervised pre-training)
- Alex Krizhevsky et al., 2012: AlexNet, the modern incarnation of convolutional neural networks
- Nowadays, convolutional networks are used widely
- e.g., image classification, object detection, face recognition, ...
Convolutional Neural Networks
Fully Connected Layer
- Stretch all the pixels of the input image out
- The stretching yields a single input vector
- Multiply the vector by a weight matrix
- Apply the activation function
- In the example, this results in 10 outputs (a minimal sketch follows this list)
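A minimal NumPy sketch of this fully connected computation. The 10 outputs come from the notes; the 32x32x3 input size, the variable names, and the ReLU activation are illustrative assumptions.

```python
import numpy as np

# Illustrative input: a 32x32 RGB image (assumed size).
image = np.random.rand(32, 32, 3)

# Stretch all pixels out into a single 3072-dimensional vector.
x = image.reshape(-1)                    # shape: (3072,)

# Weight matrix and bias for a fully connected layer with 10 outputs.
W = np.random.randn(10, x.size) * 0.01   # shape: (10, 3072)
b = np.zeros(10)

# Multiply by the weight matrix, add the bias, then apply an activation (ReLU here).
scores = W @ x + b                       # shape: (10,)
out = np.maximum(0, scores)

print(out.shape)                         # (10,)
```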
Convolution Layer
- Do not stretch the input; preserve its spatial structure
- The weights are filters
- Each filter slides over the image spatially
- A dot product is computed at every spatial location
- Every computation results in a single number, to which a bias value is added
- The result is an activation map whose size depends on the filter size (and on how it slides)
- The filter always goes through the full depth of the input
- How the filter slides over the image:
- A dot product is taken at every position
- The filter can be centered on top of every pixel of the input
- Exactly how it slides (e.g., the stride) is a hyperparameter choice (see the sketch below)
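A minimal NumPy sketch of the convolution just described, assuming a 32x32x3 input, a single 5x5x3 filter, and stride 1 (all concrete sizes are illustrative assumptions): the filter slides over every spatial location, takes a dot product through the full depth, adds a bias, and fills in one activation map.

```python
import numpy as np

# Illustrative sizes (assumptions): 32x32x3 input, one 5x5x3 filter, stride 1.
H, W, D = 32, 32, 3
F, stride = 5, 1

image = np.random.rand(H, W, D)
filt = np.random.randn(F, F, D) * 0.01   # the filter spans the full depth D
bias = 0.1

# Activation map size: (N - F) / stride + 1  ->  (32 - 5) / 1 + 1 = 28
out_h = (H - F) // stride + 1
out_w = (W - F) // stride + 1
activation_map = np.zeros((out_h, out_w))

# Slide the filter over the image spatially; at every position,
# compute the dot product over the full depth and add the bias.
for i in range(out_h):
    for j in range(out_w):
        patch = image[i * stride:i * stride + F, j * stride:j * stride + F, :]
        activation_map[i, j] = np.sum(patch * filt) + bias

print(activation_map.shape)   # (28, 28) -- one activation map per filter
```

With the same 32x32 input and 5x5 filter, the output side length is (32 - 5) / stride + 1: stride 1 gives 28 and stride 3 gives 10, while stride 2 would give 14.5 and does not fit, which is why the sliding pattern is treated as a hyperparameter choice.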