2017 Winter Project Week/DeepLearningMethodology
From NAMIC Wiki
Revision as of 06:14, 10 January 2017 by Mohsen.ghafoorian
This is a 3-hour introductory course on Deep Learning Methodology for Project Week #24.
Instructor: Mohsen Ghafoorian
Basic concepts: (60-75 min)
- loss functions (categorical cross-entropy, MSE)
- stochastic gradient descent
- update rules (problems with plain SGD, Momentum, Nesterov, Adadelta, RMSProp, Adam)
- learning rate
- activation functions
  - why non-linearities?
  - sigmoid (vanishing gradient problem, non-zero-centered outputs), tanh
  - ReLU (dead ReLU issue), leaky ReLU, PReLU
- weight initialization
- regularization
  - augmentation
  - L1/L2
  - dropout
  - batch norm
- network babysitting (bad learning rate, bad initialization, overfitting)
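Two of the bullet points above, the categorical cross-entropy loss and the momentum update rule, can be sketched in a few lines of plain Python. The function names and toy numbers are illustrative choices, not from the course materials:

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def categorical_cross_entropy(probs, target_index):
    # Loss for one sample: -log of the probability assigned to the true class.
    return -math.log(probs[target_index])

def sgd_momentum_step(params, grads, velocity, lr=0.1, momentum=0.9):
    # Classical momentum: v <- mu*v - lr*g ; p <- p + v
    # (Nesterov momentum evaluates the gradient at the look-ahead point instead.)
    for i in range(len(params)):
        velocity[i] = momentum * velocity[i] - lr * grads[i]
        params[i] += velocity[i]
    return params, velocity
```

With a zero initial velocity the first step reduces to plain SGD; on later steps the accumulated velocity damps oscillations across steep loss directions, which is the motivation for the momentum family of update rules discussed in the list above.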
State-of-the-art CNN methods: (60 min)
- AlexNet
- VGGNet
- GoogLeNet
- ResNet
- Highway networks
- DenseNets
- GANs
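What distinguishes ResNets and highway networks from plain stacked architectures like AlexNet and VGGNet is the skip connection. A minimal sketch, with list arithmetic standing in for tensor operations and illustrative function names:

```python
def residual_block(x, H):
    # ResNet identity shortcut: y = H(x) + x.
    # The shortcut lets gradients flow directly through the addition,
    # easing the training of very deep networks.
    return [h + xi for h, xi in zip(H(x), x)]

def highway_layer(x, H, T):
    # Highway layer: y = T(x)*H(x) + (1 - T(x))*x, where T is a learned
    # gate in [0, 1] that interpolates between transform and carry.
    h, t = H(x), T(x)
    return [ti * hi + (1.0 - ti) * xi for ti, hi, xi in zip(t, h, x)]
```

A residual block is the special case where the gate is fixed open and the carry path is the identity; when a highway gate closes (T near 0), the layer passes its input through unchanged.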
Biomedical segmentation:
- sliding window
- fully convolutional nets
- U-Net
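The contrast between the first two items above can be sketched in one dimension: a sliding-window pipeline crops a patch at every position and scores each crop independently, while a fully convolutional model scores all positions in a single pass. For a linear per-patch scorer the two are exactly equivalent, which is why fully convolutional nets (and U-Net) replaced patch-wise classification for segmentation. Function names are illustrative, not from the course:

```python
def conv1d_valid(signal, kernel):
    # One 'valid' convolution pass produces a dense prediction map:
    # one score per position, computed over the whole input at once.
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def sliding_window_predict(signal, window, predictor):
    # Naive sliding window: crop every window and score each crop
    # separately, redundantly recomputing overlapping regions.
    return [predictor(signal[i:i + window])
            for i in range(len(signal) - window + 1)]
```

For a dot-product predictor with the same weights as the kernel, both functions return the same dense score map; the fully convolutional version simply shares the overlapping computation.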