2017 Winter Project Week/DeepLearningMethodology


This is a 3-hour introductory course on Deep Learning Methodology for Project Week #24.

Instructor: Mohsen Ghafoorian


Basic concepts: (60-75 min)

  • loss functions
  • stochastic gradient descent (see the training-loop sketch after this list)
  • update rules
  • learning rate
  • activation functions
  • why non-linearities?
  • sigmoid (vanishing gradients, not zero-centered), tanh
  • ReLU (dying ReLU problem), leaky ReLU, PReLU
  • weight initialization
  • regularization
  • data augmentation
  • L1
  • L2
  • dropout
  • batch normalization
  • network babysitting (monitoring training)
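
The following minimal sketch, assuming PyTorch, ties several of these concepts together in a single training loop: a cross-entropy loss, SGD with momentum as the update rule, a fixed learning rate, ReLU non-linearities, dropout, batch normalization, and L2 regularization via weight decay. The layer sizes, hyperparameters, and random data are illustrative only and are not part of the course materials.

# Minimal sketch (illustrative, not course material): layer sizes, learning
# rate, and data are arbitrary; PyTorch's default weight initialization is used.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(32, 64),
    nn.BatchNorm1d(64),      # batch normalization
    nn.ReLU(),               # non-linearity
    nn.Dropout(p=0.5),       # dropout regularization
    nn.Linear(64, 10),
)

loss_fn = nn.CrossEntropyLoss()                  # loss function
optimizer = torch.optim.SGD(model.parameters(),
                            lr=0.01,             # learning rate
                            momentum=0.9,        # momentum update rule
                            weight_decay=1e-4)   # L2 regularization

x = torch.randn(16, 32)                  # a random mini-batch of 16 samples
y = torch.randint(0, 10, (16,))          # random class labels

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                      # backpropagation of the loss
    optimizer.step()                     # stochastic gradient descent update
    if step % 20 == 0:
        print(step, loss.item())         # "babysitting": watch the loss fall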

State-of-the-art CNN methods: (60 min)

  • AlexNet
  • VGG net
  • GoogLeNet
  • ResNet (see the residual-block sketch after this list)
  • highway nets
  • DenseNets
  • GANs
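
As a concrete illustration of the identity skip connections that distinguish ResNets (and, in gated form, highway nets) from plain feed-forward architectures such as AlexNet and VGG, here is a minimal residual-block sketch, assuming PyTorch; the channel count and layer choices are illustrative only.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A basic residual block: output = ReLU(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)        # identity skip connection

block = ResidualBlock(channels=16)
x = torch.randn(1, 16, 32, 32)           # a dummy 32x32 feature map
print(block(x).shape)                     # torch.Size([1, 16, 32, 32])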

Biomedical segmentation

  • sliding-window classification
  • fully convolutional nets
  • U-Net (see the sketch after this list)
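
In contrast to sliding-window classification, which predicts one label per extracted patch, fully convolutional nets and U-Net produce a dense, per-pixel segmentation in a single forward pass. Below is a minimal U-Net-style sketch, assuming PyTorch, reduced to one encoder level, one decoder level, and one concatenating skip connection; channel counts and the number of classes are illustrative only.

import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """One encoder level, one decoder level, one concatenating skip connection."""
    def __init__(self, in_channels=1, num_classes=2):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.bottleneck = nn.Sequential(
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2)
        self.dec = nn.Sequential(
            nn.Conv2d(32, 16, kernel_size=3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, num_classes, kernel_size=1)  # per-pixel logits

    def forward(self, x):
        e = self.enc(x)                       # full-resolution features
        b = self.bottleneck(self.down(e))     # coarser, deeper features
        u = self.up(b)                        # upsample back to input resolution
        u = torch.cat([u, e], dim=1)          # skip connection (concatenation)
        return self.head(self.dec(u))         # dense, per-pixel prediction

net = TinyUNet()
image = torch.randn(1, 1, 64, 64)             # a dummy single-channel image
print(net(image).shape)                        # torch.Size([1, 2, 64, 64])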