What Will I Learn?
- Apply momentum to backpropagation to train neural networks
- Apply adaptive learning rate procedures like AdaGrad, RMSprop, and Adam to backpropagation to train neural networks (the update rules are sketched after this list)
- Understand the basic building blocks of Theano
- Build a neural network in Theano
- Understand the basic building blocks of TensorFlow
- Build a neural network in TensorFlow
- Build a neural network that performs well on the MNIST dataset
- Understand the difference between full gradient descent, batch gradient descent, and stochastic gradient descent
- Understand and implement dropout regularization in Theano and TensorFlow
- Understand and implement batch normalization in Theano and TensorFlow
- Write a neural network using Keras (a minimal example is sketched after this list)
- Write a neural network using PyTorch
- Write a neural network using CNTK
- Write a neural network using MXNet
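
To give a concrete taste of the momentum and adaptive learning rate topics above, here is a minimal NumPy sketch of the momentum, RMSprop, and Adam update rules. The toy loss, learning rate, and decay values are illustrative assumptions, not material taken from the course itself.

```python
import numpy as np

# Toy gradient for illustration only; in practice this would come from
# backpropagation through the network (assumption: loss L(w) = ||w||^2).
def grad(w):
    return 2 * w

lr = 0.01          # learning rate (illustrative value)
eps = 1e-8         # small constant to avoid division by zero

# --- Momentum ---
# Keep a running "velocity" and move the weights along it.
w = np.random.randn(5)
v = np.zeros_like(w)
mu = 0.9           # momentum coefficient
for _ in range(100):
    v = mu * v - lr * grad(w)
    w = w + v

# --- RMSprop ---
# Scale each weight's step by a moving average of its squared gradient.
w = np.random.randn(5)
cache = np.ones_like(w)
decay = 0.999
for _ in range(100):
    g = grad(w)
    cache = decay * cache + (1 - decay) * g * g
    w = w - lr * g / (np.sqrt(cache) + eps)

# --- Adam ---
# Momentum on the gradient (first moment) plus RMSprop-style scaling
# (second moment), with bias correction for the early steps.
w = np.random.randn(5)
m = np.zeros_like(w)
s = np.zeros_like(w)
beta1, beta2 = 0.9, 0.999
for t in range(1, 101):
    g = grad(w)
    m = beta1 * m + (1 - beta1) * g
    s = beta2 * s + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(s_hat) + eps)
```

In the course these updates are applied to the weight matrices of a full neural network during backpropagation rather than to a toy vector.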
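Several of the other bullets (MNIST, stochastic batch training, dropout, batch normalization, Keras) come together in a short script like the one below. This is only a rough sketch using tf.keras; the layer sizes, dropout rate, number of epochs, and batch size are assumptions chosen for illustration.

```python
import tensorflow as tf

# Load MNIST, flatten to 784-dimensional vectors, scale pixels to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# A small fully connected network with batch normalization and dropout.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(300, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Adam optimizer with mini-batch ("stochastic batch") training:
# batch_size=32 means each gradient step uses 32 examples, in contrast to
# full gradient descent (the whole training set per step) and pure
# stochastic gradient descent (one example per step).
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=32,
          validation_data=(x_test, y_test))
```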
Requirements
- Be comfortable with Python, NumPy, and Matplotlib. Install Theano and TensorFlow.
- If you do not yet know about gradient descent, backprop, and softmax, take my earlier course, Deep Learning in Python, and then return to this course.
Who is the target audience?
- Students and professionals who want to deepen their machine learning knowledge
- Data scientists who want to learn more about deep learning
- Data scientists who already know about backpropagation and gradient descent and want to improve their training with stochastic batch training, momentum, and adaptive learning rate procedures like RMSprop
- Those who do not yet know about backpropagation or softmax should take my earlier course, Deep Learning in Python, first