Coursera Deep Learning Specialization Notes
Jan 5, 2022
Table of contents
- Intro to Neural Networks and Deep Learning
- Logistic regression (forward & backward)
- Neural network representation & activation functions
- Gradient descent, Backprop, & DNNs
- Improving DNNs: Hyperparameter tuning, Regularization, & Optimization
- General logic for improving DNN
- L2 regularization
- Dropout and Early stopping
- Normalize inputs
- Vanishing gradient & Initialization
- Gradient checking
- Batch & mini-batch gradient descent
- Optimizers
- Learning rate decay
- Hyperparameter tuning process
- Batch Normalization
- Softmax activation
- Structuring machine learning projects
- Speeding up the cycle: problem & solutions
- Satisficing & Optimizing metrics
- Train, Dev, & Test split
- Error analysis
- Training & Testing data from different distributions
- Transfer learning, Multitask learning, & End-to-end learning
- Convolutional Neural Networks
- Convolution as feature detector, padding, & stride
- Convolution over volume
- Convolution layer
- Pooling layer
- Classic CNN architecture
- Inception CNN and 1x1 convolution
- MobileNet
- Sliding window for object detection
- YOLO algorithm
- Semantic segmentation
- Siamese network & Triplet loss
- Neural style transfer
- Recurrent Neural Networks
- Use cases for RNNs
- RNN forward & backward
- Different types of RNNs
- Language models & Sampling new text
- GRU & LSTM
- Deep RNN
- Word embeddings
- Learning embeddings: word2vec & GloVe
- Sentiment analysis
- Sequence to sequence models
- Beam search
- Attention mechanism
- Transformer architecture
Intro to Neural Networks and Deep Learning
Logistic regression (forward & backward)
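One pass of forward and backward propagation in NumPy. A minimal sketch of my own (not the assignment code), assuming the course's column-wise layout X: (n_x, m), Y: (1, m), w: (n_x, 1):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def propagate(w, b, X, Y):
    """One forward/backward pass of logistic regression."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                    # forward: predictions, shape (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dw = (X @ (A - Y).T) / m                    # backward: dJ/dw, shape (n_x, 1)
    db = np.mean(A - Y)                         # dJ/db, scalar
    return cost, dw, db
```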
Neural network representation & activation functions
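A quick NumPy reference for the four activations covered in the course; a sketch of my own, not course code:

```python
import numpy as np

def sigmoid(z):                 # squashes to (0, 1); used for binary output layers
    return 1 / (1 + np.exp(-z))

def tanh(z):                    # zero-centred; usually better than sigmoid for hidden layers
    return np.tanh(z)

def relu(z):                    # default for hidden layers; no saturation for z > 0
    return np.maximum(0, z)

def leaky_relu(z, alpha=0.01):  # small negative slope keeps gradients alive for z < 0
    return np.where(z > 0, z, alpha * z)
```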
Gradient descent, Backprop, & DNNs
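One full gradient-descent step for a one-hidden-layer network (tanh hidden layer, sigmoid output), sketched in NumPy. Parameter names follow the course's W1/b1/W2/b2 convention; the dict-based `step` helper is my own:

```python
import numpy as np

def step(params, X, Y, lr=0.01):
    """One gradient-descent step for a 1-hidden-layer net (tanh -> sigmoid)."""
    W1, b1, W2, b2 = params["W1"], params["b1"], params["W2"], params["b2"]
    m = X.shape[1]
    # forward propagation
    Z1 = W1 @ X + b1;   A1 = np.tanh(Z1)
    Z2 = W2 @ A1 + b2;  A2 = 1 / (1 + np.exp(-Z2))
    # backward propagation (binary cross-entropy loss)
    dZ2 = A2 - Y
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)          # tanh'(Z1) = 1 - A1^2
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    # parameter update
    params["W1"] -= lr * dW1;  params["b1"] -= lr * db1
    params["W2"] -= lr * dW2;  params["b2"] -= lr * db2
    return params
```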
Improving DNNs: Hyperparameter tuning, Regularization, & Optimization
General logic for improving DNN
L2 regularization
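L2 regularization adds (lambda/2m) * sum of squared weights to the cost, which becomes an extra (lambda/m) * W term in each weight gradient ("weight decay"). A small sketch; the helper name and the list-of-matrices interface are my own:

```python
import numpy as np

def l2_cost_and_grads(cross_entropy_cost, weights, grads, lambd, m):
    """Add the L2 penalty (lambda/2m) * sum ||W||^2 to the cost, and
    the matching (lambda/m) * W term to each weight gradient."""
    penalty = (lambd / (2 * m)) * sum(np.sum(W ** 2) for W in weights)
    reg_grads = [dW + (lambd / m) * W for W, dW in zip(weights, grads)]
    return cross_entropy_cost + penalty, reg_grads
```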
Dropout and Early stopping
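Inverted dropout, the version taught in the course: zero out units at random during training, then scale up by 1/keep_prob so the expected activation is unchanged and no rescaling is needed at test time. A sketch:

```python
import numpy as np

def inverted_dropout(A, keep_prob):
    """Apply inverted dropout to an activation matrix A during training."""
    mask = np.random.rand(*A.shape) < keep_prob  # keep each unit with prob keep_prob
    return (A * mask) / keep_prob                # scale up so E[A] is unchanged
```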
Normalize inputs
Vanishing gradient & Initialization
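Variance-scaled initialization keeps activations from shrinking or blowing up layer by layer: He initialization (variance 2/n_prev) for ReLU layers, Xavier (1/n_prev) for tanh layers. A sketch; the helper is my own:

```python
import numpy as np

def init_layer(n_prev, n_curr, activation="relu"):
    """He (2/n_prev) initialization for ReLU, Xavier (1/n_prev) for tanh.
    Keeps activation variance roughly constant across layers, which
    mitigates vanishing/exploding gradients."""
    scale = 2.0 if activation == "relu" else 1.0
    W = np.random.randn(n_curr, n_prev) * np.sqrt(scale / n_prev)
    b = np.zeros((n_curr, 1))
    return W, b
```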
Gradient checking
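Gradient checking compares backprop's analytic gradient against a two-sided numerical estimate, component by component. A sketch assuming the parameters are flattened into one vector:

```python
import numpy as np

def grad_check(f, theta, grad, eps=1e-7):
    """Compare an analytic gradient `grad` against a two-sided numerical
    estimate. f: scalar cost as a function of the flat parameter vector theta."""
    num_grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        num_grad[i] = (f(plus) - f(minus)) / (2 * eps)
    diff = (np.linalg.norm(num_grad - grad)
            / (np.linalg.norm(num_grad) + np.linalg.norm(grad)))
    return diff   # ~1e-7 is great; > 1e-3 almost certainly signals a bug
```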
Batch & mini-batch gradient descent
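Typical mini-batch sizes are powers of two (64, 128, 256, ...). A shuffle-then-partition sketch (the helper name is mine):

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the m examples, then slice into batches of batch_size
    (the last batch may be smaller). X: (n_x, m), Y: (1, m)."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)
    X, Y = X[:, perm], Y[:, perm]
    return [(X[:, k:k + batch_size], Y[:, k:k + batch_size])
            for k in range(0, m, batch_size)]
```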
Optimizers
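Adam combines momentum and RMSprop with bias correction. A single-parameter sketch (the function name is mine; the defaults are those suggested in the course):

```python
import numpy as np

def adam_update(w, dw, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for parameter w with gradient dw at iteration t >= 1."""
    m = beta1 * m + (1 - beta1) * dw         # 1st moment (momentum)
    v = beta2 * v + (1 - beta2) * dw ** 2    # 2nd moment (RMSprop)
    m_hat = m / (1 - beta1 ** t)             # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```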
Learning rate decay
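One common schedule from the course is 1/t decay, alpha = alpha0 / (1 + decay_rate * epoch_num); exponential decay (alpha0 * 0.95^epoch) is another option:

```python
def decayed_lr(lr0, decay_rate, epoch):
    """1/t decay: alpha = alpha0 / (1 + decay_rate * epoch_num).
    Alternative: exponential decay, lr0 * 0.95 ** epoch."""
    return lr0 / (1 + decay_rate * epoch)
```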
Hyperparameter tuning process
Batch Normalization
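Batch norm normalizes each unit's pre-activations over the mini-batch, then lets learnable gamma/beta set the new scale and mean; at test time the course uses exponentially weighted running averages of mu and sigma^2 instead of batch statistics. A forward-pass sketch:

```python
import numpy as np

def batch_norm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize Z (shape (n, m)) per unit over the mini-batch,
    then rescale with learnable gamma and shift with beta."""
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta
```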
Softmax activation
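Softmax generalizes the sigmoid to C classes; with cross-entropy loss the output-layer gradient stays the simple dZ = A - Y. A numerically stable sketch:

```python
import numpy as np

def softmax(Z):
    """Column-wise softmax over classes. Z: (n_classes, m)."""
    Z = Z - Z.max(axis=0, keepdims=True)   # subtract max for numerical stability
    e = np.exp(Z)
    return e / e.sum(axis=0, keepdims=True)
```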
Structuring machine learning projects
Speeding up the cycle: problem & solutions
Satisficing & Optimizing metrics
Train, Dev, & Test split
Error analysis
Training & Testing data from different distributions
Transfer learning, Multitask learning, & End-to-end learning
Convolutional Neural Networks
Convolution as feature detector, padding, & stride
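A straightforward NumPy sketch of a single-channel convolution (really cross-correlation, as is standard in deep learning) with zero padding and stride; loop-based for clarity, not speed:

```python
import numpy as np

def conv2d(image, kernel, pad=0, stride=1):
    """2-D cross-correlation of one channel with one f x f kernel after
    zero-padding. Output size: floor((n + 2p - f) / stride) + 1 per dim."""
    image = np.pad(image, pad)
    f = kernel.shape[0]
    n_h = (image.shape[0] - f) // stride + 1
    n_w = (image.shape[1] - f) // stride + 1
    out = np.zeros((n_h, n_w))
    for i in range(n_h):
        for j in range(n_w):
            patch = image[i*stride:i*stride+f, j*stride:j*stride+f]
            out[i, j] = np.sum(patch * kernel)
    return out
```

For example, passing np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]]) as the kernel applies the vertical-edge detector from the lectures.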
Convolution over volume
Convolution layer
Pooling layer
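Max pooling keeps the largest value in each f x f window and has no parameters to learn. A single-channel sketch:

```python
import numpy as np

def max_pool(A, f=2, stride=2):
    """Max pooling on a single channel with window size f and given stride."""
    n_h = (A.shape[0] - f) // stride + 1
    n_w = (A.shape[1] - f) // stride + 1
    out = np.zeros((n_h, n_w))
    for i in range(n_h):
        for j in range(n_w):
            out[i, j] = A[i*stride:i*stride+f, j*stride:j*stride+f].max()
    return out
```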
Classic CNN architecture
Inception CNN and 1x1 convolution
MobileNet
Sliding window for object detection
YOLO algorithm
Semantic segmentation
Siamese network & Triplet loss
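The triplet loss (from FaceNet, as presented in the course) pushes an anchor embedding at least a margin alpha closer to a positive than to a negative. A per-triplet sketch with embeddings as 1-D vectors:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """L = max(||f(A)-f(P)||^2 - ||f(A)-f(N)||^2 + alpha, 0)."""
    pos_dist = np.sum((anchor - positive) ** 2)
    neg_dist = np.sum((anchor - negative) ** 2)
    return max(pos_dist - neg_dist + alpha, 0.0)
```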
Neural style transfer
Recurrent Neural Networks
Use cases for RNNs
RNN forward & backward
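One forward time step of a basic RNN cell, sketched in NumPy with the course's notation (a for the hidden state, weights Wax/Waa/Wya); the same weights are reused at every step:

```python
import numpy as np

def rnn_cell_forward(x_t, a_prev, Wax, Waa, Wya, ba, by):
    """a<t> = tanh(Waa a<t-1> + Wax x<t> + ba); y<t> = softmax(Wya a<t> + by)."""
    a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)
    z = Wya @ a_t + by
    e = np.exp(z - z.max(axis=0, keepdims=True))   # stable softmax
    y_t = e / e.sum(axis=0, keepdims=True)
    return a_t, y_t
```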
Different types of RNNs
Language models & Sampling new text
GRU & LSTM
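One step of the full GRU as presented in the course (where the hidden state a<t> is simply the memory cell c<t>); a vector-shaped sketch:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def gru_cell(x_t, c_prev, Wu, Wr, Wc, bu, br, bc):
    """One GRU step. The update gate u decides how much of the old memory
    c<t-1> to overwrite; the relevance gate r gates the old memory when
    forming the candidate c~<t>."""
    concat = np.concatenate([c_prev, x_t])
    u = sigmoid(Wu @ concat + bu)                                    # update gate
    r = sigmoid(Wr @ concat + br)                                    # relevance gate
    c_tilde = np.tanh(Wc @ np.concatenate([r * c_prev, x_t]) + bc)   # candidate
    return u * c_tilde + (1 - u) * c_prev                            # new memory cell
```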
Deep RNN
Word embeddings
Learning embeddings: word2vec & GloVe
Sentiment analysis
Sequence to sequence models
Beam search
Attention mechanism
Transformer architecture
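The scaled dot-product attention at the heart of the Transformer, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, sketched in NumPy for a single head with no masking:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v). Returns (n_q, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of queries to keys
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted sum of values
```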