Coursera Deep Learning Specialization Notes

Table of contents

  1. Intro to Neural Networks and Deep Learning
    1. Logistic regression (forward & backward)
    2. Neural network representation & activation functions
    3. Gradient descent, Backprop, & DNN
  2. Improving DNN: Hyperparameter tuning, Regularization, & Optimization
    1. General logic for improving DNN
    2. L2 regularization
    3. Dropout and Early stopping
    4. Normalize inputs
    5. Vanishing gradient & Initialization
    6. Gradient checking
    7. Batch & mini-batch gradient descent
    8. Optimizers
    9. Learning rate decay
    10. Hyperparameter tuning process
    11. Batch Normalization
    12. Softmax activation
  3. Structuring machine learning projects
    1. Speeding up the cycle: problems & solutions
    2. Satisficing & Optimizing metrics
    3. Train, Dev, & Test split
    4. Error analysis
    5. Training & Testing data from different distribution
    6. Transfer learning, Multitask learning, & End-to-end learning
  4. Convolutional Neural Networks
    1. Convolution as feature detector, padding, & stride
    2. Convolution over volume
    3. Convolution layer
    4. Pooling layer
    5. Classic CNN architecture
    6. Inception CNN and 1x1 convolution
    7. MobileNet
    8. Sliding window for object detection
    9. YOLO algorithm
    10. Semantic segmentation
    11. Siamese network & Triplet loss
    12. Neural style transfer
  5. Recurrent Neural Networks
    1. Use cases for RNN
    2. RNN forward & backward
    3. Different types of RNN
    4. Language models & Sampling new text
    5. GRU & LSTM
    6. Deep RNN
    7. Word embeddings
    8. Learning embeddings: word2vec & GloVe
    9. Sentiment analysis
    10. Sequence to sequence models
    11. Beam search
    12. Attention mechanism
    13. Transformer architecture

Intro to Neural Networks and Deep Learning

Logistic regression (forward & backward)

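The forward/backward pass for logistic regression can be sketched in a few lines of NumPy. This is an illustrative sketch, not the course's notebook code; the function and variable names are my own:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    """One forward/backward pass of logistic regression.
    X: (n_x, m) inputs, Y: (1, m) labels, w: (n_x, 1), b: scalar."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                                    # forward: predictions
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))    # cross-entropy cost
    dw = (X @ (A - Y).T) / m                                    # backward: dJ/dw
    db = np.mean(A - Y)                                         # backward: dJ/db
    return {"dw": dw, "db": db}, cost
```

With w = 0 and b = 0 the model predicts 0.5 for every example, so the cost starts at ln 2 ≈ 0.693.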

Neural network representation & activation functions


Gradient descent, Backprop, & DNN


Improving DNN: Hyperparameter tuning, Regularization, & Optimization

General logic for improving DNN


L2 regularization

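L2 regularization adds (λ / 2m) Σ‖W‖² to the cost and, consequently, (λ/m) W to each weight gradient (biases are conventionally not regularized). A minimal sketch, assuming parameters and gradients are stored in dicts keyed like "W1", "b1":

```python
import numpy as np

def l2_cost_and_grad(cost, grads, params, lambd, m):
    """Add the L2 penalty to the cost and lambd/m * W to each weight
    gradient; only keys starting with 'W' are regularized."""
    reg = sum(np.sum(W ** 2) for k, W in params.items() if k.startswith("W"))
    cost = cost + lambd / (2 * m) * reg
    new_grads = {k: (g + lambd / m * params[k] if k.startswith("W") else g)
                 for k, g in grads.items()}
    return cost, new_grads
```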

Dropout and Early stopping

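The course's inverted-dropout trick can be sketched as follows: zero each unit with probability 1 − keep_prob, then divide by keep_prob so the expected activation is unchanged and no rescaling is needed at test time (function name is illustrative):

```python
import numpy as np

def inverted_dropout(A, keep_prob, rng):
    """Inverted dropout on activations A: drop units at random, then
    scale the survivors by 1/keep_prob to preserve the expected value."""
    mask = rng.random(A.shape) < keep_prob   # True with prob keep_prob
    return A * mask / keep_prob, mask
```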

Normalize inputs


Vanishing gradient & Initialization

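He initialization draws W[l] from N(0, 2 / n[l−1]) for ReLU networks, which keeps activation variance roughly constant across layers and mitigates vanishing/exploding gradients. A minimal sketch, assuming layer sizes are given as a list:

```python
import numpy as np

def he_init(layer_dims, rng):
    """He initialization: W[l] ~ N(0, 2 / n^{[l-1]}), biases zero.
    layer_dims: [n_x, n_h1, ..., n_y]."""
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = (rng.standard_normal((layer_dims[l], layer_dims[l - 1]))
                           * np.sqrt(2.0 / layer_dims[l - 1]))
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params
```

For tanh activations the course recommends Xavier initialization instead, with variance 1 / n[l−1].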

Gradient checking

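Gradient checking compares the analytic gradient against a centered finite difference, (f(θ+ε) − f(θ−ε)) / 2ε; a relative difference below about 1e-7 suggests the backprop code is correct. A sketch over a flat parameter vector:

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Relative difference between grad_f(theta) and a centered
    finite-difference estimate of the gradient of f at theta."""
    num = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus.flat[i] += eps
        minus.flat[i] -= eps
        num.flat[i] = (f(plus) - f(minus)) / (2 * eps)
    ana = grad_f(theta)
    return np.linalg.norm(ana - num) / (np.linalg.norm(ana) + np.linalg.norm(num))
```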

Batch & mini-batch gradient descent

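Mini-batch gradient descent shuffles the training set once per epoch and then steps through fixed-size slices; the last batch may be smaller. A sketch with examples stored as columns:

```python
import numpy as np

def random_mini_batches(X, Y, batch_size, rng):
    """Shuffle the m columns of X and Y together, then slice
    into consecutive mini-batches of at most batch_size columns."""
    m = X.shape[1]
    perm = rng.permutation(m)
    X, Y = X[:, perm], Y[:, perm]
    return [(X[:, k:k + batch_size], Y[:, k:k + batch_size])
            for k in range(0, m, batch_size)]
```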

Optimizers

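Adam combines momentum (an exponentially weighted average of gradients) with RMSprop (an average of squared gradients), plus bias correction for the early steps. One update for a single parameter array can be sketched as:

```python
import numpy as np

def adam_step(w, dw, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. v: momentum average, s: squared-gradient
    average, t: 1-based step count used for bias correction."""
    v = beta1 * v + (1 - beta1) * dw
    s = beta2 * s + (1 - beta2) * dw ** 2
    v_hat = v / (1 - beta1 ** t)          # bias-corrected first moment
    s_hat = s / (1 - beta2 ** t)          # bias-corrected second moment
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s
```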

Learning rate decay

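The standard decay schedule from the course shrinks the learning rate as epochs accumulate:

```python
def decayed_lr(alpha0, decay_rate, epoch):
    """1/t learning-rate decay: alpha = alpha0 / (1 + decay_rate * epoch)."""
    return alpha0 / (1 + decay_rate * epoch)
```

Other schedules mentioned in the lectures include exponential decay (alpha0 * k^epoch) and staircase decay.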

Hyperparameter tuning process


Batch Normalization

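Batch normalization standardizes a layer's pre-activations over the mini-batch, then applies a learnable scale γ and shift β so the layer can still represent any mean and variance. The forward pass can be sketched as:

```python
import numpy as np

def batch_norm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize Z over the batch dimension (columns), then scale
    and shift: out = gamma * Z_norm + beta."""
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta
```

At test time the batch statistics are replaced by exponentially weighted averages accumulated during training.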

Softmax activation

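Softmax turns a vector of logits into a probability distribution. Subtracting the maximum before exponentiating is the standard numerical-stability trick, since it leaves the result unchanged but prevents overflow on large logits:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over axis 0 (one column per example);
    each column of the output sums to 1."""
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)
```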

Structuring machine learning projects

Speeding up the cycle: problems & solutions


Satisficing & Optimizing metrics


Train, Dev, & Test split


Error analysis


Training & Testing data from different distribution


Transfer learning, Multitask learning, & End-to-end learning


Convolutional Neural Networks

Convolution as feature detector, padding, & stride

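The key bookkeeping fact here is the output-size formula for a convolution with input size n, filter size f, padding p, and stride s:

```python
def conv_output_size(n, f, p, s):
    """Output spatial size of a convolution: floor((n + 2p - f) / s) + 1.
    'Same' padding picks p = (f - 1) / 2 (odd f, stride 1) so the
    output size equals the input size; 'valid' means p = 0."""
    return (n + 2 * p - f) // s + 1
```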

Convolution over volume


Convolution layer


Pooling layer


Classic CNN architecture


Inception CNN and 1x1 convolution


MobileNet


Sliding window for object detection


YOLO algorithm

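A core primitive in YOLO is intersection over union (IoU), used both to score predicted boxes against ground truth and inside non-max suppression. With boxes given as corner coordinates it can be sketched as:

```python
def iou(box1, box2):
    """Intersection over union of two boxes (x1, y1, x2, y2).
    Returns 0 when the boxes do not overlap."""
    xi1, yi1 = max(box1[0], box2[0]), max(box1[1], box2[1])
    xi2, yi2 = min(box1[2], box2[2]), min(box1[3], box2[3])
    inter = max(0, xi2 - xi1) * max(0, yi2 - yi1)
    area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
    area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
    return inter / (area1 + area2 - inter)
```

Non-max suppression then keeps the highest-confidence box and discards any remaining box whose IoU with it exceeds a threshold (0.5 is a common choice in the lectures).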

Semantic segmentation


Siamese network & Triplet loss

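The triplet loss trains the Siamese network's embedding so that an anchor is closer to a positive (same identity) than to a negative (different identity) by at least a margin α. A sketch over batches of embeddings:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """max(||a - p||^2 - ||a - n||^2 + alpha, 0), averaged over the
    batch; embeddings are (m, d) arrays."""
    pos = np.sum((anchor - positive) ** 2, axis=1)
    neg = np.sum((anchor - negative) ** 2, axis=1)
    return np.maximum(pos - neg + alpha, 0.0).mean()
```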

Neural style transfer


Recurrent Neural Networks

Use cases for RNN


RNN forward & backward

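One forward time step of a basic RNN cell, in the course's notation (a for hidden state, x for input, y for output), can be sketched as:

```python
import numpy as np

def rnn_cell_forward(x_t, a_prev, Wax, Waa, Wya, ba, by):
    """One RNN step: a_t = tanh(Waa @ a_prev + Wax @ x_t + ba),
    y_t = softmax(Wya @ a_t + by). One example per column."""
    a_t = np.tanh(Waa @ a_prev + Wax @ x_t + ba)
    z = Wya @ a_t + by
    e = np.exp(z - z.max(axis=0, keepdims=True))   # stable softmax
    return a_t, e / e.sum(axis=0, keepdims=True)
```

The full forward pass loops this cell over time, feeding each a_t back in as a_prev; backprop through time runs the chain rule backwards over the same loop.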

Different types of RNN


Language models & Sampling new text


GRU & LSTM


Deep RNN


Word embeddings


Learning embeddings: word2vec & GloVe


Sentiment analysis


Sequence to sequence models


Beam search


Attention mechanism

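The scaled dot-product attention at the heart of this mechanism computes softmax(QKᵀ/√d_k)V: each query attends to all keys, and the resulting weights mix the values. A single-head NumPy sketch:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention. Q: (n_q, d_k), K: (n_k, d_k),
    V: (n_k, d_v); returns the attended values and the weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V, weights
```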

Transformer architecture

Peeta Li
PhD Student