CS230 Deep Learning
Course 1: Neural Networks and Deep Learning
Week 1: Introduction to Deep Learning
- Class introduction
- Examples of deep learning projects
- Course details
Week 2: Neural Networks Basics
Logistic Regression as a Neural Network
- C1W2L01: Binary Classification
- C1W2L02: Logistic Regression
- C1W2L03: Logistic Regression Cost Function
- C1W2L04: Gradient Descent
- C1W2L05: Derivatives
- C1W2L06: More Derivative Examples
- C1W2L07: Computation Graph
- C1W2L08: Derivatives with a Computation Graph
- C1W2L09: Logistic Regression Gradient Descent
- C1W2L10: Gradient Descent on m Examples
Python and Vectorization
- C1W2L11: Vectorization
- C1W2L12: More Vectorization Examples
- C1W2L13: Vectorizing Logistic Regression
- C1W2L14: Vectorizing Logistic Regression’s Gradient Output
- C1W2L15: Broadcasting in Python
- C1W2L16: A Note on Python/Numpy Vectors
- C1W2L17: Quick Tour of Jupyter/IPython Notebooks
- C1W2L18: Explanation of Logistic Regression Cost Function (Optional)
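The Week 2 lectures build from logistic regression and gradient descent up to vectorizing both across all m examples. A minimal NumPy sketch of that end point (the function name, hyperparameters, and toy data are illustrative, not from the course materials):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, Y, lr=0.1, iters=1000):
    """Vectorized batch gradient descent: no explicit loop over examples.
    X: (n_x, m) feature matrix, Y: (1, m) labels in {0, 1}."""
    n_x, m = X.shape
    w = np.zeros((n_x, 1))
    b = 0.0
    for _ in range(iters):
        A = sigmoid(w.T @ X + b)   # forward pass for all m examples, (1, m)
        dZ = A - Y                 # gradient of the cost w.r.t. Z, (1, m)
        dw = (X @ dZ.T) / m        # (n_x, 1), averaged over the m examples
        db = dZ.sum() / m
        w -= lr * dw               # gradient descent step
        b -= lr * db
    return w, b

# Toy data: the label is 1 exactly when the single feature is positive.
X = np.array([[-2.0, -1.0, 1.0, 2.0]])
Y = np.array([[0.0, 0.0, 1.0, 1.0]])
w, b = train_logistic_regression(X, Y)
predictions = (sigmoid(w.T @ X + b) > 0.5).astype(int)
```

Broadcasting does quiet work here: the scalar `b` in `w.T @ X + b` is added to every entry of the `(1, m)` row of scores, which is the same mechanism the broadcasting lecture covers.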
Week 3: Shallow Neural Networks
- C1W3L01: Neural Networks Overview
- C1W3L02: Neural Network Representation
- C1W3L03: Computing a Neural Network’s Output
- C1W3L04: Vectorizing Across Multiple Examples
- C1W3L05: Explanation for Vectorized Implementation
- C1W3L06: Activation Functions
- C1W3L07: Why do you need Non-Linear Activation Functions?
- C1W3L08: Derivatives of Activation Functions
- C1W3L09: Gradient Descent for Neural Networks
- C1W3L10: Backpropagation Intuition (Optional)
- C1W3L11: Random Initialization
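As a companion to the Week 3 lectures on network representation, vectorizing across examples, and random initialization, here is a minimal sketch of a one-hidden-layer forward pass (the names `init_params` and `forward` and the layer sizes are illustrative, not from the course notebooks):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(n_x, n_h, n_y, seed=0):
    """Random initialization: small random weights break symmetry between
    hidden units; biases can safely start at zero."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.standard_normal((n_h, n_x)) * 0.01,
        "b1": np.zeros((n_h, 1)),
        "W2": rng.standard_normal((n_y, n_h)) * 0.01,
        "b2": np.zeros((n_y, 1)),
    }

def forward(X, p):
    """Vectorized forward pass over all m examples at once; X is (n_x, m)."""
    Z1 = p["W1"] @ X + p["b1"]    # (n_h, m)
    A1 = np.tanh(Z1)              # non-linear hidden activation
    Z2 = p["W2"] @ A1 + p["b2"]   # (n_y, m)
    A2 = sigmoid(Z2)              # outputs in (0, 1)
    return A2

p = init_params(n_x=3, n_h=4, n_y=1)
X = np.random.randn(3, 5)         # 5 examples with 3 features each
A2 = forward(X, p)                # shape (1, 5)
```

Replacing `np.tanh` with the identity would collapse the network into plain logistic regression, which is the point of the "Why do you need Non-Linear Activation Functions?" lecture.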
Week 4: Deep Neural Networks
- C1W4L01: Deep L-layer Neural Network
- C1W4L02: Forward Propagation in a Deep Network
- C1W4L03: Getting your Matrix Dimensions Right
- C1W4L04: Why Deep Representations?
- C1W4L05: Building Blocks of Deep Neural Networks
- C1W4L06: Forward and Backward Propagation
- C1W4L07: Parameters vs Hyperparameters
- C1W4L08: What does this have to do with the brain?
- C1W4L09: Clarification For: What does this have to do with the brain?
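Tying together the Week 4 lectures on forward propagation and matrix dimensions, a minimal sketch of an L-layer forward pass (the function name `deep_forward`, the layer sizes, and the random weights are illustrative assumptions, used only to make the sketch runnable):

```python
import numpy as np

def deep_forward(X, layer_dims, seed=0):
    """Forward propagation through an L-layer network: ReLU in the hidden
    layers, sigmoid at the output. The shape check mirrors the rules from
    "Getting your Matrix Dimensions Right":
        W[l] : (n[l], n[l-1])   b[l] : (n[l], 1)   A[l] : (n[l], m)
    """
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    A = X
    L = len(layer_dims) - 1                  # number of layers
    for l in range(1, L + 1):
        W = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
        b = np.zeros((layer_dims[l], 1))
        Z = W @ A + b                        # broadcasting adds b to each column
        A = np.maximum(0, Z) if l < L else 1.0 / (1.0 + np.exp(-Z))
        assert A.shape == (layer_dims[l], m)  # (n[l], m) at every layer
    return A

X = np.random.randn(4, 7)               # 7 examples with 4 features each
Y_hat = deep_forward(X, [4, 5, 3, 1])   # two hidden layers, one output unit
```

The same `(n[l], n[l-1])` bookkeeping is what a backward pass would check in reverse, which is the subject of "Building Blocks of Deep Neural Networks" and "Forward and Backward Propagation".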