What Will You Learn?
You will learn the foundational concepts of neural networks and deep learning: the key technological trends driving the field, how to build and train fully connected deep neural networks, how to implement efficient (vectorized) implementations, and how to identify the crucial parameters in a network's architecture, so you can apply deep learning to your own applications. This course serves as a gateway to understanding the capabilities and challenges of deep learning, giving you the knowledge and skills to contribute to cutting-edge AI technology, advance your technical career, and navigate the world of machine learning with confidence.
About This Course
Provider: Coursera
Format: Online
Duration: Approx. 24 hours to complete
Target Audience: Intermediate
Learning Objectives: By the end, you will be familiar with the significant technological trends driving the rise of deep learning, and will be able to build, train, and apply fully connected deep neural networks
Course Prerequisites: Intermediate Python skills (basic programming, for loops, if/else statements, data structures) and a basic grasp of linear algebra and machine learning
Assessment and Certification: Earn a certificate from the provider upon completion
Instructor: DeepLearning.AI
Key Topics: Artificial Neural Network, Backpropagation, Python Programming, Deep Learning, Neural Network Architecture
Topics Covered:
- Welcome
- What is a Neural Network?
- Supervised Learning with Neural Networks
- Why is Deep Learning taking off?
- Binary Classification
- Logistic Regression
- Logistic Regression Cost Function
- Gradient Descent
- Derivatives
- More Derivative Examples
- Computation Graph
- Derivatives with a Computation Graph
- Logistic Regression Gradient Descent
- Gradient Descent on m Examples
- Vectorization
- More Vectorization Examples
- Vectorizing Logistic Regression
- Vectorizing Logistic Regression's Gradient Output
- Broadcasting in Python
- A Note on Python/Numpy Vectors
- Quick tour of Jupyter/iPython Notebooks
- Neural Networks Overview
- Neural Network Representation
- Computing a Neural Network's Output
- Vectorizing Across Multiple Examples
- Explanation for Vectorized Implementation
- Activation Functions
- Why do you need Non-Linear Activation Functions?
- Derivatives of Activation Functions
- Gradient Descent for Neural Networks
- Random Initialization
- Deep L-layer Neural Network
- Forward Propagation in a Deep Network
- Getting your Matrix Dimensions Right
- Why Deep Representations?
- Building Blocks of Deep Neural Networks
- Forward and Backward Propagation
- Parameters vs Hyperparameters
- What does this have to do with the brain?
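To give a flavor of what topics like "Vectorizing Logistic Regression" and "Gradient Descent on m Examples" cover, here is a minimal NumPy sketch of one vectorized gradient-descent step for logistic regression. This is an illustrative example assuming the standard cross-entropy loss and the course's convention of stacking the m training examples as columns of X; it is not the course's actual assignment code.

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) activation."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression_step(w, b, X, Y, learning_rate=0.01):
    """One vectorized gradient-descent step for logistic regression.

    X: inputs of shape (n_features, m_examples)
    Y: labels of shape (1, m_examples)
    w: weights of shape (n_features, 1); b: scalar bias
    """
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)   # forward pass over all m examples at once (b broadcasts)
    dZ = A - Y                 # gradient of the cross-entropy loss w.r.t. z
    dw = (X @ dZ.T) / m        # weight gradient, averaged over examples
    db = np.sum(dZ) / m        # bias gradient, averaged over examples
    return w - learning_rate * dw, b - learning_rate * db

# Tiny linearly separable example: 2 features, 4 examples
X = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.5, 0.0, -0.5]])
Y = np.array([[0, 0, 1, 1]])
w, b = np.zeros((2, 1)), 0.0
for _ in range(1000):
    w, b = logistic_regression_step(w, b, X, Y, learning_rate=0.5)
preds = (sigmoid(w.T @ X + b) > 0.5).astype(int)
```

Note how the matrix product `w.T @ X` and NumPy's broadcasting of the scalar `b` replace an explicit for loop over the m examples, which is exactly the payoff the "Vectorization" and "Broadcasting in Python" lectures build toward.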