The Complete Neural Networks Bootcamp: Theory, Applications
The Complete Neural Networks Bootcamp: Theory, Applications is priced at $84.99 and has an average rating of 4.47 based on 2389 reviews. The course comprises 306 lectures and 2 quizzes, and has 21102 subscribers.
Enroll now: The Complete Neural Networks Bootcamp: Theory, Applications
Summary
Title: The Complete Neural Networks Bootcamp: Theory, Applications
Price: $84.99
Average Rating: 4.47
Number of Lectures: 306
Number of Quizzes: 2
Number of Published Lectures: 306
Number of Published Quizzes: 2
Number of Curriculum Items: 308
Number of Published Curriculum Objects: 308
Original Price: $89.99
Quality Status: approved
Status: Live
What You Will Learn
- Understand How Neural Networks Work (Theory and Applications)
- Understand How Convolutional Networks Work (Theory and Applications)
- Understand How Recurrent Networks and LSTMs work (Theory and Applications)
- Learn how to use PyTorch in depth
- Understand how the Backpropagation algorithm works
- Understand Loss Functions in Neural Networks
- Understand Weight Initialization and Regularization Techniques
- Code-up a Neural Network from Scratch using Numpy
- Apply Transfer Learning to CNNs
- CNN Visualization
- Learn the CNN Architectures that are widely used nowadays
- Understand Residual Networks in Depth
- Understand YOLO Object Detection in Depth
- Visualize the Learning Process of Neural Networks
- Learn how to Save and Load trained models
- Learn Sequence Modeling with Attention Mechanisms
- Build a Chatbot with Attention
- Transformers
- Build a Chatbot with Transformers
- BERT
- Build an Image Captioning Model
Who Should Attend
- Anyone who is interested in learning about Neural Networks and Deep Learning
Target Audiences
- Anyone who is interested in learning about Neural Networks and Deep Learning
This course is a comprehensive guide to Deep Learning and Neural Networks. The theories are explained in depth and in a friendly manner. After that, we’ll move on to the hands-on sessions, where we will learn how to code Neural Networks in PyTorch, a very advanced and powerful deep learning framework!
The course includes the following Sections:
Section 1 – How Neural Networks and Backpropagation Works
In this section, you will gain a deep understanding of how neural networks and the backpropagation algorithm work, presented in a friendly manner. We will walk through an example and do the calculations step-by-step. We will also discuss the activation functions used in Neural Networks, with their advantages and disadvantages!
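As a taste of the step-by-step calculations, here is a minimal Numpy sketch of one backpropagation step for a single sigmoid neuron (the numbers are illustrative, not taken from the lectures):

```python
import numpy as np

# One sigmoid neuron with a squared-error loss: a single worked backprop step.
x, y = 0.5, 1.0           # input and target (illustrative values)
w, b = 0.3, 0.1           # initial weight and bias
lr = 0.1                  # learning rate

# Forward pass
z = w * x + b             # pre-activation
a = 1 / (1 + np.exp(-z))  # sigmoid activation
loss = 0.5 * (a - y) ** 2

# Backward pass (chain rule)
dL_da = a - y
da_dz = a * (1 - a)       # derivative of the sigmoid
dL_dw = dL_da * da_dz * x
dL_db = dL_da * da_dz

# Gradient descent update
w -= lr * dL_dw
b -= lr * dL_db
print(f"loss={loss:.4f}, new w={w:.4f}, new b={b:.4f}")
```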
Section 2 – Loss Functions
In this section, we will introduce the famous loss functions that are used in Deep Learning and Neural Networks. We will walk through when to use them and how they work.
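For a flavor of how these losses are called in practice, here is a small PyTorch sketch (the tensors are toy values):

```python
import torch
import torch.nn as nn

# Regression: mean squared error between predictions and targets.
pred = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])
print(nn.MSELoss()(pred, target))

# Classification: CrossEntropyLoss expects raw logits, not probabilities;
# it applies log-softmax internally.
logits = torch.tensor([[1.2, 0.3, -0.8]])  # one sample, three classes
label = torch.tensor([0])                  # index of the correct class
print(nn.CrossEntropyLoss()(logits, label))
```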
Section 3 – Optimization
In this section, we will discuss the optimization techniques used in Neural Networks to reach the optimal point, including Gradient Descent, Stochastic Gradient Descent, Momentum, RMSProp, Adam, AMSGrad, Weight Decay and Decoupled Weight Decay, LR Schedulers, and others.
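A minimal sketch of wiring up an optimizer and a learning rate scheduler in PyTorch (the model and data here are placeholders):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)  # toy model, just to have parameters to optimize

# AdamW implements Adam with decoupled weight decay.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)

# Step LR scheduler: multiply the learning rate by 0.1 every 30 epochs.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(3):                            # dummy training loop
    loss = model(torch.randn(4, 10)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()                              # advance the LR schedule
```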
Section 4 – Weight Initialization
In this section, we will introduce you to the concept of weight initialization in neural networks, and we will discuss several weight initialization techniques, including Xavier initialization and He norm initialization.
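Both schemes are available as one-liners in PyTorch; a short sketch with illustrative layer sizes:

```python
import torch.nn as nn

tanh_layer = nn.Linear(256, 128)
relu_layer = nn.Linear(128, 64)

# Xavier (Glorot) initialization: keeps the activation variance stable,
# derived for symmetric activations such as tanh.
nn.init.xavier_uniform_(tanh_layer.weight)

# He (Kaiming) initialization: the analogous scheme derived for ReLU.
nn.init.kaiming_normal_(relu_layer.weight, nonlinearity='relu')
```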
Section 5 – Regularization Techniques
In this section, we will introduce you to the regularization techniques in neural networks. We will first introduce overfitting, and then show how to prevent it using regularization techniques, including L1, L2 and Dropout. We’ll also talk about normalization, including Batch Normalization and Layer Normalization.
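In PyTorch, Dropout and Batch Normalization are layers, while L2 regularization is usually applied through the optimizer; a minimal sketch:

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 128),
    nn.BatchNorm1d(128),  # normalize activations per mini-batch
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zero 50% of the units during training
    nn.Linear(128, 10),
)

# weight_decay adds an L2 penalty on the weights.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)

model.train()  # enables Dropout and batch statistics for BatchNorm
model.eval()   # disables Dropout, uses running statistics instead
```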
Section 6- Introduction to PyTorch
In this section, we will introduce the deep learning framework we’ll be using throughout this course, which is PyTorch. We will show you how to install it, how it works and why it’s special. We will then code some PyTorch tensors and show you some operations on tensors, as well as show you Autograd in code!
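A tiny preview of tensors and Autograd:

```python
import torch

# Basic tensor operations
a = torch.tensor([[1., 2.], [3., 4.]])
b = torch.ones(2, 2)
print(a + b)   # elementwise addition
print(a @ b)   # matrix multiplication

# Autograd: record operations on x, then differentiate automatically
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3 + 2 * x
y.backward()   # dy/dx = 3x^2 + 2
print(x.grad)  # tensor(14.)
```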
Section 7 – Practical Neural Networks in PyTorch – Application 1
In this section, you will apply what you’ve learned to build a Feed Forward Neural Network to classify handwritten digits. This is the first application of Feed Forward Networks we will be showing.
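A minimal feed-forward classifier of the kind built here might look as follows (the layer sizes are an assumption, not necessarily those used in the lectures):

```python
import torch.nn as nn

# Feed-forward network for 28x28 digit images, 10 classes.
class DigitClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),        # 28x28 image -> 784-dim vector
            nn.Linear(784, 128),
            nn.ReLU(),
            nn.Linear(128, 10),  # raw logits, one per digit class
        )

    def forward(self, x):
        return self.net(x)
```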
Section 8 – Practical Neural Networks in PyTorch – Application 2
In this section, we will build a feed forward Neural Network to classify whether a person has diabetes or not. We will train the network on a large diabetes dataset!
Section 9 – Visualize the Learning Process
In this section, we will visualize how neural networks are learning, and how good they are at separating non-linear data!
Section 10 – Implementing a Neural Network from Scratch with Python and Numpy
In this section, we will understand and code up a neural network without using any deep learning library (from scratch, using only Python and Numpy). This is necessary to understand how the underlying structure works.
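For instance, the forward pass of a tiny two-layer network in pure Numpy (shapes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))              # 4 samples, 3 features
W1, b1 = rng.standard_normal((3, 5)), np.zeros(5)
W2, b2 = rng.standard_normal((5, 2)), np.zeros(2)

h = np.maximum(0, X @ W1 + b1)               # hidden layer with ReLU
logits = h @ W2 + b2                         # output layer
logits -= logits.max(axis=1, keepdims=True)  # for numerical stability
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # softmax
```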
Section 11 – Convolutional Neural Networks
In this section, we will introduce you to Convolutional Networks, which are used for images. We will first show you their relationship to Feed Forward Networks, and then introduce you to the concepts of Convolutional Networks one by one!
Section 12 – Practical Convolutional Networks in PyTorch
In this section, we will apply Convolutional Networks to classify handwritten digits. This is the first application of CNNs we will do.
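A small CNN of the kind applied here could be sketched as follows (the channel counts are placeholders, not necessarily the course’s exact architecture):

```python
import torch.nn as nn

# Small CNN for 1-channel 28x28 digit images.
cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                              # -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                              # -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                    # logits for 10 digits
)
```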
Section 13- Deeper into CNN: Improving and Plotting
In this section, we will improve the CNN that we built in the previous section, as well as show you how to plot the results of training and testing! Moreover, we will show you how to classify your own handwritten images through the network!
Section 14 – CNN Architectures
In this section, we will introduce the CNN architectures that are widely used in all deep learning applications. These architectures are: AlexNet, VGG net, Inception Net, Residual Networks and Densely Connected Networks. We will also discuss some object detection architectures.
Section 15- Residual Networks
In this section, we will dive deep into the details and theory of Residual Networks, and then we’ll build a Residual Network in PyTorch from scratch!
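The core idea is the skip connection: the block’s input is added back to its output. A basic residual block in PyTorch (channel count illustrative):

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # skip connection: add the input back
```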
Section 16 – Transfer Learning in PyTorch – Image Classification
In this section, we will apply transfer learning on a Residual Network, to classify ants and bees. We will also show you how to use your own dataset and apply image augmentation. After completing this section, you will be able to classify any images you want!
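The typical transfer learning pattern, sketched with a pretrained ResNet-18 from torchvision (this uses the weights API of recent torchvision versions; freezing the backbone is one common choice, not the only one):

```python
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet-18.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone...
for param in model.parameters():
    param.requires_grad = False

# ...and replace the final layer with a new trainable head for 2 classes
# (ants vs. bees).
model.fc = nn.Linear(model.fc.in_features, 2)
```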
Section 17- Convolutional Networks Visualization
In this section, we will visualize what the neural networks output and what they are really learning. We will observe the feature maps of every layer of the network!
Section 18 – YOLO Object Detection (Theory)
In this section, we will learn one of the most famous Object Detection Frameworks: YOLO! This section covers the theory of YOLO in depth.
Section 19 – Autoencoders and Variational Autoencoders
In this section, we will cover Autoencoders and Denoising Autoencoders. We will then see the problem they face and learn how to mitigate it with Variational Autoencoders.
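At its core, an autoencoder is just an encoder that compresses the input and a decoder that reconstructs it; a minimal sketch with illustrative sizes:

```python
import torch.nn as nn

# Compress 784-dim inputs (e.g. flattened 28x28 images) to a 32-dim
# latent code, then reconstruct them.
autoencoder = nn.Sequential(
    nn.Linear(784, 32), nn.ReLU(),    # encoder
    nn.Linear(32, 784), nn.Sigmoid()  # decoder, outputs in [0, 1]
)
```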
Section 20 – Recurrent Neural Networks
In this section, we will introduce you to Recurrent Neural Networks and all their concepts. We will then discuss Backpropagation Through Time and the vanishing gradient problem, and finally Long Short-Term Memory (LSTM), which solves the problems plain RNNs suffer from.
Section 21 – Word Embeddings
In this section, we will discuss how words are represented as features. We will then show you some Word Embedding models. We will also show you how to implement word embedding in PyTorch!
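An embedding layer is simply a learnable lookup table from word indices to dense vectors; a short sketch (vocabulary and dimension sizes are placeholders):

```python
import torch
import torch.nn as nn

embedding = nn.Embedding(num_embeddings=1000, embedding_dim=50)
word_ids = torch.tensor([4, 13, 7])  # indices of three words
vectors = embedding(word_ids)        # shape (3, 50), learned during training
```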
Section 22 – Practical Recurrent Networks in PyTorch
In this section, we will apply Recurrent Neural Networks using LSTMs in PyTorch to generate text similar to the story of Alice in Wonderland! You can just replace the story with any other text you want, and the RNN will be able to generate text similar to it!
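A character-level LSTM for this kind of text generation might be structured as follows (vocabulary, embedding and hidden sizes are placeholders):

```python
import torch.nn as nn

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)  # next-character logits

    def forward(self, x, hidden=None):
        out, hidden = self.lstm(self.embed(x), hidden)
        return self.fc(out), hidden  # hidden state is reused when sampling
```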
Section 23 – Sequence Modelling
In this section, we will learn about Sequence-to-Sequence Modelling. We will see how Seq2Seq models work and where they are applied. We’ll also talk about Attention mechanisms and see how they work.
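The heart of most attention mechanisms is a weighted sum over values, with weights computed from query-key similarity; a minimal scaled dot-product attention in PyTorch:

```python
import torch
import torch.nn.functional as F

def attention(query, key, value):
    d_k = query.size(-1)
    scores = query @ key.transpose(-2, -1) / d_k ** 0.5  # similarity scores
    weights = F.softmax(scores, dim=-1)                  # attention weights
    return weights @ value                               # weighted sum

q = k = v = torch.randn(1, 5, 16)  # 5 positions, dimension 16
out = attention(q, k, v)           # shape (1, 5, 16)
```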
Section 24 – Practical Sequence Modelling in PyTorch – Build a Chatbot
In this section, we will apply what we learned about sequence modeling and build a Chatbot with Attention Mechanism.
Section 25 – Saving and Loading Models
In this section, we will show you how to save and load models in PyTorch, so you can use these models either for later testing, or for resuming training!
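The standard PyTorch pattern saves only the learned parameters (the state dict) and loads them back into a freshly constructed model; a minimal sketch with a stand-in model:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)  # stand-in for any trained model
torch.save(model.state_dict(), 'model.pth')  # save parameters only

model = nn.Linear(10, 2)  # re-create the same architecture...
model.load_state_dict(torch.load('model.pth'))  # ...then load the weights
model.eval()  # switch to inference mode before testing
```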
Section 26 – Transformers
In this section, we will cover the Transformer, which is the current state-of-the-art model for NLP and language modeling tasks. We will go through each component of a transformer.
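PyTorch ships the standard encoder components, so a small Transformer encoder can be assembled in a few lines (all sizes illustrative):

```python
import torch
import torch.nn as nn

# One encoder layer = multi-head self-attention + feed-forward sublayers.
layer = nn.TransformerEncoderLayer(d_model=128, nhead=8, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

tokens = torch.randn(1, 10, 128)  # (batch, sequence length, embedding dim)
encoded = encoder(tokens)         # same shape, now contextualized
```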
Section 27 – Build a Chatbot with Transformers
In this section, we will apply everything we learned in the previous section to build a Chatbot using Transformers.
Course Curriculum
Chapter 1: How Neural Networks and Backpropagation Works
Lecture 1: BEFORE STARTING…PLEASE READ THIS
Lecture 2: What Can Deep Learning Do?
Lecture 3: The Rise of Deep Learning
Lecture 4: The Essence of Neural Networks
Lecture 5: The Perceptron
Lecture 6: Gradient Descent
Lecture 7: The Forward Propagation
Lecture 8: Before Proceeding with the Backpropagation
Lecture 9: Backpropagation Part 1
Lecture 10: Backpropagation Part 2
Chapter 2: Loss Functions
Lecture 1: Mean Squared Error (MSE)
Lecture 2: L1 Loss (MAE)
Lecture 3: Huber Loss
Lecture 4: Binary Cross Entropy Loss
Lecture 5: Cross Entropy Loss
Lecture 6: Softmax Function
Lecture 7: Softmax with Temperature: Controlling your distribution
Lecture 8: KL divergence Loss
Lecture 9: Contrastive Loss
Lecture 10: Hinge Loss
Lecture 11: Triplet Ranking Loss
Lecture 12: Practical Loss Functions Note
Chapter 3: Activation Functions
Lecture 1: Why we need activation functions
Lecture 2: Sigmoid Activation
Lecture 3: Tanh Activation
Lecture 4: ReLU and PReLU
Lecture 5: Exponentially Linear Units (ELU)
Lecture 6: Gated Linear Units (GLU)
Lecture 7: Swish Activation
Lecture 8: Mish Activation
Chapter 4: Regularization and Normalization
Lecture 1: Overfitting
Lecture 2: L1 and L2 Regularization
Lecture 3: Dropout
Lecture 4: DropConnect
Lecture 5: Normalization
Lecture 6: Batch Normalization
Lecture 7: Layer Normalization
Lecture 8: Group Normalization
Chapter 5: Optimization
Lecture 1: Batch Gradient Descent
Lecture 2: Stochastic Gradient Descent
Lecture 3: Mini-Batch Gradient Descent
Lecture 4: Exponentially Weighted Average Intuition
Lecture 5: Exponentially Weighted Average Implementation
Lecture 6: Bias Correction in Exponentially Weighted Averages
Lecture 7: Momentum
Lecture 8: RMSProp
Lecture 9: Adam Optimization
Lecture 10: SWATS – Switching from Adam to SGD
Lecture 11: Weight Decay
Lecture 12: Decoupling Weight Decay
Lecture 13: AMSGrad
Chapter 6: Hyperparameter Tuning and Learning Rate Scheduling
Lecture 1: Introduction to Hyperparameter Tuning and Learning Rate Recap
Lecture 2: Step Learning Rate Decay
Lecture 3: Cyclic Learning Rate
Lecture 4: Cosine Annealing with Warm Restarts
Lecture 5: Batch Size vs Learning Rate
Chapter 7: Weight Initialization
Lecture 1: Normal Distribution
Lecture 2: What happens when all weights are initialized to the same value?
Lecture 3: Xavier Initialization
Lecture 4: He Norm Initialization
Lecture 5: Practical Weight Initialization Note
Chapter 8: Introduction to PyTorch
Lecture 1: CODE FOR THIS COURSE
Lecture 2: Computation Graphs and Deep Learning Frameworks
Lecture 3: Installing PyTorch and an Introduction
Lecture 4: How PyTorch Works
Lecture 5: Torch Tensors – Part 1
Lecture 6: Torch Tensors – Part 2
Lecture 7: Numpy Bridge, Tensor Concatenation and Adding Dimensions
Lecture 8: Automatic Differentiation
Lecture 9: Loss Functions in PyTorch
Lecture 10: Weight Initialization in PyTorch
Chapter 9: Data Augmentation
Lecture 1: Introduction to Data Augmentation
Lecture 2: Data Augmentation Techniques Part 1
Lecture 3: Data Augmentation Techniques Part 2
Lecture 4: Data Augmentation Techniques Part 3
Chapter 10: Practical Neural Networks in PyTorch – Application 1: Diabetes
Lecture 1: Download the Dataset
Lecture 2: Part 1: Data Preprocessing
Lecture 3: Part 2: Data Normalization
Lecture 4: Part 3: Creating and Loading the Dataset
Lecture 5: Part 4: Building the Network
Lecture 6: Part 5: Training the Network
Chapter 11: Visualize the Learning Process
Lecture 1: Visualize Learning Part 1
Lecture 2: Visualize Learning Part 2
Lecture 3: Visualize Learning Part 3
Lecture 4: Visualize Learning Part 4
Lecture 5: Visualize Learning Part 5
Lecture 6: Visualize Learning Part 6
Lecture 7: Neural Networks Playground
Chapter 12: Implementing a Neural Network from Scratch with Numpy
Instructors
- Fawaz Sammani, Computer Vision Researcher
Rating Distribution
- 1 star: 45 votes
- 2 stars: 57 votes
- 3 stars: 232 votes
- 4 stars: 742 votes
- 5 stars: 1313 votes
Frequently Asked Questions
How long do I have access to the course materials?
You can view and review the lecture materials indefinitely, like an on-demand channel.
Can I take my courses with me wherever I go?
Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!