Supervised Machine Learning in Python
Supervised Machine Learning in Python, priced at $64.99, has an average rating of 4.15, with 79 lectures, based on 25 reviews, and has 218 subscribers.
You will learn about regression and classification models, linear models, decision trees, Naive Bayes, k-nearest neighbors, Support Vector Machines, neural networks, Random Forest, Gradient Boosting, XGBoost, voting, stacking, performance metrics (RMSE, MAPE, accuracy, precision, ROC curve, and more), feature importance, SHAP, Recursive Feature Elimination, hyperparameter tuning, and cross-validation. This course is ideal for Python developers, data scientists, computer engineers, researchers, and students.
Enroll now: Supervised Machine Learning in Python
Summary
Title: Supervised Machine Learning in Python
Price: $64.99
Average Rating: 4.15
Number of Lectures: 79
Number of Published Lectures: 79
Number of Curriculum Items: 79
Number of Published Curriculum Objects: 79
Original Price: $29.99
Quality Status: approved
Status: Live
What You Will Learn
- Regression and classification models
- Linear models
- Decision trees
- Naive Bayes
- k-nearest neighbors
- Support Vector Machines
- Neural networks
- Random Forest
- Gradient Boosting
- XGBoost
- Voting
- Stacking
- Performance metrics (RMSE, MAPE, Accuracy, Precision, ROC Curve…)
- Feature importance
- SHAP
- Recursive Feature Elimination
- Hyperparameter tuning
- Cross-validation
Who Should Attend
- Python developers
- Data Scientists
- Computer engineers
- Researchers
- Students
Target Audiences
- Python developers
- Data Scientists
- Computer engineers
- Researchers
- Students
In this practical course, we are going to focus on supervised machine learning and how to apply it in the Python programming language.
Supervised machine learning is a branch of artificial intelligence whose goal is to create predictive models starting from a dataset. With the proper optimization of the models, it is possible to create mathematical representations of our data in order to extract the information hidden inside it and use it for making inferences and predictions.
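As a minimal sketch of that idea (the synthetic data and model choice here are illustrative, not taken from the course), scikit-learn's fit/predict pattern builds a predictive model directly from a dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic dataset: 100 samples, 3 features, a known linear relationship
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=100)

model = LinearRegression()
model.fit(X, y)              # learn the mathematical representation from the data
preds = model.predict(X[:5]) # make predictions on new observations
```

The same `fit`/`predict` interface applies to every scikit-learn estimator covered in the course.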
A very powerful use of supervised machine learning is the calculation of feature importance, which helps us better understand the information behind the data and allows us to reduce the dimensionality of our problem by considering only the relevant information and discarding all the useless variables. A common approach for calculating feature importance is the SHAP technique.
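For instance, tree-based models in scikit-learn expose a built-in importance score per feature. This sketch uses synthetic data where only the first two columns carry signal (the data and model are illustrative assumptions, not course material):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 3 * X[:, 0] + X[:, 1]  # features 2 and 3 are pure noise

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
importances = forest.feature_importances_  # one score per feature, summing to 1
```

The noisy features receive near-zero importance, which is the basis for discarding useless variables. The SHAP technique (via the separate `shap` package) generalizes this idea to models that do not expose importances natively.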
Finally, the proper optimization of a model is possible using hyperparameter tuning techniques that make use of cross-validation.
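A minimal sketch of that combination (synthetic data; the parameter grid is an illustrative assumption): grid search tries each candidate hyperparameter value and scores it with k-fold cross-validation.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 5))
y = X @ np.array([1.0, 2.0, 0.0, 0.0, -1.0]) + rng.normal(scale=0.5, size=120)

# Each alpha is evaluated with 5-fold cross-validation; the best mean score wins
search = GridSearchCV(Ridge(), param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
search.fit(X, y)
best_alpha = search.best_params_["alpha"]
```

After fitting, `search.best_estimator_` is already refit on the full training set with the winning hyperparameters.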
With this course, you are going to learn:
- What supervised machine learning is
- What overfitting and underfitting are and how to avoid them
- The difference between regression and classification models
- Linear models
  - Linear regression
  - Lasso regression
  - Ridge regression
  - Elastic Net regression
  - Logistic regression
- Decision trees
- Naive Bayes
- K-nearest neighbors
- Support Vector Machines
  - Linear SVM
  - Non-linear SVM
- Feedforward neural networks
- Ensemble models
  - Bias-variance tradeoff
  - Bagging and Random Forest
  - Boosting and Gradient Boosting
  - Voting
  - Stacking
- Performance metrics
  - Regression
    - Root Mean Squared Error
    - Mean Absolute Error
    - Mean Absolute Percentage Error
  - Classification
    - Confusion matrix
    - Accuracy and balanced accuracy
    - Precision
    - Recall
    - ROC curve and the area under it
    - Multi-class metrics
- Feature importance
  - How to calculate feature importance according to a model
  - SHAP technique for calculating feature importance according to any model
  - Recursive Feature Elimination for dimensionality reduction
- Hyperparameter tuning
  - k-fold cross-validation
  - Grid search
  - Random search
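Of the techniques listed above, Recursive Feature Elimination is worth a quick sketch (with synthetic data; the setup is illustrative, not from the course): it repeatedly fits a model and drops the weakest feature until the requested number remains.

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(150, 6))
y = 4 * X[:, 1] - 3 * X[:, 4]  # only features 1 and 4 matter

# Recursively eliminate the least important feature until two remain
selector = RFE(LinearRegression(), n_features_to_select=2).fit(X, y)
mask = selector.support_  # boolean mask of the kept features
```

`selector.transform(X)` then returns the reduced dataset containing only the selected columns.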
All the lessons of this course start with a brief introduction and end with a practical example in the Python programming language, using its powerful scikit-learn library. The environment that will be used is Jupyter, which is a standard in the data science industry. All the Jupyter notebooks are downloadable.
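The typical shape of such a practical example (this sketch uses a scikit-learn toy dataset generator and a logistic regression; the specifics are an assumption, not the course's actual notebooks) is: split the data, fit, then evaluate on the held-out set.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = clf.predict(X_test)

acc = accuracy_score(y_test, y_pred)       # fraction of correct predictions
cm = confusion_matrix(y_test, y_pred)      # rows: true class, columns: predicted
```

Evaluating on data the model never saw during training is what reveals overfitting, one of the course's first topics.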
Course Curriculum
Chapter 1: Introduction to supervised machine learning
Lecture 1: Introduction to the course
Lecture 2: What is supervised machine learning?
Lecture 3: Regression and classification models
Lecture 4: Overfitting and underfitting
Chapter 2: The tools used in this course
Lecture 1: Required Python packages
Lecture 2: Jupyter notebook
Lecture 3: Sklearn API
Chapter 3: Linear models
Lecture 1: Introduction to Linear Regression
Lecture 2: Linear regression in Python
Lecture 3: Introduction to Ridge Regression
Lecture 4: Ridge regression in Python
Lecture 5: Introduction to Lasso Regression
Lecture 6: Lasso regression in Python
Lecture 7: Introduction to Elastic Net Regression
Lecture 8: Elastic Net Regression in Python
Lecture 9: Introduction to Logistic Regression for classification
Lecture 10: Logistic regression in Python
Chapter 4: Decision trees
Lecture 1: Introduction to decision trees
Lecture 2: Decision trees in Python
Chapter 5: K-nearest neighbors
Lecture 1: Introduction to KNN
Lecture 2: KNN in Python
Chapter 6: Naive Bayes
Lecture 1: Introduction to Naive Bayes
Lecture 2: Categorical Naive Bayes in Python
Lecture 3: Bernoulli Naive Bayes in Python
Lecture 4: Gaussian Naive Bayes in Python
Chapter 7: Support Vector Machines
Lecture 1: Introduction to SVM
Lecture 2: Linear SVM in Python
Lecture 3: Non-linear SVM in Python
Chapter 8: Neural Networks
Lecture 1: Introduction to Neural Networks
Lecture 2: Neural Networks in Python
Chapter 9: Introduction to ensemble models
Lecture 1: Ensemble models and bias-variance tradeoff
Chapter 10: Ensemble models: bagging
Lecture 1: Introduction to bagging
Lecture 2: Bagging in Python
Lecture 3: Introduction to Random Forest
Lecture 4: Random Forest in Python
Lecture 5: Introduction to Extremely Randomized Trees
Lecture 6: Extremely Randomized Trees in Python
Chapter 11: Ensemble models: boosting
Lecture 1: Introduction to boosting
Lecture 2: Boosting in Python
Lecture 3: Introduction to Gradient Boosting
Lecture 4: Gradient Boosting in Python
Lecture 5: XGBoost in Python
Chapter 12: Ensemble models: voting
Lecture 1: Introduction to voting
Lecture 2: Voting in Python
Chapter 13: Ensemble models: stacking
Lecture 1: Introduction to stacking
Lecture 2: Stacking in Python
Chapter 14: Performance evaluation
Lecture 1: Regression performance metrics
Lecture 2: Regression performance metrics in Python
Lecture 3: Pairplot in Python
Lecture 4: Binary classification performance metrics
Lecture 5: Binary classification performance metrics in Python
Lecture 6: Introduction to ROC curve
Lecture 7: ROC curve in Python
Lecture 8: Multi-class classification performance metrics
Lecture 9: Multi-class classification performance metrics in Python
Lecture 10: When to use classification performance metrics
Chapter 15: Cross-Validation and hyperparameter tuning
Lecture 1: Introduction to k-fold cross-validation
Lecture 2: k-fold cross-validation in Python
Lecture 3: The need for hyperparameter tuning
Lecture 4: Introduction to grid search
Lecture 5: Grid search in Python
Lecture 6: Introduction to Random Search
Lecture 7: Random Search in Python
Chapter 16: Feature importance and model interpretation
Lecture 1: What is feature importance?
Lecture 2: Models that calculate feature importance in Python
Lecture 3: Introduction to SHAP
Lecture 4: Using SHAP with tree-based models in Python
Lecture 5: Using SHAP with every model in Python
Chapter 17: Recursive Feature Elimination
Lecture 1: Introduction to RFE
Lecture 2: RFE in Python
Chapter 18: Practical examples in Python
Lecture 1: A complete pipeline: model selection and hyperparameter tuning
Lecture 2: Feature selection with Lasso
Lecture 3: Dimensionality reduction with RFE
Lecture 4: How to choose the right scaler
Chapter 19: Persisting our model
Lecture 1: Pickle library
Chapter 20: Practical approaches
Lecture 1: The curse of dimensionality
Lecture 2: The importance of pre-processing
Lecture 3: The importance of the right features against the model
Lecture 4: Interpretability of a model
Instructors
- Gianluca Malato (Your Data Teacher)
Rating Distribution
- 1 star: 1 vote
- 2 stars: 0 votes
- 3 stars: 3 votes
- 4 stars: 8 votes
- 5 stars: 13 votes
Frequently Asked Questions
How long do I have access to the course materials?
You can view and review the lecture materials indefinitely, like an on-demand channel.
Can I take my courses with me wherever I go?
Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!