Master Decision Trees and Random Forests with Scikit-learn
Master Decision Trees and Random Forests with Scikit-learn, available for $39.99, has an average rating of 4.3 (based on 5 reviews), 101 lectures, 22 quizzes, and 1,223 subscribers.
You will learn how decision trees and random forests make their predictions, and how to use Scikit-learn’s DecisionTreeClassifier, RandomForestClassifier, DecisionTreeRegressor, and RandomForestRegressor to define, fit, tune, prune, and investigate these models and to understand the predictive structure of data sets. Topics include node splitting and impurity measures (Gini diversity, entropy, mean squared and absolute error, Poisson deviance), cross-validation, grid and randomized search, minimal cost-complexity pruning, feature importance, bootstrapping and bagging, out-of-bag scoring, calibration of probability estimates, working with imbalanced class values, and the effect of noisy data on random forests’ prediction performance. The course is aimed at professionals, students, and anybody who wants to use decision trees or random forests for prediction projects with the Python Scikit-learn library; the detailed learning objectives and target audiences are listed below.
Enroll now: Master Decision Trees and Random Forests with Scikit-learn
Summary
Title: Master Decision Trees and Random Forests with Scikit-learn
Price: $39.99
Average Rating: 4.3
Number of Lectures: 101
Number of Quizzes: 22
Number of Published Lectures: 101
Number of Published Quizzes: 22
Number of Curriculum Items: 123
Number of Published Curriculum Objects: 123
Original Price: $29.99
Quality Status: approved
Status: Live
What You Will Learn
- Learn how decision trees and random forests make their predictions.
- Learn how to use Scikit-learn for prediction with decision trees and random forests and for understanding the predictive structure of data sets.
- Predict purchases and prices with decision trees and random forests.
- Learn about each parameter of Scikit-learn’s DecisionTreeClassifier and RandomForestClassifier methods to define your decision tree or random forest (see the sketch after this list).
- Learn to use the output of Scikit-learn’s DecisionTreeClassifier and RandomForestClassifier methods to investigate and understand your predictions.
- Learn how to work with imbalanced class values in the data and how noisy data can affect random forests’ prediction performance.
- Growing decision trees: node splitting, node impurity, Gini diversity, entropy, mean squared and absolute error, Poisson deviance, feature thresholds.
- Improving decision trees: cross-validation, grid/randomized search, tuning and minimal cost-complexity pruning, evaluating feature importance.
- Creating random forests: bootstrapping, bagging, random feature selection, decorrelation of tree predictions.
- Improving random forests: cross-validation, grid/randomized search, tuning, out-of-bag scoring, calibration of probability estimates.
- Learn to use Scikit-learn’s DecisionTreeRegressor and RandomForestRegressor methods to fit and improve your regression decision tree or random forest.
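The items above map directly onto a few lines of Scikit-learn code. As a minimal sketch (synthetic data, not taken from the course notebooks), the following fits a DecisionTreeClassifier and a RandomForestClassifier on an imbalanced data set, using some of the parameters mentioned above (criterion, max_depth, class_weight, oob_score):

```python
# Minimal sketch: decision tree and random forest classifiers in Scikit-learn.
# The data set is synthetic and only illustrates the course topics listed above.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with a roughly 9:1 class imbalance.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y,
                                                    random_state=0)

# Decision tree: criterion, max_depth and class_weight are parameters the
# course walks through lecture by lecture.
tree = DecisionTreeClassifier(criterion="gini", max_depth=4,
                              class_weight="balanced", random_state=0)
tree.fit(X_train, y_train)

# Random forest: bagged trees with out-of-bag scoring enabled.
forest = RandomForestClassifier(n_estimators=200, oob_score=True,
                                class_weight="balanced", random_state=0)
forest.fit(X_train, y_train)

print("tree test accuracy:  ", tree.score(X_test, y_test))
print("forest test accuracy:", forest.score(X_test, y_test))
print("forest OOB score:    ", forest.oob_score_)
```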
Who Should Attend
- Professionals, students, anybody who wants to use decision trees and random forests for making predictions with data.
- Professionals, students, anybody who works with data on projects and wants to know more about decision trees or random forest after an initial experience using them.
- Professionals, students, anybody interested in doing prediction projects with the Python Scikit-learn library using decision trees or random forests.
The lessons of this course help you master the use of decision trees and random forests in your data analysis projects. You will learn how to address classification and regression problems with decision trees and random forests. The course focuses on decision tree classifiers and random forest classifiers because most successful machine learning applications appear to be classification problems. The lessons explain:
- Decision trees for classification and regression problems.
- Elements of growing decision trees.
- The sklearn parameters to define decision tree classifiers and regressors.
- Prediction with decision trees using Scikit-learn (fitting, pruning/tuning, investigating; see the sketch after this list).
- The sklearn parameters to define random forest classifiers and regressors.
- Prediction with random forests using Scikit-learn (fitting, tuning, investigating).
- The ideas behind random forests for prediction.
- Characteristics of fitted decision trees and random forests.
- Importance of data and understanding prediction performance.
- How you can carry out a prediction project using decision trees and random forests.
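As an illustration of the pruning and tuning lessons, here is a minimal sketch (again with synthetic data, not the course’s own notebooks) that takes candidate ccp_alpha values from the cost-complexity pruning path and tunes a decision tree with cross-validated grid search:

```python
# Minimal sketch: tuning and minimal cost-complexity pruning of a decision tree.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# Candidate ccp_alpha values come from the cost-complexity pruning path;
# clip at zero to guard against tiny negative values from rounding.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
ccp_alphas = np.clip(path.ccp_alphas, 0.0, None)

param_grid = {"ccp_alpha": ccp_alphas[::5], "max_depth": [3, 5, None]}
search = GridSearchCV(DecisionTreeClassifier(random_state=0), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)
# Feature importances of the best pruned tree.
print(search.best_estimator_.feature_importances_)
```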
Focusing on classification problems, the course uses the DecisionTreeClassifier and RandomForestClassifier methods of Python’s Scikit-learn library to explain all the details you need for understanding decision trees and random forests. It also explains and demonstrates Scikit-learn’s DecisionTreeRegressor and RandomForestRegressor methods to address regression problems (a brief regression sketch follows). It prepares you for using decision trees and random forests to make predictions and for understanding the predictive structure of data sets.
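For the regression side, a minimal sketch (synthetic data, not from the course) showing that the regressors follow the same fit/score pattern as the classifiers:

```python
# Minimal sketch: regression with DecisionTreeRegressor and RandomForestRegressor.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)

tree_reg = DecisionTreeRegressor(max_depth=5, random_state=0)
forest_reg = RandomForestRegressor(n_estimators=200, random_state=0)

tree_reg.fit(X, y)
forest_reg.fit(X, y)

# R^2 on the training data, just to show the scoring interface.
print("tree R^2:  ", tree_reg.score(X, y))
print("forest R^2:", forest_reg.score(X, y))
```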
This is what is inside the lessons:
This course is for people who want to use decision trees or random forests for prediction with Scikit-learn. This requires practical experience, and the course provides Jupyter notebooks to review and practice the lessons’ topics.
Each lesson is a short video. Most lessons explain something about decision trees or random forests with an example in a Jupyter notebook. The course materials include more than 50 Jupyter notebooks and the corresponding Python code. You can download the notebooks of the lessons for review, and you can also use them to try other definitions of decision trees and random forests, or other data, for further practice.
What students commented on this course:
- Valuable information.
- Clear explanations.
- Knowledgeable instructor.
- Helpful practice activities.
Course Curriculum
Chapter 1: Classification and Decision Trees
Lecture 1: Introduction
Lecture 2: Software
Lecture 3: Study guide
Lecture 4: Classification
Lecture 5: Purposes of classification
Lecture 6: Classification and decision trees
Lecture 7: End of this section
Chapter 2: Decision Trees
Lecture 1: Introduction
Lecture 2: Introduction to decision trees
Lecture 3: Data partitioning
Lecture 4: Learning
Lecture 5: An additional node split
Lecture 6: Impurity
Lecture 7: Quality of node splits
Lecture 8: Another classification problem
Lecture 9: Data preparation
Lecture 10: Fitting the tree
Lecture 11: Plotting the tree
Lecture 12: Binary splits
Lecture 13: The Gini diversity index
Lecture 14: Growing a decision tree
Lecture 15: A note on the RandomForestClassifier
Lecture 16: The DecisionTreeClassifier method
Lecture 17: The criterion parameter
Lecture 18: The splitter parameter
Lecture 19: The max_depth parameter
Lecture 20: The min_samples_split parameter
Lecture 21: The min_samples_leaf parameter
Lecture 22: The class_weight parameter
Lecture 23: The min_weight_fraction parameter
Lecture 24: The random_state parameter
Lecture 25: The max_features parameter
Lecture 26: The max_leaf_nodes parameter
Lecture 27: The min_impurity_decrease parameter
Lecture 28: The ccp_alpha parameter
Lecture 29: Minimal cost-complexity pruning
Lecture 30: Prediction with a classification tree
Lecture 31: Cross-validation and prediction
Lecture 32: Pruning a tree and prediction
Lecture 33: Tuning and cross-validation
Lecture 34: Pruning a tree with ‘optimized’ parameters
Lecture 35: Feature importance
Lecture 36: Attributes of DecisionTreeClassifier
Lecture 37: The tree_ object of DecisionTreeClassifier
Lecture 38: Advantages and disadvantages of decision trees
Lecture 39: End of this section
Chapter 3: Random Forests
Lecture 1: Introduction
Lecture 2: A bootstrap example
Lecture 3: Bagging 15 classification trees
Lecture 4: Random forests and decorrelation
Lecture 5: The RandomForestClassifier method
Lecture 6: The n_estimators parameter
Lecture 7: The bootstrap and oob_score parameters
Lecture 8: The max_samples parameter
Lecture 9: The warm_start parameter
Lecture 10: The n_jobs parameter
Lecture 11: The verbose parameter
Lecture 12: Tuning a random forest
Lecture 13: Attributes of the RandomForestClassifier method
Lecture 14: Advantages and disadvantages of Random Forests
Lecture 15: Random forests and logistic regression
Lecture 16: Random forests and probabilities
Lecture 17: Weighted random forests and imbalanced data
Lecture 18: Over-sampling and under-sampling
Lecture 19: Balanced random forests
Lecture 20: Random forests and noise in features
Lecture 21: Random forests and noise in class values
Lecture 22: End of this section
Chapter 4: Application: online purchases
Lecture 1: Introduction
Lecture 2: Why predicting?
Lecture 3: Available data
Lecture 4: A closer look at the data set
Lecture 5: A closer look at the analytics information
Lecture 6: Fitting and pruning a decision tree
Instructors
- Wim Koevoets: Applied statistics, econometrics, data science
Rating Distribution
- 1 stars: 0 votes
- 2 stars: 1 vote
- 3 stars: 0 votes
- 4 stars: 2 votes
- 5 stars: 2 votes
Frequently Asked Questions
How long do I have access to the course materials?
You can view and review the lecture materials indefinitely, like an on-demand channel.
Can I take my courses with me wherever I go?
Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!