XGBoost Machine Learning for Data Science and Kaggle
XGBoost Machine Learning for Data Science and Kaggle is priced at $49.99, has 63 lectures and 534 subscribers, and holds an average rating of 3.85 based on 79 reviews.
You will learn how the XGBoost algorithm predicts different model targets, the role decision trees play in gradient boosting and XGBoost, why XGBoost is one of the most powerful and stable machine learning methods in Kaggle contests, how to explore, clean, and prepare data, how to implement the different types of XGBoost models in Python, how to perform feature engineering, statistical analysis, and feature selection, how to choose evaluation measures and model objectives, and how to perform cross-validation and parameter tuning. The full learning objectives and target audiences are listed below. This course is ideal for anyone who enjoys Kaggle contests or who wishes to learn how to apply machine learning and data science approaches to business.
Enroll now: XGBoost Machine Learning for Data Science and Kaggle
Summary
Title: XGBoost Machine Learning for Data Science and Kaggle
Price: $49.99
Average Rating: 3.85
Number of Lectures: 63
Number of Published Lectures: 63
Number of Curriculum Items: 63
Number of Published Curriculum Objects: 63
Original Price: $89.99
Quality Status: approved
Status: Live
What You Will Learn
- How the XGBoost algorithm works to predict different model targets
- What roles decision trees play in gradient boosting and XGBoost modeling
- Why XGBoost is so far one of the most powerful and stable machine learning methods in Kaggle contests
- How to explain and set appropriate XGBoost modeling parameters
- How to apply data exploration, cleaning, and preparation for the XGBoost method
- How to effectively implement the different types of XGBoost models using the packages in Python (see the sketch after this list)
- How to perform feature engineering in XGBoost predictive modeling
- How to conduct statistical analysis and feature selection in XGBoost modeling
- How to explain and select the typical evaluation measures and model objectives for building XGBoost models
- How to perform cross-validation and determine the best parameter thresholds
- How to carry out parameter tuning in XGBoost model building
- How to successfully apply XGBoost to solving various machine learning problems
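As a taste of what the Python-side lectures cover, here is a minimal sketch of fitting an XGBoost classifier with the scikit-learn-style API and scoring it with cross-validation. The dataset, parameter values, and variable names are illustrative assumptions, not material taken from the course.

```python
# Minimal sketch (not course code): fit an XGBoost classifier and
# cross-validate it using the scikit-learn-style API.
from sklearn.datasets import load_breast_cancer          # illustrative dataset
from sklearn.model_selection import train_test_split, cross_val_score
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Parameter values here are illustrative starting points, not recommendations.
model = XGBClassifier(
    n_estimators=200,       # number of boosted trees
    max_depth=4,            # depth of each tree
    learning_rate=0.1,      # shrinkage applied to each tree's contribution
    eval_metric="logloss",  # evaluation measure used during training
)

# 5-fold cross-validation on the training split.
cv_scores = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")
print("CV AUC:", cv_scores.mean())

model.fit(X_train, y_train)
print("Hold-out accuracy:", model.score(X_test, y_test))
```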
Who Should Attend
- Anyone who enjoys Kaggle contests
- Anyone who wishes to learn how to apply machine learning and data science approaches to business
Target Audiences
- Anyone who enjoys Kaggle contests
- Anyone who wishes to learn how to apply machine learning and data science approaches to business
The future belongs to the AI era of machine learning, so mastering the application of machine learning is like holding a key to your future career. If you could learn only one tool or algorithm for machine learning or building predictive models right now, what would it be? Without a doubt, XGBoost! If you are going to participate in a Kaggle contest, what is your preferred modeling tool? Again, the answer is XGBoost! This has been proven by countless experienced data scientists and newcomers alike. So go ahead and register for this course!
XGBoost is famous in Kaggle contests for its excellent accuracy, speed, and stability. According to one survey, more than 70% of top Kaggle winners said they have used XGBoost.
XGBoost is genuinely useful and serves many purposes in the data science world. This powerful algorithm is frequently used to predict various types of targets (continuous, binary, and categorical), and it has also proven very effective at solving multiclass and multilabel classification problems. In addition, the contests on the Kaggle platform cover almost every application and industry, such as retail, banking, insurance, pharmaceutical research, traffic control, and credit risk management.
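To make the range of targets concrete, the sketch below shows how the XGBoost scikit-learn-style estimators can be pointed at a continuous, a binary, and a multiclass target simply by choosing the estimator and its objective. The data, objectives shown, and class counts are hypothetical examples, not taken from the course.

```python
# Minimal sketch (illustrative only): one library, three kinds of targets.
import numpy as np
from xgboost import XGBRegressor, XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))

# Continuous target -> regression objective.
y_cont = 2.0 * X[:, 0] + rng.normal(size=500)
XGBRegressor(objective="reg:squarederror").fit(X, y_cont)

# Binary target -> logistic objective.
y_bin = (X[:, 1] > 0).astype(int)
XGBClassifier(objective="binary:logistic", eval_metric="logloss").fit(X, y_bin)

# Multiclass target -> softprob objective (three hypothetical classes,
# inferred from the labels by the scikit-learn wrapper).
y_multi = np.digitize(X[:, 2], bins=[-0.5, 0.5])
XGBClassifier(objective="multi:softprob", eval_metric="mlogloss").fit(X, y_multi)
```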
XGBoost is powerful, but it is not easy to exercise its full capabilities without expert guidance. For example, to implement the XGBoost algorithm successfully, you also need to understand and adjust many parameter settings. To that end, I will teach you the underlying algorithm so that you can configure XGBoost for different data and application scenarios. In addition, I will provide intensive lectures on feature engineering, feature selection, and parameter tuning aimed at XGBoost, so after the training you will also be able to prepare suitable data and features that feed the XGBoost model well.
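For a flavor of what tuning and feature selection look like in practice, here is a minimal sketch that grid-searches a few common XGBoost parameters and then inspects feature importances from the best model. The synthetic dataset, the search grid, and the scoring choice are assumptions for illustration, not the course's own recipe.

```python
# Minimal sketch (illustrative only): grid-search a few XGBoost parameters
# and inspect feature importances afterwards.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# A small, hypothetical search grid; real grids depend on the data.
param_grid = {
    "max_depth": [3, 5],
    "learning_rate": [0.05, 0.1],
    "n_estimators": [100, 300],
    "subsample": [0.8, 1.0],
}

search = GridSearchCV(
    XGBClassifier(eval_metric="logloss"),
    param_grid, cv=3, scoring="roc_auc")
search.fit(X, y)

print("Best parameters:", search.best_params_)
print("Best CV AUC:", search.best_score_)

# Feature importances from the best model; low-importance features are
# natural candidates for removal in a feature-selection pass.
importances = search.best_estimator_.feature_importances_
print("Top 5 feature indices:", importances.argsort()[::-1][:5])
```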
This course is highly practical but not lacking in theory. We start from decision trees and their related concepts and components, move on to constructing gradient boosting methods, and then proceed to XGBoost modeling. Math and statistics are applied in moderation to explain the mechanisms behind all of the machine learning methods. We use Python pandas data frames for data exploration and cleaning. One significant feature of this course is that many Python program examples are used to demonstrate every single knowledge point and skill taught in the lectures.
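Since the course builds up from decision trees to gradient boosting before introducing XGBoost, the following sketch illustrates the core idea using plain scikit-learn decision trees: each new shallow tree is fit to the residuals of the current ensemble and added with a shrinkage factor. The data and settings are hypothetical, and the loop is a simplification of what the lectures cover.

```python
# Minimal sketch (illustrative only): hand-rolled gradient boosting for
# squared-error loss, where each new tree fits the ensemble's residuals.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=400)

learning_rate = 0.1
trees = []
prediction = np.full_like(y, y.mean())   # start from the mean prediction

for _ in range(100):
    residuals = y - prediction                      # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    prediction += learning_rate * tree.predict(X)   # shrink and add the new tree
    trees.append(tree)

print("Training MSE:", float(np.mean((y - prediction) ** 2)))
```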
Course Curriculum
Chapter 1: Introduction
Lecture 1: What am I teaching in this course
Lecture 2: Introduction to XGBoost modeling
Lecture 3: Walk through gradient boost machine
Lecture 4: Introduce advantages and applications of XGBoost (1)
Lecture 5: Introduce advantages and applications of XGBoost (2)
Lecture 6: Introduce advantages and applications of XGBoost (3)
Lecture 7: Introduce advantages and applications of XGBoost (4)
Chapter 2: Decision tree and implementation
Lecture 1: Overview of decision tree modeling
Lecture 2: Explain the concepts and components in decision trees
Lecture 3: Understand the framework of decision trees
Lecture 4: Introduction on decision tree nodes split and growth
Lecture 5: Explain how to construct decision tree by examples
Lecture 6: Learn Gini and split rules in decision tree modeling
Lecture 7: Understand decision tree in terms of two dimensional hyperplane plot
Lecture 8: Understand decision tree classifier and regressor (1)
Lecture 9: Understand decision tree classifier and regressor (2)
Lecture 10: Learn model performance measures (1)
Lecture 11: Learn model performance measures (2)
Lecture 12: Introduction of Anaconda Installation
Lecture 13: The Python programming code and data used in this course
Lecture 14: Implement decision tree modeling in Python (1)
Lecture 15: Implement decision tree modeling in Python (2)
Lecture 16: Implement decision tree modeling in Python (3)
Chapter 3: Create gradient boost machine using decision trees
Lecture 1: Understand some weakness about decision tree (1)
Lecture 2: Understand some weakness about decision tree (2)
Lecture 3: Explain how to construct gradient boost machine with decision trees
Lecture 4: Use Python to build your own gradient boost machine with decision trees
Lecture 5: Understand why gradient boost machine is more advanced than decision trees
Chapter 4: Introduce XGBoost method and application
Lecture 1: Create first XGBoost model in Python
Lecture 2: Lecture on the explanation of XGBoost’s parameters (1)
Lecture 3: Lecture on the explanation of XGBoost’s parameters (2)
Lecture 4: Lecture on the explanation of XGBoost’s parameters (3)
Lecture 5: Lecture on the explanation of XGBoost’s parameters (4)
Lecture 6: Lecture on the explanation of XGBoost’s parameters (5)
Lecture 7: Lecture on the explanation of XGBoost’s parameters (6)
Lecture 8: Lecture on the explanation of XGBoost’s parameters (7)
Lecture 9: Build XGBoostClassifier for credit risk score card using Python (1)
Lecture 10: Build XGBoostClassifier for credit risk score card using Python (2)
Lecture 11: Build XGBoostClassifier for credit risk score card using Python (3)
Lecture 12: Lecture on the XGBoost’s fit method and native XGBoost booster (1)
Lecture 13: Lecture on the XGBoost’s fit method and native XGBoost booster (2)
Lecture 14: Implement native XGBoost booster in Python by examples
Chapter 5: Advanced topics on XGBoost algorithm
Lecture 1: Introduction of XGBoost algorithm for multi-classification solutions
Lecture 2: Use case of XGBoost for predicting ordinal model objectives in Python
Lecture 3: Use case of XGBoost for predicting multi-categorical model objectives
Lecture 4: Overview of feature importance and application for XGBoost modeling
Lecture 5: Python programs on feature importance and feature selection in XGBoost (1)
Lecture 6: Python programs on feature importance and feature selection in XGBoost (2)
Lecture 7: Introduce Parameter tuning methods in XGBoost modeling
Lecture 8: Introduce online sales forecasting project with XGBoost modeling
Lecture 9: Lecture on the Python program of online sales XGBoost modeling (1)
Lecture 10: Lecture on the Python program of online sales XGBoost modeling (2)
Lecture 11: Lecture on the Python program of online sales XGBoost modeling (3)
Lecture 12: Lecture on the Python program of online sales XGBoost modeling (4)
Lecture 13: Lecture on the Python program of online sales XGBoost modeling (5)
Chapter 6: Summary of XGBoost modeling and important things
Lecture 1: What you have learned in this course and some supplementary materials
Lecture 2: Summary on feature engineering in XGBoost modeling
Lecture 3: Summary on feature standardization in XGBoost modeling
Lecture 4: Summary on handling categorical and missing data in XGBoost modeling
Lecture 5: Summary on feature selection in XGBoost modeling
Lecture 6: Summary on training and validating model in XGBoost modeling
Lecture 7: Summary on parameters tuning and model persistence in XGBoost
Lecture 8: Show Python program for XGBoost model persistence
Instructors
- Shenggang Li, Senior Data Scientist
Rating Distribution
- 1 star: 1 vote
- 2 stars: 8 votes
- 3 stars: 20 votes
- 4 stars: 25 votes
- 5 stars: 25 votes
Frequently Asked Questions
How long do I have access to the course materials?
You can view and review the lecture materials indefinitely, like an on-demand channel.
Can I take my courses with me wherever I go?
Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don’t have an internet connection, some instructors also let their students download course lectures. That’s up to the instructor though, so make sure you get on their good side!
You may also like
- Top 10 Language Learning Courses to Learn in November 2024
- Top 10 Video Editing Courses to Learn in November 2024
- Top 10 Music Production Courses to Learn in November 2024
- Top 10 Animation Courses to Learn in November 2024
- Top 10 Digital Illustration Courses to Learn in November 2024
- Top 10 Renewable Energy Courses to Learn in November 2024
- Top 10 Sustainable Living Courses to Learn in November 2024
- Top 10 Ethical AI Courses to Learn in November 2024
- Top 10 Cybersecurity Fundamentals Courses to Learn in November 2024
- Top 10 Smart Home Technology Courses to Learn in November 2024
- Top 10 Holistic Health Courses to Learn in November 2024
- Top 10 Nutrition And Diet Planning Courses to Learn in November 2024
- Top 10 Yoga Instruction Courses to Learn in November 2024
- Top 10 Stress Management Courses to Learn in November 2024
- Top 10 Mindfulness Meditation Courses to Learn in November 2024
- Top 10 Life Coaching Courses to Learn in November 2024
- Top 10 Career Development Courses to Learn in November 2024
- Top 10 Relationship Building Courses to Learn in November 2024
- Top 10 Parenting Skills Courses to Learn in November 2024
- Top 10 Home Improvement Courses to Learn in November 2024