Boosting Machine Learning Models in Python
.MP4, AVC, 1920x1080, 30 fps | English, AAC, 2 Ch | 3h 7m | 590 MB
Instructor: Jakub Konczyk
Leverage ensemble techniques to maximize the performance of your machine learning models in Python
Learn
Discover and use the main concepts behind ensemble techniques and learn why they are important in applied machine learning
Learn how to use bagging to combine predictions from multiple algorithms and predict more accurately than from any individual algorithm
Use boosting to create a strong classifier from a series of weak classifiers and improve the final performance
Explore how even a very simple ensemble technique such as voting can help you maximize performance
Learn the powerful but less well-known stacking technique, where a further machine learning algorithm combines the predictions of several different models, letting each individual model focus on distinctive features of your dataset
Evaluate which ensemble technique is best suited to a particular problem
Train, test, and evaluate your own XGBoost models
About
Machine learning ensembles are models composed of a few other models that are trained separately and then combined in some way to make an overall prediction. These powerful techniques are often used in applied machine learning to achieve the best overall performance.
In this unique course, after installing the necessary tools you will jump straight into the bagging method, so as to get the best results from algorithms that are highly sensitive to the specific training data—for example, algorithms based on decision trees. Next, you will discover another powerful and popular class of ensemble methods called boosting, where you achieve maximal algorithm performance by training a sequence of models, each of which improves on the results of the previous one. You will then explore a much simpler technique called voting, where the predictions of multiple models are combined using simple statistics such as the mean (see the short sketch below). You will also work hands-on with algorithms such as stacking and XGBoost to improve performance.
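To give a flavour of the kind of ensemble the course builds, here is a minimal sketch (not taken from the course materials) of a hard-voting ensemble using scikit-learn; the dataset and the particular base models are illustrative assumptions:

# Illustrative sketch: a hard-voting ensemble combining three classifiers.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each base model is trained separately; the ensemble predicts by majority vote.
ensemble = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=5000)),
    ("tree", DecisionTreeClassifier(random_state=42)),
    ("forest", RandomForestClassifier(random_state=42)),
], voting="hard")
ensemble.fit(X_train, y_train)
print("Voting ensemble accuracy:", ensemble.score(X_test, y_test))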
By the end of this course, you will know how to use a variety of ensemble algorithms in the real world to boost your machine learning models.
Please note that a working knowledge of Python 3, the ability to run simple commands in a shell (terminal), and some basic machine learning experience are core prerequisites for taking this course and getting the best out of it.
The code bundle for this video course is available at https://github.com/PacktPublishing/Boosting-Machine-Learning-Models-in-Python
Features
Discover the high-level landscape of ensemble techniques and choose the best one for your particular use case
Learn the key ideas behind each ensemble technique to quickly understand its pros and cons—all while working on real-world examples
Work with XGBoost, one of the most popular ensemble algorithms, to train, test, and evaluate your own ML models (a minimal sketch follows)
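As a hedged sketch of the train/test/evaluate workflow the last feature refers to (the dataset and the hyperparameter values are illustrative assumptions, and the xgboost package is assumed to be installed):

# Illustrative sketch: train, test, and evaluate an XGBoost classifier.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A small boosted-tree model; n_estimators, learning_rate, and max_depth are example values.
model = XGBClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
model.fit(X_train, y_train)
predictions = model.predict(X_test)
print("XGBoost accuracy:", accuracy_score(y_test, predictions))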