Hyperparameter Optimization for Machine Learning
Duration: 7h 38m | .MP4 1280x720, 30 fps(r) | AAC, 44100 Hz, 2ch | 2.88 GB
Genre: eLearning | Language: English
Learn the approaches and tools to tune hyperparameters and improve the performance of your machine learning models.
What you'll learn
Hyperparameter tuning and why it matters
Cross-validation and nested cross-validation
Hyperparameter tuning with Grid and Random search
Bayesian Optimization
Tree-Structured Parzen Estimators, Population Based Training and SMAC
Hyperparameter tuning tools, e.g., Hyperopt, Optuna, Scikit-optimize, Keras Tuner and others
Requirements
Python programming, including knowledge of NumPy, Pandas and Scikit-learn
Familiarity with basic machine learning algorithms, e.g., regression, support vector machines and nearest neighbours
Familiarity with decision tree algorithms and Random Forests
Familiarity with gradient boosting machines, e.g., XGBoost and LightGBM
Understanding of machine learning model evaluation metrics
Familiarity with Neural Networks
Description
Welcome to Hyperparameter Optimization for Machine Learning. In this course, you will learn multiple techniques to select the best hyperparameters and improve the performance of your machine learning models.
If you regularly train machine learning models, as a hobby or for your organization, and want to improve their performance, if you are keen to climb the leaderboard of a data science competition, or if you simply want to learn more about how to tune the hyperparameters of machine learning models, this course will show you how.
We'll take you step by step through engaging video tutorials and teach you everything you need to know about hyperparameter tuning. Throughout this comprehensive course, we cover almost every available approach to hyperparameter optimization, discussing the rationale behind each one, its advantages and shortcomings, the considerations to keep in mind when using it, and its implementation in Python.
Specifically, you will learn:
What hyperparameters are and why tuning matters
The use of cross-validation and nested cross-validation for optimization
Grid search and Random search for hyperparameters
Bayesian Optimization
Tree-structured Parzen estimators
SMAC, Population Based Training and other SMBO algorithms
How to implement these techniques with available open-source packages, including Hyperopt, Optuna, Scikit-optimize, Keras Tuner and others.
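As a small taste of two of these topics, grid search combined with cross-validation, here is a minimal sketch using scikit-learn's GridSearchCV (the data set and parameter grid are illustrative, not taken from the course materials):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Illustrative data set; the course works through its own examples.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate hyperparameter values to evaluate exhaustively.
param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [2, 4, None],
}

# 5-fold cross-validation over every combination in the grid.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    cv=5,
    scoring="accuracy",
)
search.fit(X_train, y_train)

print(search.best_params_)          # best combination found by the grid search
print(search.score(X_test, y_test)) # performance of the refitted best model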
By the end of the course, you will be able to decide which approach you would like to follow and carry it out with available open-source libraries.
This comprehensive machine learning course includes over 50 lectures spanning about 8 hours of video, and ALL topics include hands-on Python code examples that you can use for reference and practice, and reuse in your own projects.
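For instance, a minimal sketch of Bayesian-style optimization with Optuna, whose default sampler is the Tree-structured Parzen Estimator (the toy objective below is illustrative, not course material):

import optuna

# Toy objective: minimize (x - 2)^2 over a continuous search space.
def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2) ** 2

# Optuna uses a TPE sampler by default to propose promising trials.
study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)

print(study.best_params)  # best hyperparameters found
print(study.best_value)   # best objective value found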
So what are you waiting for? Enroll today, learn how to tune the hyperparameters of your models and build better machine learning models.
Who this course is for:
Students who want to know more about hyperparameter optimization algorithms
Students who want to understand advanced techniques for hyperparameter optimization
Students who want to learn to use multiple open source libraries for hyperparameter tuning
Students interested in building better performing machine learning models
Students interested in participating in data science competitions
Students seeking to expand their breadth of knowledge on machine learning