6 Tweets Jan 01, 2023
Hyperparameters are values the model cannot learn from data. So, you have to tune them yourself.
Finding the right hyperparameters is crucial to how well your ML models perform. But it can be hard.
Blackbox Hyperparameter Optimization 👇
1/6
ML models are composed of two types of parameters:
1) Hyperparameters, set by the user before training starts
2) Model parameters, learned during training
2/6
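A minimal scikit-learn sketch of the difference, assuming a logistic-regression classifier on toy data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy data, purely for illustration.
X, y = make_classification(n_samples=200, random_state=0)

# Hyperparameters: chosen by you before training (regularization strength, iteration cap).
model = LogisticRegression(C=0.5, max_iter=200)

# Model parameters: learned during training.
model.fit(X, y)
print(model.coef_, model.intercept_)  # the learned weights and bias
```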
You can tune hyperparameters by hand. But that is slow and error-prone.
The two most widely adopted hyperparameter optimization (HPO) methods are random search and grid search.
It's important to know that both treat HPO as a black-box problem: they only see the score each configuration achieves, not the model's internals or gradients.
3/6
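What "black box" means in practice: the tuner only calls an objective function and reads back one score. A rough sketch, assuming a k-NN classifier on toy data as the model being tuned:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Toy data and a holdout split, purely for illustration.
X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def objective(config):
    """The black box: takes a hyperparameter config, returns one score.
    The optimizer never sees gradients or the model's internals."""
    model = KNeighborsClassifier(
        n_neighbors=config["n_neighbors"], weights=config["weights"]
    )
    model.fit(X_tr, y_tr)
    return model.score(X_val, y_val)  # validation accuracy

print(objective({"n_neighbors": 5, "weights": "uniform"}))
```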
1) Grid search
This technique starts with a finite set of candidate values for each hyperparameter and evaluates every combination in the Cartesian product of these sets.
The configuration whose model achieves the highest validation accuracy is selected as the best.
4/6
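A from-scratch sketch of grid search, assuming a k-NN classifier on toy data and arbitrary candidate values:

```python
from itertools import product

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, random_state=0)

# A finite set of candidate values per hyperparameter (arbitrary examples).
grid = {"n_neighbors": [1, 3, 5, 9], "weights": ["uniform", "distance"]}

best_score, best_config = -1.0, None
# Evaluate every point in the Cartesian product of the value sets.
for values in product(*grid.values()):
    config = dict(zip(grid.keys(), values))
    score = cross_val_score(KNeighborsClassifier(**config), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, config

print(best_config, best_score)
```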
2) Random search
It samples configurations at random until a fixed search budget is exhausted.
Random search is the best-known alternative to grid search.
5/6
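A from-scratch sketch of random search under the same assumptions (k-NN on toy data, arbitrary ranges, a budget of 10 trials):

```python
import random

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

random.seed(0)
X, y = make_classification(n_samples=500, random_state=0)

budget = 10  # number of configurations we can afford to evaluate

best_score, best_config = -1.0, None
while budget > 0:  # stop once the budget is exhausted
    budget -= 1
    # Sample a configuration at random from the search space.
    config = {
        "n_neighbors": random.randint(1, 20),
        "weights": random.choice(["uniform", "distance"]),
    }
    score = cross_val_score(KNeighborsClassifier(**config), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, config

print(best_config, best_score)
```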
Both of these techniques are already implemented in the most popular ML libraries and deep learning platforms.
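scikit-learn, for instance, ships both as GridSearchCV and RandomizedSearchCV; a minimal sketch on the same toy setup:

```python
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Exhaustive grid search over the Cartesian product.
grid = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 9], "weights": ["uniform", "distance"]},
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)

# Random search: sample 10 configurations from the given distributions.
rand = RandomizedSearchCV(
    KNeighborsClassifier(),
    param_distributions={
        "n_neighbors": randint(1, 21),
        "weights": ["uniform", "distance"],
    },
    n_iter=10,
    cv=3,
    random_state=0,
)
rand.fit(X, y)
print(rand.best_params_, rand.best_score_)
```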
Thanks for learning ML and AI with us! Share this thread with your friends and spread open ML knowledge!
6/6
