Hyperparameter optimization
Machine learning models require different constraints, weights, or learning rates to generalize to different data patterns. These settings, called hyperparameters, control the learning process. Hyperparameters must be tuned so that the model can optimally solve the machine learning problem.
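For a sense of what a hyperparameter is, the following minimal sketch uses scikit-learn (purely for illustration; this is not AutoML's internal code). The hyperparameters are fixed when the model is constructed, before training, while the model's internal parameters are learned from the data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Hyperparameters are set before training and are not learned from data,
# unlike the model's internal parameters (here, the individual trees).
model = RandomForestClassifier(
    n_estimators=50,     # constraint: number of trees in the forest
    max_depth=5,         # constraint: maximum depth of each tree
    min_samples_leaf=2,  # constraint: minimum samples in each leaf
    random_state=0,
)

# Small synthetic dataset, purely for illustration.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Training fits the internal parameters under the fixed hyperparameters.
model.fit(X, y)
```

Different hyperparameter values can produce very different models from the same training data, which is why tuning them matters.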
By default, AutoML uses a predefined set of hyperparameter values for each algorithm used in model training. These are the standard, optimized values that are generally accepted by the data science community. The top-performing combination of hyperparameter values is automatically selected.
However, in some cases, you might want to fine-tune the model for optimal predictive results. This can be done using hyperparameter optimization (HPO).
AutoML runs hyperparameter optimization only for the best-performing algorithm of the set you selected in the experiment configuration. First a random search is run, and then a grid search, to find the best hyperparameters for that algorithm.
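AutoML's actual search space is internal to the product, but the random-then-grid pattern can be sketched with scikit-learn's RandomizedSearchCV and GridSearchCV. The parameter ranges below are illustrative assumptions, not AutoML's real ones:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

# Synthetic data, purely for illustration.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Stage 1: random search samples broadly from wide ranges to find a
# promising region of the hyperparameter space.
random_search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions={
        "n_estimators": list(range(25, 151, 25)),  # assumed range
        "max_depth": list(range(2, 16)),           # assumed range
    },
    n_iter=10,
    cv=3,
    random_state=0,
)
random_search.fit(X, y)
best = random_search.best_params_

# Stage 2: grid search exhaustively tries values near the random-search
# winner to fine-tune the final choice.
grid_search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={
        "n_estimators": sorted({max(25, best["n_estimators"] - 25),
                                best["n_estimators"],
                                best["n_estimators"] + 25}),
        "max_depth": sorted({max(2, best["max_depth"] - 1),
                             best["max_depth"],
                             best["max_depth"] + 1}),
    },
    cv=3,
)
grid_search.fit(X, y)
```

The two-stage design trades breadth for depth: random search covers a wide space cheaply, and grid search then refines only the neighborhood that looked best.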
There are a few important things to consider before using hyperparameter optimization:
- Don't use hyperparameter optimization the first time you train a model. It is designed to be used after you have trained your model and are content with the results. That process often requires repeated refinement and retraining.
- Hyperparameter optimization adds considerable time. If the training process for your model takes five minutes using the standard predefined hyperparameter values, training that same model with hyperparameter optimization enabled could take hours.
Algorithms used in hyperparameter optimization
Hyperparameter optimization is limited to the following model types and algorithms:
- Binary Classification models
  - CatBoost Classification
  - Elastic Net Regression
  - Lasso Regression
  - LightGBM Classification
  - Logistic Regression
  - Random Forest Classification
  - XGBoost Classification
- Regression models
  - CatBoost Regression
  - LightGBM Regression
  - Random Forest Regression
  - XGBoost Regression
Enabling hyperparameter optimization
Do the following:
1. In an experiment, click View configuration.
   The experiment configuration panel opens.
2. If needed, click New version to configure the next version.
3. In the panel, expand Model optimization.
4. Switch from Intelligent to Manual.
   Hyperparameter optimization is not available with intelligent model optimization.
5. Select the check box for Hyperparameter optimization.
Viewing hyperparameter values
View hyperparameter values in the Compare tab during detailed model analysis. For more information, see Performing detailed model analysis.