
Hyperparameter optimization

Machine learning models require different constraints, weights, or learning rates to generalize to different data patterns. These settings are called hyperparameters, and they control the learning process. Hyperparameters must be tuned so that the model can optimally solve the machine learning problem.
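To make the distinction concrete, here is a minimal sketch (not AutoML internals) of how a hyperparameter differs from a learned parameter. The learning rate is set before training and controls how the model parameter `w` is updated; `w` itself is learned from the objective. The toy objective `(w - 3)^2` is a stand-in chosen for this illustration:

```python
# Minimal sketch: the learning rate is a hyperparameter that controls
# how a model's parameters are updated during training.
def train(learning_rate, steps=50):
    w = 0.0  # model parameter, learned during training
    for _ in range(steps):
        gradient = 2 * (w - 3.0)  # gradient of (w - 3)^2, minimized at w = 3
        w -= learning_rate * gradient  # the hyperparameter sets the step size
    return w

print(train(0.1))  # converges close to the optimum, 3.0
print(train(1.1))  # too large a step: training diverges, far from 3.0
```

A well-chosen learning rate converges to the optimum; a poorly chosen one never does, which is why tuning such values can matter for predictive results.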

By default, AutoML uses a predefined set of hyperparameter values for each algorithm used in model training. These are the standard, optimized values that are generally accepted by the data science community. The top-performing combination of hyperparameter values is automatically selected.

However, in some cases, you might want to fine-tune the model for optimal predictive results. This can be done using hyperparameter optimization (HPO).

AutoML runs hyperparameter optimization only for the best-performing algorithm of the set you selected in the experiment configuration. A random search is run first, followed by a grid search, to find the best hyperparameters for that algorithm.
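The two-stage search described above can be sketched in pure Python. This is a conceptual illustration only, not AutoML's actual implementation: the `score` function, parameter names (`learning_rate`, `max_depth`), and search ranges are all hypothetical stand-ins for a cross-validated model evaluation:

```python
import random

# Hypothetical objective: score one hyperparameter combination.
# In a real run this would be cross-validated model performance.
def score(params):
    # Toy surface with a known best at learning_rate=0.1, max_depth=6.
    return -((params["learning_rate"] - 0.1) ** 2 + (params["max_depth"] - 6) ** 2)

# Stage 1: random search over broad ranges to find a promising region.
random.seed(0)
candidates = [
    {"learning_rate": random.uniform(0.01, 0.5), "max_depth": random.randint(2, 12)}
    for _ in range(20)
]
best = max(candidates, key=score)

# Stage 2: grid search over a narrow neighborhood of the random-search winner.
grid = [
    {"learning_rate": best["learning_rate"] + dlr, "max_depth": best["max_depth"] + dd}
    for dlr in (-0.02, 0.0, 0.02)
    for dd in (-1, 0, 1)
]
best = max(grid, key=score)
print(best)
```

The random stage covers the space cheaply; the grid stage then refines around the best candidate found. Evaluating every combination this way is why enabling hyperparameter optimization adds so much training time.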

There are a few important things to consider before using hyperparameter optimization:

  • Don't use hyperparameter optimization the first time you train a model. It is designed to be used after you have trained your model and are content with the results. That process often requires repeated refinement and retraining.

  • Hyperparameter optimization adds considerable time. If the training process for your model takes five minutes using the standard predefined hyperparameter values, training that same model with hyperparameter optimization enabled could take hours.

Algorithms used in hyperparameter optimization

Hyperparameter optimization is limited to specific model types and algorithms. It works only with the following:

  • Binary Classification models

    • CatBoost Classification

    • Elastic Net Regression

    • Lasso Regression

    • LightGBM Classification

    • Logistic Regression

    • Random Forest Classification

    • XGBoost Classification

  • Regression models

    • CatBoost Regression

    • LightGBM Regression

    • Random Forest Regression

    • XGBoost Regression

Enabling hyperparameter optimization

You enable hyperparameter optimization in the Experiment configuration pane under Model optimization.

Viewing hyperparameter values

To view the hyperparameter values used for a model, click Hyperparameters in the Hyperparameters column in the Model metrics table.

