
Viewing model scores

The model view shows a collection of insights from the training of machine learning algorithms. Metrics and charts let you compare different models within an experiment version or between different experiment versions. Review the metrics to find out how well each model performed and how you might refine it to improve the score.

The best-fitting algorithm for an experiment version is automatically selected and is marked with a trophy icon. The rating is based on the F1 score for classification models and the R2 score for regression models.
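To make the two ranking scores concrete, the sketch below computes them in plain Python. This is an illustration of the standard formulas only, not Qlik AutoML's internal code.

```python
def f1_score(actual, predicted, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == p == positive)
    fp = sum(1 for a, p in zip(actual, predicted) if a != positive and p == positive)
    fn = sum(1 for a, p in zip(actual, predicted) if a == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def r2_score(actual, predicted):
    """R2 = 1 - (residual sum of squares / total sum of squares)."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Classification: 3 true positives, 1 false positive, 1 false negative
print(f1_score([1, 1, 1, 1, 0, 0], [1, 1, 1, 0, 1, 0]))  # 0.75
# Regression: perfect predictions give R2 = 1.0
print(r2_score([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0
```

Both scores reward a model that fits the data well: F1 balances precision against recall for the positive class, while R2 measures how much of the target's variance the predictions explain.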

The AutoML model view, with its metrics table and chart visualizations, lets you review model performance.

Showing metrics in the table

Different metrics are available depending on the type of machine learning problem.

  1. In the top right of the Model metrics table, click Column picker.

  2. Select the metrics you want to show.

Viewing model metrics

There are multiple filters you can use when viewing model metrics. These are located directly above the Model metrics table, and can be toggled on or off.

To view the metrics for the top-performing model only, click Show only top model. Alternatively, you can view metrics for the selected model only by clicking Show only selected model.

For either of the above two filters, you can also further refine your results to include only deployed models by clicking Show only deployed models.

Reviewing a model

Model metrics table with 'Show only top model' and 'Show only deployed models' options enabled.

Comparing holdout scores and training scores

The metrics displayed in the model view are based on the automatic holdout data that is used to validate model performance after training. You can also view the training metrics that were generated during cross-validation and compare them with the holdout metrics. These scores are often similar, but if they differ significantly, the model likely suffers from data leakage or overfitting.

  1. In the Model metrics table, select a model.

  2. Click Show training data metrics.

    The training metrics are shown in the table and are marked with a "t".

Show training data metrics
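The comparison described above can be sketched as a simple check on the gap between the two scores. The 0.1 threshold below is an arbitrary assumption for illustration, not a value used by Qlik AutoML.

```python
def score_gap_warning(training_score, holdout_score, threshold=0.1):
    """Flag a noticeable divergence between training and holdout scores.

    A training score far above the holdout score suggests the model
    memorized the training data (overfitting) or saw information it
    should not have (data leakage).
    """
    gap = training_score - holdout_score
    if gap > threshold:
        return f"Gap of {gap:.2f}: possible overfitting or data leakage"
    return "Scores are consistent"

print(score_gap_warning(0.92, 0.89))  # Scores are consistent
print(score_gap_warning(0.98, 0.71))  # Gap of 0.27: possible overfitting or data leakage
```

In practice you would apply this kind of sanity check to the score pair of each model in the metrics table, not just one.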

Viewing hyperparameter values

You can view the hyperparameter values used for training by each algorithm. For more information about hyperparameters, see Hyperparameter optimization.

  • In the Model metrics table, click Hyperparameters in the Hyperparameters column.

    The hyperparameter values are shown in a pop-up window.
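The values shown in the pop-up are the settings the algorithm was trained with. Hyperparameter optimization typically chooses them by sampling from a search space, as in the sketch below. The parameter names and ranges are hypothetical examples, not the search space Qlik AutoML actually uses.

```python
import random

# Hypothetical search space for a tree-based algorithm (illustration only).
SEARCH_SPACE = {
    "max_depth": [3, 5, 7, 10],
    "learning_rate": [0.01, 0.05, 0.1, 0.3],
    "n_estimators": [100, 200, 500],
}

def sample_hyperparameters(space, rng):
    """Pick one value per hyperparameter, as a random-search trial would."""
    return {name: rng.choice(values) for name, values in space.items()}

rng = random.Random(42)
trial = sample_hyperparameters(SEARCH_SPACE, rng)
print(trial)  # one candidate configuration, e.g. a max_depth/learning_rate/n_estimators combination
```

Each trained model keeps a record of the trial it came from, which is what the Hyperparameters pop-up surfaces.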

View hyperparameter values

Viewing experiment configuration

The Experiment configuration pane shows the experiment version configuration for the currently selected model.

  1. Select a model in an experiment version.

  2. Click Configuration pane to open the Experiment configuration pane.

The Experiment configuration pane
