
Navigating the experiment interface

A tabbed interface lets you navigate between the different stages of model training. The tabs, together with the experiment configuration panel, let you perform the tasks needed to train and optimize your model.

Toolbar

The toolbar is where you can switch between the various tabs in the interface.

In the toolbar, you can also do the following:

  • Switch between your trained models (depending on which tab you are on).

  • Click View configuration to further modify the experiment training, review the current version, or start configuring a new version.

Toolbar in an AutoML experiment

Data

This tab allows you to manage the data in the experiment. When you first create your experiment, this is the only tab you see. As the experiment trains, you can switch to other tabs for model analysis.

In the Data tab, you can:

  • Select a target before training the first version.

  • Add or remove features.

  • View feature dataset insights and statistics.

  • Select a new training dataset.

Switch between Schema view and Data view for different representations of the training dataset.

Data tab in an AutoML experiment, before a version of the training has run

Models

Perform quick analysis of the training results. The Models tab allows you to quickly understand and compare the core metrics for each model. To perform more detailed model analysis, you can use the Compare and Analyze tabs.

Click a model in the Model metrics table to view:

  • Performance scores.

  • Model training summary (available with intelligent model optimization).

  • Feature importance visualizations.

  • Other visualizations specific to the experiment type.

For more information, see Performing quick model analysis.

Models tab in an AutoML experiment trained with intelligent model optimization, showing the summary, core model metrics, and auto-generated visualizations

Compare

Compare your models in detail using embedded analytics. Make selections and customize the data presented in the dashboards to uncover insights about models.

In the Compare tab, you can:

  • Access all available model metrics and hyperparameters.

  • Compare training and holdout metrics across models.

For more information, see Comparing models.

Compare tab in an ML experiment

Analyze

Dive deeper with embedded analytics for each model you train.

In the Analyze tab, you can:

  • Further analyze prediction accuracy.

  • Evaluate feature importance at a granular level.

  • View the distribution of feature data.

For more information about detailed model analysis, see Performing detailed model analysis.

Analyze tab in an ML experiment, showing prediction accuracy and feature importance

Experiment configuration panel

Click View configuration to expand the experiment configuration panel. With the panel expanded, you can start configuring a new version and customize it for more control over the training process.

With the experiment configuration panel, you can:

  • Select a target before training the first version.

  • Add or remove features.

  • Configure a new version of the experiment.

  • Change or refresh the training dataset.

  • Add or remove algorithms.

  • Change model optimization settings.

Expanded experiment configuration panel in an ML experiment
