
Refining models

Once you have created some initial models, it is important to refine them to increase their effectiveness and potential accuracy. The model scores indicate different measures of this performance. While the goal of refining the models is to increase these scores, a higher score doesn't always indicate a better model.

By excluding or including features and changing other configuration parameters, you can compare different model versions to see what effect your changes have.

Interpreting the scores helps you decide how to refine the model. The values of the different metrics can give you insights into which actions will improve the outcome.

Requirements and permissions

To learn more about the user requirements for working with ML experiments, see Working with experiments.

Improving the dataset

If your model doesn't score well, you might want to review the dataset to address any issues. Read more about how to improve the dataset in Getting your dataset ready for training.

Excluding features

More features do not necessarily make a better model. To refine the model, you want to exclude unreliable and irrelevant features such as:

  • Features with too high correlation. Of two correlated features, exclude the one with lower feature importance.

  • Features with too low feature importance. These features have little or no influence on what you're trying to predict.

  • Features with too high feature importance. Unusually high importance can be a sign of data leakage.

Try removing the feature from the training data, then run the training again and check whether the model improves. Does the change make a large difference to the model score, or none at all?
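The checks above can be sketched outside the tool. The following is a minimal illustration, using a made-up pandas DataFrame and scikit-learn, of how you might flag candidate features to exclude: highly correlated pairs and low-importance columns. The column names and the 0.95 threshold are assumptions for the example, not the product's own computation.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Synthetic data: "noise" is irrelevant, "income_eur" duplicates "income".
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 70, n),
    "income": rng.normal(50_000, 12_000, n),
    "noise": rng.normal(0, 1, n),
})
df["income_eur"] = df["income"] * 0.9
target = (df["income"] + 500 * df["age"] > 70_000).astype(int)

# 1. Flag feature pairs with very high correlation (e.g. |r| > 0.95).
corr = df.corr().abs()
pairs = [(a, b) for a in corr.columns for b in corr.columns
         if a < b and corr.loc[a, b] > 0.95]
print("Highly correlated pairs:", pairs)

# 2. Rank features by importance; low scores are exclusion candidates,
#    while suspiciously dominant scores can hint at data leakage.
model = RandomForestClassifier(random_state=0).fit(df, target)
importance = pd.Series(model.feature_importances_, index=df.columns)
print(importance.sort_values())
```

In this sketch, you would keep only one of the correlated pair and drop the low-importance column, then retrain and compare scores.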

  1. Open an experiment from Catalog.

  2. Select the model you want to refine.

  3. In the bottom right, click Configure v2 to open the Experiment configuration pane.

    (The text on the button depends on the number of versions you have run.)

  4. Under Features, clear the checkboxes for any feature that you don’t want to use in the training.

Tip: Alternatively, you can deselect features in the schema and data views. Click Schema view to switch to the schema view, or Data view to switch to the data view. Return to the model view by clicking Model view.

Adding features

If your model still isn’t scoring well, it could be because the features that have a relationship with the target are not yet captured in the dataset. Read more about how to capture or engineer new features in Creating new feature columns.

Selecting algorithms

Based on the data type of your target column, suitable algorithms are automatically selected for training. You might want to exclude algorithms that perform poorly or train slowly, so you don't waste training time on them.

For more information about how algorithms are chosen, see Algorithms.
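As a rough illustration of the idea that the target column's data type determines the learning task, and therefore the candidate algorithms, here is a hedged sketch. The mapping and algorithm names below are simplified assumptions for the example, not the tool's exact selection logic.

```python
import pandas as pd

def candidate_algorithms(target: pd.Series) -> tuple[str, list[str]]:
    """Return an assumed task type and example algorithm families."""
    # Two distinct values (or booleans) suggest binary classification.
    if target.dtype == bool or target.nunique() == 2:
        return "binary classification", ["logistic regression", "random forest", "XGBoost"]
    # String or categorical targets suggest multiclass classification.
    if target.dtype == object or str(target.dtype) == "category":
        return "multiclass classification", ["random forest", "XGBoost"]
    # Continuous numeric targets suggest regression.
    return "regression", ["linear regression", "random forest", "XGBoost"]

task, algos = candidate_algorithms(pd.Series([0, 1, 0, 1]))
print(task, algos)
```

Within the selected family, you can then clear the checkboxes for any individual algorithms you don't want to train.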

  1. Open an experiment from Catalog.

  2. Select the model you want to refine.

  3. In the bottom right, click Configure v2 to open the Experiment configuration pane.

    (The text on the button depends on the number of versions you have run.)

  4. Under Algorithms, clear the checkboxes for any algorithms that you don’t want to use in the training.

Comparing experiment versions

Once you have made your changes, run the training again and compare the new version with the old one to see the effect of your changes.

  1. Click Run v2 in the bottom right corner of the screen to train another experiment version.

    (The text on the button depends on the number of versions you have run.)

  2. In the Model metrics table, you can filter the models using the dropdown menus for algorithm, version, and other properties. The table can also be sorted by individual metric columns.

Comparing model versions

[Image: Model metrics table showing a comparison of model metrics across multiple experiment versions]
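The filtering and sorting described in the steps above can be sketched with a small pandas example. The columns, metric values, and version labels here are made up for illustration; they are not the product's actual table schema.

```python
import pandas as pd

# A toy model-metrics table spanning two experiment versions.
metrics = pd.DataFrame({
    "version":   ["v1", "v1", "v2", "v2"],
    "algorithm": ["random forest", "XGBoost", "random forest", "XGBoost"],
    "f1":        [0.71, 0.74, 0.78, 0.76],
})

# Filter to one algorithm, then sort by a metric column to see
# whether the newer version improved the score.
rf = (metrics[metrics["algorithm"] == "random forest"]
      .sort_values("f1", ascending=False))
print(rf)
```

Here the best random forest score comes from v2, suggesting the configuration changes helped for that algorithm.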

Deleting experiment versions

You can delete experiment versions that you don't want to keep. Note that all models in a deleted experiment version are also deleted and can't be recovered.

  1. In the Model metrics table, select a model from the experiment version you want to delete.

  2. In the bottom right, click Delete 1 version.

  3. In the confirmation dialog, click Delete.

