Configure a performance metric

👍 Pro tip!

Configure your model's performance metrics as soon as you connect it to Superwise.

To configure a performance metric:

  1. Go to the Metrics screen -> configure metrics -> performance metrics.
  2. Give the metric a name and select the model you want to calculate the metric for.
  3. Select the metric type (Accuracy, Error rate, Recall, etc.).
  4. Select the prediction class and the label class. When the prediction or label type is categorical, you must also define the positive class: the category you wish to treat as true. All other categories will be treated as false.

For example, if the prediction categories are [Dog, Cat, Snake] and the label categories are [four legs, no legs], you could set the prediction’s positive value to ‘Snake’ and the label’s positive value to ‘no legs’. If instead you want the prediction’s positive values to be ‘Dog’ and ‘Cat’, with the label’s positive value set to ‘four legs’, you will need to create two performance metrics, one for Dog and one for Cat, since each metric supports a single positive class. The sketch below illustrates this mapping.
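
To make the positive-class mapping concrete, here is a minimal Python sketch. It is not Superwise’s implementation; the function names and sample data are illustrative. It binarizes multiclass predictions and labels against a positive class, then computes accuracy and recall:

```python
# Minimal sketch of positive-class binarization -- illustrative only,
# not Superwise's implementation.

def binarize(values, positive_class):
    """True where the category equals the positive class, False otherwise."""
    return [v == positive_class for v in values]

def accuracy(preds, labels):
    """Fraction of binarized predictions that match the binarized labels."""
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def recall(preds, labels):
    """Fraction of positive labels that were predicted positive."""
    positives = sum(labels)
    return sum(p and l for p, l in zip(preds, labels)) / positives

# Hypothetical sample data following the Dog/Cat/Snake example above.
predictions = ["Dog", "Snake", "Cat", "Snake"]
labels = ["four legs", "no legs", "four legs", "four legs"]

# Positive prediction class 'Snake' paired with positive label class 'no legs'.
snake_preds = binarize(predictions, "Snake")
no_legs_labels = binarize(labels, "no legs")
print(accuracy(snake_preds, no_legs_labels))  # 0.75
print(recall(snake_preds, no_legs_labels))    # 1.0

# Covering 'Dog' and 'Cat' against 'four legs' takes one metric per class:
for positive_pred in ["Dog", "Cat"]:
    preds = binarize(predictions, positive_pred)
    print(positive_pred, accuracy(preds, binarize(labels, "four legs")))
```

Running the loop prints one metric value per positive class, which mirrors creating a separate performance metric for each category in the UI.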

After you set the metric, you will be able to use it in your monitoring policies. Keep in mind that you can also add the metric when you create a new policy.
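
For intuition, a monitoring policy can be thought of as a threshold check on the configured metric. The sketch below is a generic illustration of that idea; the function name and the 0.8 threshold are assumptions, not Superwise’s API:

```python
# Generic threshold check -- an illustration of what a monitoring policy
# evaluates, not Superwise's API. The threshold value is hypothetical.

def check_metric(metric_value: float, threshold: float = 0.8) -> None:
    """Alert when the metric falls below the (illustrative) threshold."""
    if metric_value < threshold:
        print(f"ALERT: metric {metric_value:.2f} is below {threshold}")
    else:
        print(f"OK: metric {metric_value:.2f}")

check_metric(0.75)  # e.g. the accuracy computed in the sketch above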

📘 Read more

For more information about the performance metrics concept, see Performance metrics.