Performance metrics

👍

Pro tip!

Configure your model's performance metrics as soon as you connect it to Superwise.

Configure performance metric

  1. Go to Metrics screen -> configure metrics -> performance metrics
  2. Give the metric a name and select the model you want to calculate the metric for
  3. Select the metric type (Accuracy, Error rate, Recall, etc.; see the sketch after this list)
  4. Select the prediction class and the label class. When the prediction or label type is categorical, you must also define the positive class: the category you want to treat as true. All other categories are treated as false.
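For intuition, here is a minimal sketch of what the binary metric types compute. It uses Python with scikit-learn, and the sample predictions and labels are illustrative values, not Superwise data or its API:

```python
# Illustrative only: how Accuracy, Error rate, and Recall relate
# for a binary prediction/label pair.
from sklearn.metrics import accuracy_score, recall_score

predictions = [True, False, True, True, False]
labels      = [True, False, False, True, True]

accuracy = accuracy_score(labels, predictions)  # share of correct predictions
error_rate = 1 - accuracy                       # share of incorrect predictions
recall = recall_score(labels, predictions)      # share of actual positives caught

print(f"accuracy={accuracy:.2f}, error rate={error_rate:.2f}, recall={recall:.2f}")
```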

For example -
if the prediction categories are [Dog, Cat, Snake]
and the label categories are [four legs, no legs],
then the prediction's positive value could be 'Snake' and the label's positive value would be 'no legs'.
If instead you want the prediction's positive values to be 'Dog' and 'Cat' and the label's positive value to be 'four legs', you will
need to create two performance metrics: one for the cat and one for the dog.
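The sketch below walks through this example in Python. It shows the concept of mapping each categorical value to true/false via the positive class; it is not the Superwise API, and the sample values are made up:

```python
# Binarize multi-class predictions/labels with a positive class, then
# compute a metric (Recall here) on the resulting true/false values.
from sklearn.metrics import recall_score

predictions = ["Dog", "Cat", "Snake", "Snake", "Cat"]
labels      = ["four legs", "four legs", "no legs", "four legs", "four legs"]

# Positive class for the prediction is 'Snake'; every other category is false.
pred_binary = [p == "Snake" for p in predictions]
# Positive class for the label is 'no legs'; every other category is false.
label_binary = [l == "no legs" for l in labels]

recall = recall_score(label_binary, pred_binary)
print(f"recall for the Snake / 'no legs' metric: {recall:.2f}")

# To treat Dog and Cat as positive instead, define two separate metrics,
# each with its own positive prediction class:
for positive_prediction in ["Dog", "Cat"]:
    pred_binary = [p == positive_prediction for p in predictions]
    label_binary = [l == "four legs" for l in labels]
    recall = recall_score(label_binary, pred_binary)
    print(f"recall for the {positive_prediction} / 'four legs' metric: {recall:.2f}")
```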

Once the metric is set, you can use it in your monitoring policies. Keep in mind that you can also add the metric when you create a new policy.

📘

Read more

For more information about the performance metrics concept, see Performance metrics.