Configure your model's performance metrics as soon as you connect it to Superwise.
- Go to the Metrics screen -> Configure metrics -> Performance metrics
- Give the metric a name and select the model you want to calculate the metric for
- Select the metric type (Accuracy, Error rate, Recall, etc.)
- Select the prediction class and the label class. When the prediction or label type is categorical, you must define the positive class: the category you wish to treat as true. All other categories will be treated as false.
For example, if the prediction categories are [Dog, Cat, Snake] and the label categories are [four legs, no legs], then the prediction's positive value will be 'Snake' and the label's positive value will be 'no legs'.
If instead you want the prediction's positive values to be 'Dog' and 'Cat' and the label's positive value to be 'four legs', you will need to create two performance metrics: one for the cat and one for the dog.
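The positive-class behavior described above can be sketched in plain Python (an illustration only, not Superwise code; the category names follow the example, and recall is computed by hand to show how binarization feeds the metric):

```python
# Illustrative sketch: a positive class maps categorical predictions
# and labels to True/False before a performance metric is computed.

def binarize(values, positive_class):
    """True where the category equals the chosen positive class."""
    return [v == positive_class for v in values]

# Hypothetical predictions and ground-truth labels.
predictions = ["Snake", "Dog", "Cat", "Snake"]
labels = ["no legs", "four legs", "four legs", "no legs"]

# 'Snake' is the positive prediction class; 'no legs' is the positive label class.
y_pred = binarize(predictions, "Snake")   # [True, False, False, True]
y_true = binarize(labels, "no legs")      # [True, False, False, True]

# Recall = true positives / all actual positives.
tp = sum(p and t for p, t in zip(y_pred, y_true))
recall = tp / sum(y_true)
print(recall)  # 1.0
```

Covering two positive categories (e.g. both 'Dog' and 'Cat') would mean running this binarization once per category, which is why two separate metrics are needed.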
After you set the metric, you will be able to use it in your monitoring policies. Keep in mind that you can also add the metric when you create a new policy.
For more information about the performance metrics concept, see Performance metrics.