Performance metrics

Performance metrics are measurements that quantify your model's performance. Each metric compares what the model predicted (the prediction) against what actually happened (the label).

The metrics supported by Superwise are: RMSE, MSE, MAE, MAPE, Accuracy, Error rate, Recall, Precision, F1, Log Loss, and ROC AUC.
Metric values are calculated from the time the metric is created onward; they are not backfilled over historical data.
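If you want to sanity-check a metric value outside the platform, the same quantities can be reproduced with standard open-source tooling. The sketch below uses scikit-learn and made-up example arrays; this illustrates the standard definitions of the metrics, not Superwise's internal implementation:

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    f1_score,
    log_loss,
    mean_absolute_error,
    mean_absolute_percentage_error,
    mean_squared_error,
    precision_score,
    recall_score,
    roc_auc_score,
)

# Regression metrics: numeric prediction vs. numeric label.
y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.8, 5.4, 2.0, 7.3])
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)                        # RMSE is the square root of MSE
mae = mean_absolute_error(y_true, y_pred)
mape = mean_absolute_percentage_error(y_true, y_pred)

# Classification metrics: boolean prediction vs. boolean label.
labels = np.array([1, 0, 1, 1, 0])
preds = np.array([1, 0, 0, 1, 1])
accuracy = accuracy_score(labels, preds)
error_rate = 1 - accuracy                  # error rate is the complement of accuracy
precision = precision_score(labels, preds)
recall = recall_score(labels, preds)
f1 = f1_score(labels, preds)

# Probabilistic metrics: numeric (score) prediction vs. boolean label.
scores = np.array([0.9, 0.2, 0.4, 0.8, 0.6])
ll = log_loss(labels, scores)
auc = roc_auc_score(labels, scores)
```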

πŸ‘

Pro tip

Configure your model's performance metrics as soon as you connect it to Superwise.

Which metrics you can use depends on the types of your prediction and label, as shown in the following table; a code sketch after the table illustrates the multi-class combinations.

| Prediction type | Label type  | Possible metrics                             |
|-----------------|-------------|----------------------------------------------|
| Boolean         | Boolean     | Accuracy, Error rate, Recall, Precision, F1  |
| Categorical     | Categorical | Accuracy, Error rate, Recall, Precision, F1  |
| Boolean         | Categorical | Accuracy, Error rate, Recall, Precision, F1  |
| Categorical     | Boolean     | Accuracy, Error rate, Recall, Precision, F1  |
| Numeric         | Numeric     | RMSE, MSE, MAE, MAPE                         |
| Numeric         | Boolean     | Log Loss, ROC AUC                            |
| Numeric         | Categorical | Log Loss, ROC AUC                            |
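The categorical rows deserve a note: Precision, Recall, and F1 are per-class quantities, so with more than two classes they need an averaging strategy, and Log Loss and ROC AUC expect per-class probabilities. A minimal sketch of those cases, again using scikit-learn with invented data rather than Superwise's own code:

```python
import numpy as np
from sklearn.metrics import f1_score, log_loss, roc_auc_score

# Categorical prediction vs. categorical label: multi-class F1 needs an
# averaging strategy ("macro" = unweighted mean of per-class scores).
labels = np.array(["cat", "dog", "bird", "dog", "cat"])
preds = np.array(["cat", "dog", "dog", "dog", "bird"])
macro_f1 = f1_score(labels, preds, average="macro")

# Numeric prediction vs. categorical label: the prediction is a per-class
# probability vector, and each row must sum to 1.
classes = ["bird", "cat", "dog"]           # column order of the matrix below
probs = np.array([
    [0.1, 0.7, 0.2],
    [0.1, 0.2, 0.7],
    [0.6, 0.2, 0.2],
    [0.0, 0.3, 0.7],
    [0.2, 0.5, 0.3],
])
ll = log_loss(labels, probs, labels=classes)
auc = roc_auc_score(labels, probs, multi_class="ovr", labels=classes)
```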

πŸ“˜

Read more

For more information about how to configure performance metrics, see Configure performance metric.