Performance metrics
Performance metrics are measurements that quantify your model's performance. Each metric compares what the model predicted (the prediction) against what actually happened (the label).
The metrics supported by Superwise include: RMSE, MSE, MAE, MAPE, Accuracy, Recall, Precision, F1, Log Loss, and ROC AUC.
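Superwise computes these metrics for you, but as a point of reference, the sketch below shows what each score measures, using scikit-learn. The sample arrays are illustrative only and not drawn from Superwise data.

```python
import numpy as np
from sklearn.metrics import (
    mean_squared_error, mean_absolute_error, mean_absolute_percentage_error,
    accuracy_score, recall_score, precision_score, f1_score,
    log_loss, roc_auc_score,
)

# Regression: numeric predictions vs. numeric labels
y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.8, 5.4, 2.9, 6.5])
mse = mean_squared_error(y_true, y_pred)
print("MSE: ", mse)                                             # mean squared error
print("RMSE:", np.sqrt(mse))                                    # root of the MSE
print("MAE: ", mean_absolute_error(y_true, y_pred))             # mean absolute error
print("MAPE:", mean_absolute_percentage_error(y_true, y_pred))  # mean absolute percentage error

# Classification: boolean predictions vs. boolean labels
y_true_cls = np.array([1, 0, 1, 1, 0])
y_pred_cls = np.array([1, 0, 0, 1, 1])
print("Accuracy: ", accuracy_score(y_true_cls, y_pred_cls))
print("Recall:   ", recall_score(y_true_cls, y_pred_cls))
print("Precision:", precision_score(y_true_cls, y_pred_cls))
print("F1:       ", f1_score(y_true_cls, y_pred_cls))

# Probabilistic classification: numeric scores vs. boolean labels
y_score = np.array([0.9, 0.2, 0.4, 0.8, 0.6])
print("Log Loss:", log_loss(y_true_cls, y_score))
print("ROC AUC: ", roc_auc_score(y_true_cls, y_score))
```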
These scores are calculated from the time the metric is created onward; they are not backfilled historically.
Pro tip
Configure your model's performance metrics as soon as you connect it to Superwise.
Which metrics you can use depends on the types of your prediction and label, as follows:
| Prediction type | Label type | Possible metrics |
|---|---|---|
| Boolean | Boolean | Accuracy, Error-rate, Recall, Precision, F1 |
| Categorical | Categorical | Accuracy, Error-rate, Recall, Precision, F1 |
| Boolean | Categorical | Accuracy, Error-rate, Recall, Precision, F1 |
| Categorical | Boolean | Accuracy, Error-rate, Recall, Precision, F1 |
| Numeric | Numeric | RMSE, MSE, MAE, MAPE |
| Numeric | Boolean | Log Loss, ROC AUC |
| Numeric | Categorical | Log Loss, ROC AUC |
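To make the mapping concrete, here is a small illustrative helper that mirrors the table above. It is not part of the Superwise SDK; the function name and type strings are assumptions made for the example.

```python
# Illustrative only -- this helper is not part of the Superwise SDK.
def applicable_metrics(prediction_type: str, label_type: str) -> list[str]:
    """Return the metrics that apply to a prediction/label type pair."""
    classification = {"boolean", "categorical"}
    if prediction_type in classification and label_type in classification:
        return ["Accuracy", "Error-rate", "Recall", "Precision", "F1"]
    if prediction_type == "numeric" and label_type == "numeric":
        return ["RMSE", "MSE", "MAE", "MAPE"]
    if prediction_type == "numeric" and label_type in classification:
        # Numeric scores (e.g. predicted probabilities) against class labels
        return ["Log Loss", "ROC AUC"]
    raise ValueError(f"Unsupported combination: {prediction_type}/{label_type}")

print(applicable_metrics("numeric", "boolean"))  # ['Log Loss', 'ROC AUC']
```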
Read more
For more information about how to configure performance metrics, see Configure performance metric.