
Activity metrics

Log activity metrics for your data


It is vital to measure your model's activity level and its operational metrics, since variance in these signals often correlates with model issues and technical bugs.

Superwise currently measures your model's activity level, i.e., the number of predictions and labels that were logged over time.
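To make the idea concrete, here is a minimal sketch (plain Python, not the Superwise SDK) of what an activity metric computes: daily counts of logged predictions and of predictions that have a ground-truth label. The record shape and field names (`ts`, `label`) are hypothetical.

```python
from collections import Counter
from datetime import datetime

def activity_by_day(records):
    """Count predictions and labels logged per calendar day.

    Each record is a dict with an ISO-8601 timestamp under "ts" and an
    optional ground-truth label under "label" (None if not yet logged).
    """
    predictions = Counter()
    labels = Counter()
    for r in records:
        day = datetime.fromisoformat(r["ts"]).date().isoformat()
        predictions[day] += 1          # every record is a logged prediction
        if r["label"] is not None:
            labels[day] += 1           # only labeled records count here
    return predictions, labels

# Hypothetical prediction log for illustration.
records = [
    {"ts": "2024-05-01T09:00:00", "label": 1},
    {"ts": "2024-05-01T10:30:00", "label": None},
    {"ts": "2024-05-02T08:15:00", "label": 0},
]

preds, labs = activity_by_day(records)
# preds: {"2024-05-01": 2, "2024-05-02": 1}
# labs:  {"2024-05-01": 1, "2024-05-02": 1}
```

A sudden drop in the daily prediction count, or a growing gap between predictions and labels, is exactly the kind of variance an activity policy would alert on.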
