Activity metrics

Log activity metrics for your data

It is vital to measure your model's activity level and its operational metrics, because unexpected changes in activity often correlate with model issues and technical bugs.

Superwise currently measures your model's activity level as the number of predictions and labels that were logged.
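
To make this concrete, here is a minimal, hypothetical sketch of what such activity metrics look like. It is not the Superwise SDK or schema: it simply counts logged predictions and labels per day from an illustrative pandas table, with assumed column names such as prediction_timestamp and label.

import pandas as pd

# Hypothetical transaction log: one row per logged prediction.
# Column names ("prediction_timestamp", "prediction", "label") are
# illustrative only, not the Superwise schema.
transactions = pd.DataFrame(
    {
        "prediction_timestamp": pd.to_datetime(
            ["2024-01-01 09:00", "2024-01-01 15:30", "2024-01-02 10:15"]
        ),
        "prediction": [0.82, 0.13, 0.57],
        "label": [1, 0, None],  # ground truth may arrive later, hence the gap
    }
)

# Activity metrics: how many predictions and labels were logged per day.
# count() skips missing values, so labels that have not arrived yet
# are not counted.
daily_activity = (
    transactions
    .set_index("prediction_timestamp")
    .resample("D")
    .agg(predictions=("prediction", "count"), labels=("label", "count"))
)
print(daily_activity)

A drop in the daily prediction count, or a growing gap between predictions and labels, is the kind of variance that activity monitoring is meant to surface.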
