Troubleshooting

“We can not solve our problems with the same level of thinking that created them.”
― Albert Einstein

Uploading dataset fails

Your dataset file must meet the following requirements:

  • The total size of all baseline data files must not exceed 100 MB
  • The dataset must contain columns for ID and Timestamp (see entities)
  • The first row must contain the feature/entity names

Here's an example of what a dataset should look like, where:

  • Features are 'Device', 'Age', and 'Gender'
  • Prediction is 'is_fraud'
  • Label (ground truth) is 'is_fraudlant'
  • ID is 'ID'
  • Timestamp is 'Timestamp' (supported type: yyyy-mm-dd hh:mm:ss.SSS)

| ID    | Timestamp           | Device  | Age | Gender | is_fraud | is_fraudlant |
|-------|---------------------|---------|-----|--------|----------|--------------|
| 26546 | 2022-04-11 08:50:33 | Mobile  | 26  | Female | 1        | True         |
| 26547 | 2022-04-11 08:51:25 | Desktop | 22  | Male   | 0        | False        |
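If you are unsure whether your file meets these requirements, the sketch below shows one way to build and check such a baseline file locally. It assumes pandas and an illustrative file name (baseline.csv); it is not the Superwise upload itself, only a local preparation and sanity check against the rules above.

```python
# A minimal sketch, assuming pandas (not required by Superwise itself) and an
# illustrative file name "baseline.csv". It prepares a file with the layout
# shown above and checks the requirements listed at the top of this section.
import os

import pandas as pd

baseline = pd.DataFrame(
    {
        "ID": [26546, 26547],
        "Timestamp": ["2022-04-11 08:50:33", "2022-04-11 08:51:25"],
        "Device": ["Mobile", "Desktop"],
        "Age": [26, 22],
        "Gender": ["Female", "Male"],
        "is_fraud": [1, 0],
        "is_fraudlant": [True, False],
    }
)

# Format the timestamp as yyyy-mm-dd hh:mm:ss.SSS (the supported type).
baseline["Timestamp"] = (
    pd.to_datetime(baseline["Timestamp"])
    .dt.strftime("%Y-%m-%d %H:%M:%S.%f")
    .str[:-3]  # strftime gives microseconds; keep only milliseconds
)

# Writing without the index keeps the feature/entity names in the first row.
baseline.to_csv("baseline.csv", index=False)

# All baseline data files together should be up to 100 MB.
size_mb = os.path.getsize("baseline.csv") / (1024 * 1024)
assert size_mb <= 100, f"Baseline files too large: {size_mb:.1f} MB"
```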

Data ingestion fails

Data ingestion failures most commonly occur when the schema doesn't match. Make sure you use the same schema as the dataset you uploaded.
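As a quick local check before logging data, you can compare the columns and types of a new batch against the baseline you uploaded. A minimal sketch, assuming pandas and illustrative file names:

```python
# A minimal sketch, assuming pandas and illustrative file names; it only
# compares a new batch against the baseline schema locally, before you send
# the batch to Superwise.
import pandas as pd

baseline = pd.read_csv("baseline.csv")
new_batch = pd.read_csv("production_batch.csv")

missing = set(baseline.columns) - set(new_batch.columns)
unexpected = set(new_batch.columns) - set(baseline.columns)
if missing or unexpected:
    raise ValueError(
        f"Schema mismatch. Missing columns: {sorted(missing)}, "
        f"unexpected columns: {sorted(unexpected)}"
    )

# Column types should also match the ones inferred from the baseline.
for column in baseline.columns:
    if baseline[column].dtype != new_batch[column].dtype:
        print(
            f"Type mismatch in '{column}': "
            f"baseline={baseline[column].dtype}, batch={new_batch[column].dtype}"
        )
```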

Superwise lets you track and monitor your logged data transactions. You can also track failed transactions by setting up an alert mechanism that sends a notification message to one of the suggested integration channels.


Configure alert mechanism

Choose how many failed transactions must occur within the selected time period before you are notified.


👍

Pro tip

Configure the alert mechanism to match the way you send data. If you send data as a stream of individual records, getting alerts for a certain number of failed transactions may be less relevant than when you send a data file.

Notifications log

You can keep track of your failed transactions in the notification log. Another option is to select Show in list to filter the relevant failed transactions.


Is there a way to see the record/prediction itself?

We don't keep the raw data available (for security reasons); we use it only for aggregation and metric calculation purposes.

I configured segments, but they don't work. All segments show zero data

Segment analysis starts from the moment you create a segment and is not applied retrospectively to historical data. From the moment of creation until relevant production data is logged into Superwise, you will see 0 as the number of predictions under that segment.

I logged production data into Superwise and can't see it (quantity is zero)

There are several possible reasons for this behavior:

  1. The transaction may have failed - you can see how to find out here
  2. You log data in batches, and it hasn't yet arrived in Superwise
  3. It might take up to several minutes for the data to be updated in the UI - wait a few minutes and refresh the screen
