Set Scheduled Checks

Use Scheduled Checks to be notified when data drift or model quality metrics degrade beyond their thresholds during any period. For data drift, Domino uses the timestamps of the predictions to select the data for each scheduled check. For model quality, Domino uses those same prediction timestamps to select the prediction and ground truth data on which to run periodic model quality checks. You can specify how often the checks repeat and the time range of the data used in their calculations. When a check fails, Domino sends email notifications. See Set up notifications.

Prerequisites

  • You must have ground truth data available for the date ranges selected for the Scheduled Checks. To do this, set up the data source and notify Domino of new ground truth data (a hedged API sketch follows this list).
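
The exact call for notifying Domino about new ground truth data depends on your deployment and version; the snippet below is only a minimal sketch using the requests library. The endpoint path, payload fields, model ID, and data source name are assumptions for illustration, so check the Model Monitoring APIs reference for the exact contract before relying on it.

```python
import os

import requests

# Assumptions (hypothetical values): adjust the host, model ID, endpoint path,
# and payload to match your deployment and the Model Monitoring APIs reference.
DOMINO_HOST = "https://domino.example.com"
MODEL_ID = "<model-id>"
API_KEY = os.environ["DOMINO_USER_API_KEY"]  # a Domino user API key

# Hypothetical payload that points the Model Monitor at a new ground truth
# file in the data source you configured as a prerequisite.
payload = {
    "datasetDetails": {
        "name": "ground_truth_2022_06.csv",
        "datasetType": "file",
        "datasetConfig": {"path": "ground_truth_2022_06.csv", "fileFormat": "csv"},
        "datasourceName": "my-ground-truth-source",
        "datasourceType": "s3",
    }
}

response = requests.put(
    f"{DOMINO_HOST}/model-monitor/v2/api/model/{MODEL_ID}/register-dataset/ground_truth",
    json=payload,
    headers={"X-Domino-Api-Key": API_KEY},
)
response.raise_for_status()
print("Ground truth dataset registered:", response.status_code)
```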

 

  1. From the navigation pane, click Model APIs.

  2. Click the name of the model for which you want to set up scheduled checks.

  3. Click Monitoring.

  4. Go to Configure Monitoring > Schedule.


  5. Type a name for the check.

  6. Set the frequency at which the check runs.

  7. In the Select Data to Check area, select one of the following:

    • Use new data since last check time

      • For data drift, this checks predictions with timestamps later than the last scheduled check.

      • For model quality, this checks only ground truth labels ingested into the Model Monitor after the last scheduled check ran and matches them with historical predictions made by the model.

    • Data since last x <time period>

      • For data drift, this checks predictions whose timestamps are within the last specified interval (for example, the last three days).

      • For model quality, this checks only the ground truth labels ingested within the last specified interval (for example, within the last three days) and matches them with historical predictions made by the model. An illustrative sketch of this windowing logic follows these steps.

  8. Click Save.
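
To make the difference between the two data-selection options concrete, here is a purely illustrative pandas sketch of the windowing logic. It is not how Domino implements scheduled checks; the column names and example timestamps are assumptions.

```python
from datetime import datetime, timedelta, timezone

import pandas as pd

now = datetime.now(timezone.utc)
last_check_time = now - timedelta(days=1)  # when the previous check ran

# Illustrative predictions table; column names and values are assumptions.
predictions = pd.DataFrame({
    "timestamp": [now - timedelta(hours=12), now - timedelta(days=2), now - timedelta(days=5)],
    "prediction": [0.2, 0.7, 0.9],
})

# "Use new data since last check time": only predictions with timestamps
# later than the previous scheduled check.
new_since_last_check = predictions[predictions["timestamp"] > last_check_time]

# "Data since last x <time period>": a rolling window (here, the last three
# days), regardless of when the last check ran.
rolling_window = predictions[predictions["timestamp"] >= now - timedelta(days=3)]

print(len(new_since_last_check), "prediction(s) newer than the last check")
print(len(rolling_window), "prediction(s) in the last three days")
```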

Domino automatically captures prediction data for every project in a Domino Dataset named prediction_data. When you reproduce a workspace, this data is included. Predictions are stored in Parquet format and updated hourly as the Model API processes inputs. All of the captured data is stored in the Dataset and made available in your workspace, so you can diagnose flaws in the model and retrain it as needed, as in the sketch below.
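
For example, you can load the captured predictions into pandas inside a workspace. This is a minimal sketch that assumes the prediction_data Dataset is mounted in the workspace and that pyarrow (or fastparquet) is installed; the mount path varies by deployment, so adjust it to match what you see in your workspace's file browser.

```python
from pathlib import Path

import pandas as pd

# Assumption: the prediction_data Dataset is mounted at this path in the
# workspace. Adjust to match the path shown in your workspace.
PREDICTION_DATA_DIR = Path("/domino/datasets/local/prediction_data")

# The Model API appends hourly Parquet files; read them all into one frame.
parquet_files = sorted(PREDICTION_DATA_DIR.rglob("*.parquet"))
predictions = pd.concat(
    (pd.read_parquet(path) for path in parquet_files), ignore_index=True
)

print(predictions.shape)
print(predictions.head())
```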
