Deploy models as REST API endpoints with a single click to serve predictions (inferences) on new data. Deploying on Domino gives you built-in security, scaling, and management features.
- Deploy models to a REST endpoint for asynchronous (batch) or synchronous (real-time) predictions on new data (see the request sketch after this list).
- Secure Domino endpoints to control who can access and view your models.
- Scale deployments vertically and horizontally to right-size each endpoint.
- Route API requests to test and development versions of your model.
- Monitor and log endpoint activity and health.
- Score large batches of data with batch scoring jobs.
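
To make the synchronous case concrete, here is a minimal sketch of calling a deployed endpoint over HTTP. The host, model ID, token variable, and input schema are placeholders and assumptions, not the exact invocation format of your endpoint; check your endpoint's invocation details in Domino for the real URL and request body.

```python
import os
import requests

# Placeholder values -- substitute your deployment's URL and access token.
MODEL_URL = "https://your-domino-host/models/<model-id>/latest/model"
API_TOKEN = os.environ["DOMINO_MODEL_API_TOKEN"]

# The request body wraps the model's input features; the exact schema
# depends on how your prediction function is defined.
payload = {"data": {"feature_1": 42.0, "feature_2": "blue"}}

response = requests.post(
    MODEL_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())  # synchronous (real-time) prediction result
```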
Domino gives you the flexibility to use your models and endpoints on other supported platforms.
- Deploy models to SageMaker to leverage auto-scaling and streaming capabilities (see the invocation sketch after this list).
- Export to NVIDIA Fleet Command for edge deployments.
- Export models to Snowflake to bring the model to where your data lives.
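
As an illustration of what consuming an exported model on SageMaker might look like, the sketch below invokes a SageMaker endpoint with boto3. The endpoint name and input payload are assumptions; the actual input format depends on how the exported model's inference code is packaged.

```python
import json
import boto3

# Placeholder -- use the name of the endpoint created from your exported model.
ENDPOINT_NAME = "my-exported-domino-model"

runtime = boto3.client("sagemaker-runtime")

# Invoke the SageMaker endpoint with a JSON payload; the expected input
# format depends on the model's inference code.
response = runtime.invoke_endpoint(
    EndpointName=ENDPOINT_NAME,
    ContentType="application/json",
    Body=json.dumps({"feature_1": 42.0, "feature_2": "blue"}),
)

print(json.loads(response["Body"].read()))
```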
Domino can deploy models natively, export them to third-party platforms, or integrate model deployments with your existing CI/CD workflows via the Model Export API. The choice is yours.
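
For teams wiring exports into CI/CD, a pipeline step might call the Model Export API over HTTP, roughly as sketched below. The endpoint path, request fields, and authentication header here are hypothetical placeholders, not the documented API contract; consult the Model Export API reference for the actual routes and schemas.

```python
import os
import requests

DOMINO_HOST = "https://your-domino-host"
API_KEY = os.environ["DOMINO_USER_API_KEY"]

# Hypothetical request -- field names and the endpoint path are illustrative only.
export_request = {
    "projectId": "<project-id>",
    "modelId": "<model-id>",
    "target": "sagemaker",  # e.g. a supported export target
}

response = requests.post(
    f"{DOMINO_HOST}/api/modelExports",  # hypothetical path, not the real route
    json=export_request,
    headers={"X-Domino-Api-Key": API_KEY},  # assumed auth header
    timeout=60,
)
response.raise_for_status()
print("Export started:", response.json())
```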