Domino users can use the AI Gateway to easily access multiple external Large Language Model (LLM) providers in their Workspaces and Runs. Using AI Gateway helps ensure that users follow security and auditability best practices.
AI Gateway provides the following benefits:

- Securely stores and manages API keys to avoid accidentally leaking sensitive credentials.
- Provides a streamlined, consistent interface across multiple LLM providers so you can quickly compare results.
Your Domino admin must create AI Gateway endpoints before you can use them.
The easiest way to use the AI Gateway is from within a Domino Workspace.
To see the AI Gateway endpoints available in your Workspace, create a new Workspace, then open the Gateway Model APIs side panel.
To query an endpoint:

1. Click the Copy Code icon.
2. Paste the code into your Workspace and adjust the query to fit your needs.
Alternatively, you can use the MLflow Deployments Client API to create your own query.
Note: When using the MLflow Deployments Client, Domino supports only the predict() API. To fetch an endpoint or list all endpoints, use Domino's Public API instead.
Learn how to create AI Gateway endpoints as a Domino admin.