Admins can set up AI Gateway to provide Domino users a safe and streamlined way to access external Large Language Models (LLMs) hosted by LLM service providers like OpenAI or AWS Bedrock. This lets users enjoy the benefits of provider-hosted models while ensuring that they follow security best practices.
AI Gateway provides the following benefits:
- Securely manage API keys to prevent leaks.
- Control user access to LLMs.
- Log LLM activity for auditing.
- Provide data scientists with a consistent and streamlined interface to multiple LLM providers.
- Integrate easily with existing MLflow projects, because AI Gateway is built on top of the MLflow Deployments Server.
AI Gateway endpoints are central to AI Gateway. Each endpoint acts as a proxy for the user, forwarding requests to the specific model that the endpoint defines. Endpoints are a managed way to connect securely to model providers.
To create and manage AI Gateway endpoints in Domino, go to Endpoints > Gateway LLMs or use the Domino REST API. The UI provides simple modals for configuring endpoint details and permissions, and you can update or delete endpoints at any time.
Important: The endpointName must be unique.
See the MLflow Deployments Server documentation for the list of supported LLM providers and their provider-specific configuration parameters.
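For illustration, the following sketch creates an endpoint programmatically with Python and the requests library. The /api/aigateway/v1/endpoints path, the payload field names, and the environment variable names are assumptions made for this example, not a definitive schema; check the Domino REST API reference for your deployment before using it.

```python
import os
import requests

# Hypothetical sketch: the exact path and payload schema may differ in your
# Domino version; consult the Domino REST API reference for the real schema.
DOMINO_URL = os.environ["DOMINO_API_HOST"]       # e.g. https://domino.example.com
API_KEY = os.environ["DOMINO_USER_API_KEY"]      # your Domino user API key

payload = {
    "endpointName": "chat-gpt4",                 # must be unique
    "endpointType": "llm/v1/chat",
    "modelProvider": "openai",
    "modelName": "gpt-4",
    "modelConfig": {
        # The provider API key is submitted once at creation time and then
        # stored in Domino's central vault; users never see it when querying.
        "openai_api_key": os.environ["OPENAI_API_KEY"],
    },
}

resp = requests.post(
    f"{DOMINO_URL}/api/aigateway/v1/endpoints",
    headers={"X-Domino-Api-Key": API_KEY},
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```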
Once an endpoint is created, authorized users can query it from any Workspace or Run using the standard MLflow Deployment Client API. For more information, see Use Gateway LLMs.
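As a minimal sketch of what a query might look like from a Workspace, the example below uses the MLflow Deployments client. The target URI placeholder and the "chat-gpt4" endpoint name are assumptions; follow Use Gateway LLMs for the exact setup in your environment.

```python
from mlflow.deployments import get_deploy_client

# Placeholder target URI: inside a Domino Workspace, use the deployments
# server URI documented in Use Gateway LLMs for your deployment.
client = get_deploy_client("https://<your-domino-url>/api/aigateway")

# "chat-gpt4" is the hypothetical endpoint created in the earlier example.
response = client.predict(
    endpoint="chat-gpt4",
    inputs={"messages": [{"role": "user", "content": "Summarize AI Gateway in one sentence."}]},
)
print(response)
```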
Secure credential storage
When you create an endpoint, you typically need to pass a model-specific API key (such as OpenAI’s openai_api_key) or secret access key (such as AWS Bedrock’s aws_secret_access_key). These keys are automatically stored in Domino’s central vault service and are never exposed to users when they interact with AI Gateway endpoints.
The secure credential store helps prevent API key leaks and lets you manage API keys centrally, rather than handing plain-text keys to users.
Domino logs all AI Gateway endpoint activity to Domino’s central audit system.
To see AI Gateway endpoint activity, go to Endpoints > Gateway LLMs and click the Download logs button. This downloads a txt or json file with all AI Gateway endpoint activity from the past six months.
You can further filter the audit events you fetch by using the Domino REST API.
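The sketch below shows one way this could look with Python and requests. The /api/audittrail/v1/auditevents path and the filter parameters are assumptions for illustration; consult the Domino REST API documentation for the actual audit-event endpoint and supported filters.

```python
import os
import requests

# Hypothetical sketch: the audit path and filter parameters are assumptions;
# see the Domino REST API documentation for the real audit-event endpoint.
DOMINO_URL = os.environ["DOMINO_API_HOST"]
API_KEY = os.environ["DOMINO_USER_API_KEY"]

resp = requests.get(
    f"{DOMINO_URL}/api/audittrail/v1/auditevents",
    headers={"X-Domino-Api-Key": API_KEY},
    params={
        "targetType": "aigateway",   # assumed filter for AI Gateway events
        "startDate": "2024-01-01",
        "endDate": "2024-06-30",
    },
)
resp.raise_for_status()
for event in resp.json().get("events", []):
    print(event)
```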
Learn how to use AI Gateway endpoints as a Domino user.