Self-hosting LLMs
Using third-party AI providers such as Anthropic, Google AI Studio, and OpenAI is the easiest way to use Alumnium. However, you might prefer self-hosted LLMs for security, privacy, or cost reasons.
Alumnium provides several options for using self-hosted LLMs:
- Serverless models on Amazon Bedrock.
- OpenAI service on Azure.
Amazon Bedrock
Alumnium supports serverless models on Amazon Bedrock.
Please follow the Amazon Bedrock documentation on how to enable access to the models you want to use. Once enabled, configure Alumnium to use them by exporting the following environment variables:
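Bedrock access goes through the standard AWS SDK credential variables; the `ALUMNIUM_MODEL` name and value below are assumptions and should be verified against Alumnium's configuration documentation. A minimal sketch:

```shell
# Standard AWS SDK credential variables:
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_DEFAULT_REGION="us-east-1"  # region where the Bedrock models are enabled

# Tell Alumnium to use a Bedrock-hosted model
# (variable name and value are assumptions; check Alumnium's docs):
export ALUMNIUM_MODEL="aws_anthropic"
```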
Azure
Alumnium supports the GPT-4o Mini model on the Azure OpenAI service.
Please follow the Azure OpenAI documentation on how to deploy the model to Azure. Once deployed, configure Alumnium to use it by exporting the following environment variables:
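The variable names below follow common Azure OpenAI client conventions and are assumptions rather than Alumnium's confirmed configuration; check Alumnium's documentation for the exact names. A minimal sketch:

```shell
# Azure OpenAI connection details (names assumed from common
# Azure OpenAI client conventions; verify against Alumnium's docs):
export AZURE_OPENAI_API_KEY="your-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export OPENAI_API_VERSION="2024-02-01"  # API version used by your deployment

# Tell Alumnium to use the Azure OpenAI deployment (assumed value):
export ALUMNIUM_MODEL="azure_openai"
```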