Configuration
Alumnium needs access to an AI model to work. The following models are supported:
Provider | Model |
---|---|
Anthropic | Claude 3 Haiku |
Google AI Studio | Gemini 2.0 Flash |
OpenAI (default) | GPT-4o Mini |
DeepSeek | DeepSeek V3 |
Meta | Llama 3.2 90B |
Ollama | Mistral Small 3.1 24B |
These models were chosen because they provide the best balance of intelligence, performance, and cost. They all behave roughly the same in Alumnium tests.
Anthropic
To use Anthropic as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="anthropic"export ANTHROPIC_API_KEY="sk-ant-..."
Google AI Studio
To use Google AI Studio as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="google"export GOOGLE_API_KEY="..."
OpenAI
To use OpenAI as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="openai"export OPENAI_API_KEY="sk-proj-..."
DeepSeek
To use DeepSeek as an AI provider in Alumnium:
- Set up a DeepSeek Platform account.
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="deepseek"
export DEEPSEEK_API_KEY="sk-..."
Meta
To use Meta Llama as an AI provider in Alumnium:
- Set up an Amazon Bedrock account.
- Enable access to Llama 3.2 models.
- Get the access key and secret.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="aws_meta"export AWS_ACCESS_KEY="..."export AWS_SECRET_KEY="..."
Ollama
To use Ollama for fully local model inference:
- Download and install Ollama.
- Download the Mistral Small 3.1 24B model:
ollama pull mistral-small3.1:24b
- Export the following environment variable before running tests:
export ALUMNIUM_MODEL="ollama"
Read next to learn how to write tests!