# Configuration
Alumnium needs access to an AI model to work. The following models are supported:
| Provider | Model |
|---|---|
| Anthropic | Claude 4.5 Haiku |
| GitHub | GPT-4o Mini |
| Google | Gemini 3 Flash |
| OpenAI (default) | GPT-5 Nano |
| DeepSeek | DeepSeek R1 |
| Meta | Llama 4 Maverick 17B |
| MistralAI | Mistral Medium 3 |
| Ollama | Mistral Small 3.1 24B |
| xAI | Grok 4.1 Fast Reasoning |
These models were chosen because they provide the best balance between intelligence, performance, and cost. Most models now support reasoning capabilities for improved accuracy and decision-making in test automation.
## Anthropic

To use Anthropic as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="anthropic"export ANTHROPIC_API_KEY="sk-ant-..."GitHub
## GitHub

To use GitHub Models as an AI provider in Alumnium (it serves OpenAI models, which is why the token goes into `OPENAI_API_KEY`):
- Get the personal access token.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="github"export OPENAI_API_KEY="github_pat_..."To use Google AI Studio as an AI provider in Alumnium:
## Google

To use Google AI Studio as an AI provider in Alumnium:

- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="google"export GOOGLE_API_KEY="..."OpenAI
## OpenAI

To use OpenAI as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="openai"export OPENAI_API_KEY="sk-proj-..."DeepSeek
## DeepSeek

To use DeepSeek as an AI provider in Alumnium:
- Set up a DeepSeek Platform account.
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="deepseek"export DEEPSEEK_API_KEY="sk-..."To use Meta Llama as an AI provider in Alumnium:
## Meta

To use Meta Llama as an AI provider in Alumnium:

- Set up an Amazon Bedrock account.
- Enable access to Llama 4 Maverick models.
- Get the access key and secret.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="aws_meta"export AWS_ACCESS_KEY="..."export AWS_SECRET_KEY="..."MistralAI
## MistralAI

To use MistralAI as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="mistralai"export MISTRAL_API_KEY="..."Ollama
## Ollama

To use Ollama for fully local model inference:
- Download and install Ollama.
- Download the Mistral Small 3.1 24B model:
```sh
ollama pull mistral-small3.1:24b
```

- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="ollama"export ALUMNIUM_OLLAMA_URL="..." # if you host Ollama on a serverTo use xAI as an AI provider in Alumnium:
## xAI

To use xAI as an AI provider in Alumnium:

- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="xai"export XAI_API_KEY="xai-..."Read next to learn how to write tests!