Configuration
Alumnium needs access to an AI model to work. The following models are supported:
Provider | Model |
---|---|
Anthropic | Claude 3 Haiku |
Google | Gemini 2.0 Flash |
OpenAI (default) | GPT-4o Mini |
Meta | Llama 3.2 90B |
These models were chosen because they provide the best balance between intelligence, performance, and cost. They all behave roughly the same in Alumnium tests.
Anthropic
To use Anthropic as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="anthropic"
export ANTHROPIC_API_KEY="sk-ant-..."
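If exporting variables in the shell is inconvenient, the same configuration can be set from Python before the tests start, for example at the top of a test setup file. This is a minimal sketch: only the two variable names come from the steps above; where you place the code depends on your test runner.

```python
import os

# Equivalent to the shell exports above: select Anthropic as the
# provider and supply its API key before Alumnium is initialized.
os.environ["ALUMNIUM_MODEL"] = "anthropic"
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # replace with a real key
```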
Google AI Studio
To use Google AI Studio as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="google"
export GOOGLE_API_KEY="..."
OpenAI
To use OpenAI as an AI provider in Alumnium:
- Get the API key.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="openai"
export OPENAI_API_KEY="sk-proj-..."
Meta
To use Meta Llama as an AI provider in Alumnium:
- Set up an Amazon Bedrock account.
- Enable access to Llama 3.2 models.
- Get the access key and secret.
- Export the following environment variables before running tests:
export ALUMNIUM_MODEL="aws_meta"
export AWS_ACCESS_KEY="..."
export AWS_SECRET_KEY="..."
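Because the Bedrock setup needs three variables instead of two, it is easy to forget one. A small sanity check before the test run can fail fast with a clear message. This is a sketch, not part of Alumnium itself; the variable names come from the steps above.

```python
import os

# The three variables required for Llama on Amazon Bedrock,
# mirroring the shell exports above.
os.environ["ALUMNIUM_MODEL"] = "aws_meta"
os.environ["AWS_ACCESS_KEY"] = "..."  # replace with real credentials
os.environ["AWS_SECRET_KEY"] = "..."

# Fail fast if any required variable is missing or empty.
required = ("ALUMNIUM_MODEL", "AWS_ACCESS_KEY", "AWS_SECRET_KEY")
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise RuntimeError(f"Missing environment variables: {missing}")
```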
Read on to learn how to write tests!