Configuration

Alumnium needs access to an AI model to work. The following models are supported:

Provider          Model
----------------  ----------------
Anthropic         Claude 3 Haiku
Google            Gemini 2.0 Flash
OpenAI (default)  GPT-4o Mini
Meta              Llama 3.2 90B

These models were chosen because they offer the best balance of intelligence, performance, and cost, and they all behave roughly the same in Alumnium tests.
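Each provider also requires its own API-key environment variables, listed in the sections that follow. A minimal pre-flight check, assuming only the variable names shown on this page, might look like:

```python
import os

# Mapping of ALUMNIUM_MODEL values to the API-key variables each
# provider requires (collected from the sections below).
REQUIRED_VARS = {
    "anthropic": ("ANTHROPIC_API_KEY",),
    "google": ("GOOGLE_API_KEY",),
    "openai": ("OPENAI_API_KEY",),
    "aws_meta": ("AWS_ACCESS_KEY", "AWS_SECRET_KEY"),
}

def missing_vars(model=None):
    """Return the required variables that are not set for the chosen model."""
    model = model or os.environ.get("ALUMNIUM_MODEL", "openai")
    return [v for v in REQUIRED_VARS.get(model, ()) if not os.environ.get(v)]
```

Calling `missing_vars()` once before the test session starts surfaces configuration mistakes immediately instead of as a mid-run API error.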

Anthropic

To use Anthropic as an AI provider in Alumnium:

  1. Get an Anthropic API key.
  2. Export the following environment variables before running tests:

```shell
export ALUMNIUM_MODEL="anthropic"
export ANTHROPIC_API_KEY="sk-ant-..."
```

Google

To use Google AI Studio as an AI provider in Alumnium:

  1. Get an API key from Google AI Studio.
  2. Export the following environment variables before running tests:

```shell
export ALUMNIUM_MODEL="google"
export GOOGLE_API_KEY="..."
```

OpenAI

To use OpenAI as an AI provider in Alumnium:

  1. Get an OpenAI API key.
  2. Export the following environment variables before running tests:

```shell
export ALUMNIUM_MODEL="openai"
export OPENAI_API_KEY="sk-proj-..."
```
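The `export` commands above can also be replicated from Python, for example in test-setup code, by assigning to `os.environ`. This is a sketch rather than an Alumnium API, and the key value is a placeholder:

```python
import os

# Equivalent to the shell exports above; must run before Alumnium
# reads its configuration.
os.environ["ALUMNIUM_MODEL"] = "openai"
os.environ.setdefault("OPENAI_API_KEY", "sk-proj-...")  # placeholder, not a real key
```

Using `setdefault` for the key keeps a real value exported in the shell from being overwritten.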

Meta

To use Meta Llama as an AI provider in Alumnium:

  1. Set up an Amazon Bedrock account.
  2. Enable access to the Llama 3.2 models.
  3. Get an AWS access key and secret access key.
  4. Export the following environment variables before running tests:

```shell
export ALUMNIUM_MODEL="aws_meta"
export AWS_ACCESS_KEY="..."
export AWS_SECRET_KEY="..."
```
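Bedrock is the only route on this page that needs two credentials, so it is easy to forget one. A small guard, written in plain Python and assuming only the variable names above, can fail fast with a clear message:

```python
import os

def check_bedrock_credentials():
    """Raise SystemExit if either Bedrock credential variable is missing."""
    missing = [name for name in ("AWS_ACCESS_KEY", "AWS_SECRET_KEY")
               if not os.environ.get(name)]
    if missing:
        raise SystemExit("Set before running tests: " + ", ".join(missing))
```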

Read on to learn how to write tests!