# Reference
## Browser Support

Alumnium works by building an accessibility tree of the webpage. Unfortunately, browsers provide no standard API to expose this tree. Due to this limitation, the current version of Alumnium only works in Chromium-based browsers such as Google Chrome, Microsoft Edge, Opera, and others.
The Playwright driver supports both headful and headless modes, while the Selenium driver supports only the headful mode.
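For illustration, a minimal sketch of wiring Alumnium to a Chromium-based browser through Selenium; it assumes Alumnium's Python entry point `Alumni`, which wraps an existing driver session, and its plain-English `do` command. Treat the details as indicative rather than normative.

```python
from alumnium import Alumni
from selenium.webdriver import Chrome

# Selenium drives a Chromium-based browser in headful mode only.
driver = Chrome()
driver.get("https://example.com")

# Alumni wraps the existing driver; instructions are plain English.
al = Alumni(driver)
al.do("click the first link on the page")
```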
## Mobile Support

Alumnium currently supports Appium with the XCUITest driver only, so it can be used to automate the iOS platform. Support for Android is coming soon.
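As a sketch under stated assumptions: the Appium Python client and its `XCUITestOptions` are used to open an iOS session, the capability values are hypothetical, and passing the Appium driver to `Alumni` is assumed to mirror how the web drivers are wrapped.

```python
from alumnium import Alumni
from appium import webdriver
from appium.options.ios import XCUITestOptions

# Hypothetical capabilities for a local iOS simulator session.
options = XCUITestOptions()
options.platform_version = "17.4"
options.device_name = "iPhone 15"
options.app = "/path/to/App.app"

# Connect to a locally running Appium server (default port 4723).
driver = webdriver.Remote("http://127.0.0.1:4723", options=options)
al = Alumni(driver)  # assumed to accept an Appium driver like the web ones
```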
## Environment Variables

The following environment variables can be used to control the behavior of Alumnium.
### ALUMNIUM_CACHE

Sets the cache provider used by Alumnium. Supported values are:

- `filesystem` (default)
- `sqlite`
- `none` or `false`
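For example, disabling the cache for a single debugging run. Setting the variable from Python before Alumnium is imported is an assumption; exporting it in the shell before launching the test run works regardless:

```python
import os

# Disable caching for this run. Assumption: the variable must be set
# before alumnium is imported so the provider is picked up on startup.
os.environ["ALUMNIUM_CACHE"] = "none"

from alumnium import Alumni  # noqa: E402
```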
### ALUMNIUM_LOG_LEVEL

Sets the level used by the Alumnium logger. Supported values are:

- `debug`
- `info`
- `warning` (default)
- `error`
- `critical`
### ALUMNIUM_LOG_PATH

Sets the output location used by the Alumnium logger. Supported values are:

- a path to a file (e.g. `alumnium.log`);
- `stdout` to print logs to the standard output.
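A combined example for both logging variables, turning on verbose logs and directing them to a file for a debugging session (the values are illustrative):

```python
import os

# Verbose logging to a file; "stdout" would print to the console instead.
os.environ["ALUMNIUM_LOG_LEVEL"] = "debug"
os.environ["ALUMNIUM_LOG_PATH"] = "alumnium.log"
```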
### ALUMNIUM_MODEL

Selects the AI provider and model to use.
| Value | LLM | Notes |
| --- | --- | --- |
| `anthropic` | claude-3-haiku-20240307 | Anthropic API. |
| `azure_openai` | gpt-4o-mini | Self-hosted Azure OpenAI API. Recommended model version is 2024-07-18. |
| `aws_anthropic` | anthropic.claude-3-haiku-20240307-v1:0 | Serverless Amazon Bedrock API. |
| `aws_meta` | us.meta.llama4-maverick-17b-instruct-v1:0 | Serverless Amazon Bedrock API. |
| `deepseek` | deepseek-chat | DeepSeek Platform. |
| `google` | gemini-2.0-flash-001 | Google AI Studio API. |
| `ollama` | mistral-small3.1:24b | Local model inference with Ollama. |
| `openai` | gpt-4o-mini-2024-07-18 | OpenAI API. |
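For example, switching the provider to Anthropic. The LLM column above lists the default model each provider uses; supplying the provider's own credentials (here `ANTHROPIC_API_KEY`, an assumption) is a separate step:

```python
import os

# Select the Anthropic provider; its default model is taken from the
# table above. Credentials are assumed to come from the provider's own
# environment variable and are not set by Alumnium itself.
os.environ["ALUMNIUM_MODEL"] = "anthropic"
```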
### ALUMNIUM_OLLAMA_URL

Sets the URL of the Ollama server if you host models externally rather than on the local machine.
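For example, pairing it with the `ollama` provider to run inference against a remote server; the hostname is hypothetical, and 11434 is Ollama's default port:

```python
import os

# Point Alumnium at a remote Ollama server instead of localhost.
os.environ["ALUMNIUM_MODEL"] = "ollama"
os.environ["ALUMNIUM_OLLAMA_URL"] = "http://gpu-server.internal:11434"  # hypothetical host
```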