AZMX AI
Documentation

From zero to a useful exchange.

Install, add a key — or skip the internet entirely. The first useful exchange takes about a minute.

1 · Install

Grab the build for your OS from the download page. macOS is signed and notarized; Windows installs per-user with no admin; Linux ships as AppImage, .deb, and .rpm. Under 10 MB on disk, no Electron runtime, no background Chromium.

Once installed, AZMX AI keeps itself current — the auto-updater checks a public manifest on launch and verifies each update with a minisign signature before applying it.

2 · Add an AI provider (BYOK)

AZMX AI is bring-your-own-key. You hold the relationship with the model provider; the app is just the room the conversation happens in.

  1. Open Settings → AI.
  2. Pick a provider — OpenAI, Anthropic, Google, Groq, xAI, Cerebras, or any OpenAI-compatible endpoint.
  3. Paste your API key and save.

Keys are written to the OS keychain through the keyring layer — service azmx-ai. Never to disk, never to localStorage, never to the settings store, never to logs. If your OS has no keychain daemon, AZMX AI falls back to a file you control and tells you so plainly.

Charges accrue against your own provider account, on that provider's terms. AZMX AI does not proxy, meter, or see your traffic.
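If you want to confirm where a key landed, you can query the OS keychain directly. The service name azmx-ai comes from the note above; the exact account name the app uses per provider is not documented here, so the lookups below match on service alone:

```shell
# macOS: print the secret stored under service "azmx-ai"
# (-s matches the service name, -w prints only the password)
security find-generic-password -s azmx-ai -w

# Linux (GNOME Keyring / libsecret): same lookup via secret-tool,
# matching the attribute pair service=azmx-ai
secret-tool lookup service azmx-ai
```

If neither command finds an entry, check whether the app told you it fell back to file storage.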

3 · Run fully offline

Don't want anything leaving the machine? Point AZMX AI at LM Studio (or any OpenAI-compatible local server) and the agent loop, tools, and per-hunk diff UI all work against a local model — no internet round-trip.

Settings → AI → Provider: OpenAI-compatible · Base URL: http://localhost:1234/v1

Only point AZMX AI at local endpoints you control — they are trusted at the network level.
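Before pointing the app at a local server, you can sanity-check that it actually speaks the OpenAI wire format. LM Studio listens on port 1234 by default, matching the Base URL above; the model name in the second request is a placeholder — use whatever model you have loaded:

```shell
# Any OpenAI-compatible server must answer GET /v1/models
# with a JSON list of the models it exposes.
curl -s http://localhost:1234/v1/models

# Minimal chat round-trip. Replace "local-model" with the id
# returned by the /v1/models call above.
curl -s http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "ping"}]}'
```

If both calls return JSON, AZMX AI will work against the same endpoint.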

4 · Shortcuts worth knowing

⌘ means Cmd on macOS, Ctrl elsewhere.

  • ⌘K — command palette
  • ⌘; — natural language → shell command
  • ⌘D / ⌘⇧D — split panes; ⌘[ / ⌘] cycle focus
  • ⌘P — fuzzy quick-open in the file explorer
  • ⌘⇧S — SSH connection picker
  • /commit — draft a Conventional Commit from git diff
  • /macro, /index + semantic_search — parameterised macros and workspace semantic search
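For reference, /commit drafts messages in the Conventional Commits shape — type, optional scope, and a short imperative summary. The example below is illustrative, not actual output:

```
feat(explorer): add fuzzy quick-open for workspace files

Index files on workspace load and match against typed
fragments; falls back to substring match for short queries.
```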

5 · Project memory (AZMX.md)

Drop an AZMX.md at a workspace root and AZMX AI loads it as agent memory and project configuration — the same idea as AGENTS.md / CLAUDE.md. The agent also keeps a local-first per-workspace memory tree under <workspace>/.azmx/memory/. MCP servers (stdio) can be wired in to expose Notion / Linear / GitHub tools to the agent.
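A minimal AZMX.md might look like the sketch below. The headings and their contents are illustrative — the file is free-form markdown the agent reads as context, not a required schema:

```markdown
# Project notes for the agent

## Stack
- Rust backend, TypeScript frontend
- Tests: `cargo test` and `pnpm test`

## Conventions
- Conventional Commits; never push directly to main
- Keep diffs small; one concern per change
```

Memory the agent accumulates on its own lives separately, under the per-workspace tree described above, so your hand-written notes stay distinct from what the agent records.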

6 · Support

Questions, bugs, and feature requests go to the public releases repository. Security issues follow a separate, private path — see the security policy; please don't file security reports as public issues.

Issues & requests

Bug reports, feature ideas, and questions. Public, searchable, fastest path to an answer.

Open an issue ↗

Security disclosure

Found a vulnerability? Report it privately. We respond within a few days and credit you on release.

security@azmx.ai

General contact

Licensing, press, or anything that isn't a bug.

hello@azmx.ai

Release notes

What changed, version by version.

View the changelog →