
The Engine runs inside Docker, so the host requirements are minimal. You need:

Required

  • Docker with Compose v2. Versions tested:
    • Docker Desktop 4.x (macOS, Windows).
    • Docker CE 24+ (Linux).
  • Git for cloning the repo.
  • An LLM provider API key. Pick one:
    • Anthropic — sk-ant-...
    • OpenAI — sk-...
    • Google Gemini — Google API key with the Generative Language API enabled.
  • An OpenAI API key for embeddings. This is required regardless of which chat provider you choose. The Engine uses OpenAI’s text-embedding-3-small for memory search.
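A minimal sketch of exporting both keys before starting the stack. The variable names LLM_API_KEY and OPENAI_API_KEY match the verification commands later on this page; the key values below are placeholders, not real keys.

```shell
# Chat provider key (Anthropic shown; use sk-... for OpenAI instead).
# The value is a placeholder — substitute your own key.
export LLM_API_KEY="sk-ant-your-key-here"

# Always required: OpenAI key for text-embedding-3-small memory search.
export OPENAI_API_KEY="sk-your-openai-key"
```

Put these in your shell profile or a local .env file so they survive new terminal sessions.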

Optional

  • sqlite3 CLI if you want to inspect the brain database from the host.
  • jq for pretty-printing the JSON streams in your terminal.
  • An MCP server (Slack, Gmail, etc.) if you want to test external connectors. The Engine itself does not require any.
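As a quick illustration of the jq workflow, you can pipe any newline-delimited JSON through it. The event below is a made-up sample, not the Engine's actual stream schema.

```shell
# Extract one field from a JSON event with jq.
# This event shape is illustrative only — inspect the real stream
# on localhost:8000 to see the Engine's actual fields.
echo '{"type":"token","text":"hello"}' | jq -r '.text'
# → hello
```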

Disk and memory

  • Disk: the Engine image is ~1.2 GB. Brain databases start at a few MB and grow with use; budget roughly 1 GB per active user, which for a single-user development setup is a comfortable upper bound.
  • RAM: the Engine process is light (~200 MB resident). The bottleneck is whatever the LLM call returns.

Architecture

Tested on:
  • macOS arm64 (Apple Silicon).
  • Linux x86_64.
If you’re on Apple Silicon, docker-compose.yml already pins the test container to linux/amd64 because some sandbox tests require x86 syscall behavior. You don’t need to do anything special — it just works.
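For reference, the pin described above looks roughly like this in compose form. The service name `tests` is an assumption for illustration — check docker-compose.yml in the repo for the real service names.

```yaml
# Illustrative fragment only: forces the test container to run as
# x86_64 (under emulation on Apple Silicon) for sandbox syscall tests.
services:
  tests:
    platform: linux/amd64
```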

Network

Outbound HTTPS to your chosen LLM provider must be reachable. If you’re behind a corporate proxy:
  • Set HTTPS_PROXY and HTTP_PROXY in your shell before running docker compose up.
  • Make sure those envs are passed into the container by the compose file (the default compose file does this).
You should not need any inbound port forwarding for local development — everything is served on localhost:8000.
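The proxy steps above can be sketched as follows. The proxy URL is a placeholder — use whatever your network team gives you.

```shell
# Set proxy variables in the shell that will launch the stack.
# proxy.example.com:3128 is a placeholder, not a real proxy.
export HTTPS_PROXY="http://proxy.example.com:3128"
export HTTP_PROXY="http://proxy.example.com:3128"

# Then start the stack as usual; the default compose file forwards
# these variables into the container.
```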

Confirm everything is in place

docker --version
docker compose version
git --version
echo "$LLM_API_KEY" | head -c 10  # should print the first 10 chars of your key
echo "$OPENAI_API_KEY" | head -c 10

If those all return reasonable output, head over to Quickstart.
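If you want one more sanity check before Quickstart, you can hit your provider directly. This is optional and not something the Engine requires; the example below uses OpenAI's public models endpoint, and a 200 status means both your network path and your key are good (401 means the key is wrong).

```shell
# Print only the HTTP status code from a minimal authenticated request.
# 200 = key and network OK; 401 = bad key; 000 = network/proxy problem.
curl -sS -o /dev/null -w '%{http_code}\n' \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  https://api.openai.com/v1/models
```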