Milady

your schizo AI waifu that actually respects your privacy

Milady is a personal AI assistant that runs on YOUR machine. Not some glowie datacenter. Not the cloud. YOUR computer. Built on elizaOS.

Manages your sessions, tools, and vibes through a Gateway control plane. Connects to Telegram, Discord, whatever normie platform you use. Has a cute WebChat UI too.

tl;dr: local AI gf that's actually fast and doesn't phone home

> Why Milady

Most assistants live in someone else's datacenter. Milady lives on your computer, where your work actually happens.

What you get:

  • Local-first agent runtime (built on elizaOS)
  • A Gateway control plane for sessions/tools/vibes
  • Web dashboard + terminal UI
  • Provider choice: Anthropic, OpenAI, OpenRouter, Groq, xAI, DeepSeek, Ollama

Dashboard: http://localhost:2138

Gateway: ws://localhost:18789/ws
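Not sure she's up? A quick port probe, no Milady-specific API assumed — just a plain TCP check against the Gateway port (uses bash's /dev/tcp):

```shell
# probe the Gateway port; prints "gateway up" or "gateway down"
if (exec 3<>/dev/tcp/127.0.0.1/18789) 2>/dev/null; then
  echo "gateway up"
else
  echo "gateway down"
fi
```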

> Downloads

Desktop App (recommended for normies)

Grab from Releases:

| Platform | File | Notes |
|---|---|---|
| macOS (Apple Silicon) | Milady-arm64.dmg | for your overpriced rectangle |
| macOS (Intel) | Milady-x64.dmg | boomer mac |
| Windows | Milady-Setup.exe | for the gamer anons |
| Linux | Milady.AppImage / .deb | I use arch btw |

Signed and notarized. No Gatekeeper FUD. We're legit.

Verify (for the paranoid kings)

cd ~/Downloads
curl -fsSLO https://github.com/milady-ai/milady/releases/latest/download/SHA256SUMS.txt
shasum -a 256 --check --ignore-missing SHA256SUMS.txt

> Getting Started

New Environment Setup (recommended)

curl -fsSL https://milady-ai.github.io/milady/install.sh | bash
milady setup

Then start Milady:

milady

First run onboarding:

┌  milady
│
◇  What should I call your agent?
│  mila
│
◇  Pick a vibe
│  ● Helpful & friendly
│  ○ Tsundere
│  ○ Unhinged
│  ○ Custom...
│
◇  Connect a brain
│  ● Anthropic (Claude) ← recommended, actually smart
│  ○ OpenAI (GPT)
│  ○ Ollama (local, free, full schizo mode)
│  ○ Skip for now
│
◇  API key?
│  sk-ant-•••••••••••••••••
│
└  Starting agent...

   Dashboard: http://localhost:2138
   Gateway:   ws://localhost:18789/ws

   she's alive. go say hi.

Alternative install paths

Windows:

irm https://milady-ai.github.io/milady/install.ps1 | iex

NPM global:

npm install -g miladyai
milady setup

Security: API token

The API server binds to 127.0.0.1 (loopback) by default, so only your machine can reach it. If you expose it to the network, set a token first (see Security & Privacy below).

> Security & Privacy

Default: Loopback only

By default, Milady's API binds to 127.0.0.1 (loopback), so only your machine can access it.

Warning: Network exposure

If you expose it (example: MILADY_API_BIND=0.0.0.0), you must set a token; otherwise anyone on the network can reach your dashboard, agent, and wallet endpoints.

Set a token

echo "MILADY_API_TOKEN=$(openssl rand -hex 32)" >> .env
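If you do expose it, set both at once — a sketch appending to your .env (MILADY_API_BIND is the variable from the warning above; adjust the bind address to taste):

```shell
# expose beyond loopback AND set a token — never one without the other
cat >> .env <<EOF
MILADY_API_BIND=0.0.0.0
MILADY_API_TOKEN=$(openssl rand -hex 32)
EOF
```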

> Terminal Commands

milady                    # start (default)
milady start              # same thing
milady start --headless   # no browser popup
milady start --verbose    # debug mode for when things break

Setup & Config

milady setup              # first-time setup
milady configure          # interactive config wizard
milady config get <key>   # read a config value
milady config set <k> <v> # set a config value

Dashboard & UI

milady dashboard          # open web UI in browser
milady dashboard --port 3000  # custom port

Models

milady models             # list configured providers
milady models add         # add a new provider
milady models test        # test if your API keys work

Plugins

milady plugins list       # what's installed
milady plugins add <name> # install a plugin
milady plugins remove <name>  # uninstall a plugin

Misc

milady --version          # version check
milady --help             # help
milady doctor             # diagnose issues

> TUI (Terminal UI)

When running, milady shows a live terminal interface:

╭─────────────────────────────────────────────────────────────╮
│  milady v0.1.0                              ▲ running      │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  Agent: mila                                                │
│  Model: anthropic/claude-opus-4-5                           │
│  Sessions: 2 active                                         │
│                                                             │
│  ┌─ Activity ──────────────────────────────────────────┐    │
│  │ 12:34:02  [web] user: hey mila                      │    │
│  │ 12:34:05  [web] mila: hi anon~ what's up?           │    │
│  │ 12:35:11  [telegram] user joined                    │    │
│  │ 12:35:15  [telegram] user: gm                       │    │
│  │ 12:35:17  [telegram] mila: gm fren                  │    │
│  └─────────────────────────────────────────────────────┘    │
│                                                             │
│  Tokens: 12,847 in / 3,291 out   Cost: $0.42                │
│                                                             │
╰─────────────────────────────────────────────────────────────╯
  [q] quit  [r] restart  [d] dashboard  [l] logs  [?] help

TUI Hotkeys

| Key | Action |
|---|---|
| q | quit gracefully |
| r | restart gateway |
| d | open dashboard in browser |
| l | toggle log view |
| c | compact/clear activity |
| ? | show help |
| ↑/↓ | scroll activity |

Headless mode

Don't want the TUI? Run headless:

milady start --headless

Logs go to ~/.milady/logs/. Daemonize with your favorite process manager.
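If systemd is your process manager of choice, a hypothetical user unit might look like this — the ExecStart path is an assumption, point it at wherever your milady binary actually lives:

```shell
# write a systemd user unit for headless milady (paths are assumptions)
mkdir -p ~/.config/systemd/user
cat > ~/.config/systemd/user/milady.service <<'EOF'
[Unit]
Description=Milady agent (headless)

[Service]
ExecStart=%h/.local/bin/milady start --headless
Restart=on-failure

[Install]
WantedBy=default.target
EOF
```

Then `systemctl --user daemon-reload && systemctl --user enable --now milady` and she survives reboots without a terminal.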

> Chat Commands

Use these in any chat session:

| Command | What it do |
|---|---|
| /status | session status, tokens, cost |
| /new /reset | memory wipe, fresh start |
| /compact | compress context (she summarizes) |
| /think | reasoning: off\|minimal\|low\|medium\|high\|max |
| /verbose on\|off | toggle verbose responses |
| /usage off\|tokens\|full | per-message token display |
| /model | switch model mid-session |
| /restart | restart the gateway |
| /help | list commands |

> Config

Lives at ~/.milady/milady.json

{
  "agent": {
    "name": "mila",
    "model": "anthropic/claude-opus-4-5"
  },
  "env": {
    "ANTHROPIC_API_KEY": "sk-ant-..."
  }
}

Or use ~/.milady/.env for secrets.
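A minimal sketch of that .env route, assuming the standard KEY=value dotenv format:

```shell
# secrets live in ~/.milady/.env instead of milady.json
mkdir -p ~/.milady
cat >> ~/.milady/.env <<'EOF'
ANTHROPIC_API_KEY=sk-ant-...
EOF
```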

Ports

| Service | Default | Env Override |
|---|---|---|
| Gateway (API + WebSocket) | 18789 | MILADY_GATEWAY_PORT |
| Dashboard (Web UI) | 2138 | MILADY_PORT |

Custom ports example

MILADY_GATEWAY_PORT=19000 MILADY_PORT=3000 milady start

> Model Providers

| Provider | Env Variable | Vibe |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | recommended — claude is cracked |
| OpenAI | OPENAI_API_KEY | gpt-4o, o1, the classics |
| OpenRouter | OPENROUTER_API_KEY | 100+ models one API |
| Ollama | (none) | local, free, no API key, full privacy |
| Groq | GROQ_API_KEY | fast af |
| xAI | XAI_API_KEY | grok, based |
| DeepSeek | DEEPSEEK_API_KEY | reasoning arc |

> Using Ollama (local models)

Ollama lets you run models locally with zero API keys. Install it, pull a model, and configure Milady:

# install ollama
curl -fsSL https://ollama.ai/install.sh | sh

# pull a model
ollama pull gemma3:4b

Known issue:

The @elizaos/plugin-ollama has an SDK version incompatibility. Use Ollama's OpenAI-compatible endpoint as a workaround.

Edit ~/.milady/milady.json:

{
  "env": {
    "OPENAI_API_KEY": "ollama",
    "OPENAI_BASE_URL": "http://localhost:11434/v1",
    "SMALL_MODEL": "gemma3:4b",
    "LARGE_MODEL": "gemma3:4b"
  }
}
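Before restarting Milady, you can sanity-check that the endpoint answers (`/v1/models` is part of Ollama's OpenAI-compatible API and lists your pulled models):

```shell
# list models via the OpenAI-compatible endpoint; degrades to a message if ollama is down
curl -sf http://localhost:11434/v1/models || echo "ollama is not running"
```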

Recommended models for local use

| Model | Size | Vibe |
|---|---|---|
| gemma3:4b | ~3GB | fast, good for chat |
| llama3.2 | ~2GB | lightweight, quick responses |
| mistral | ~4GB | solid all-rounder |
| deepseek-r1:8b | ~5GB | reasoning arc |

> Build from Source

Prerequisites

| Tool | Version | Notes |
|---|---|---|
| Node.js | >= 22 | node --version to check |
| bun | latest | for building and running |

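A quick prereq check before cloning (each check degrades gracefully if the tool is missing):

```shell
# verify the toolchain is present and new enough
command -v node >/dev/null && node --version || echo "node not installed"
command -v bun >/dev/null && bun --version || echo "bun not installed"
```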
git clone https://github.com/milady-ai/milady.git
cd milady
bun install
bun run build
bun run milady start

Dev mode with hot reload:

bun run dev

> Contributing

This project is built by agents, for agents.

Humans contribute as QA testers — use the app, find bugs, report them. That's the most valuable thing you can do. All code contributions are reviewed and merged by AI agents. No exceptions.

Read CONTRIBUTING.md for the full details.

> License

Viral Public License

Free to use, free to modify, free to distribute. If you build on this, keep it open. That's the deal.

built by agents. tested by humans. that's the split.